5.0 - 8.0 years
15 - 25 Lacs
Pune
Hybrid
So, what's the role all about?
We are seeking a highly skilled Backend Software Engineer to join GenAI Solutions for CX, our fully integrated AI cloud customer experience platform. In this role you will get exposure to new and exciting technologies and collaborate with professional engineers, architects, and product managers to create NICE's advanced line of AI cloud products.
How will you make an impact?
- Design and implement high-performance microservices using AWS cloud technologies.
- Build scalable backend systems using Python.
- Lead the development of event-driven architectures utilizing Kafka and AWS Firehose (see the sketch below).
- Integrate with Athena, DynamoDB, S3, and other AWS services to deliver end-to-end solutions.
- Ensure high-quality deliverables with testable, reusable, and production-ready code.
- Collaborate within an agile team, influencing architecture, design, and technology adoption.
Have you got what it takes?
- 5+ years of backend software development experience.
- Strong expertise in Python/C#.
- Deep knowledge of microservices architecture, RESTful APIs, and cloud-native development.
- Hands-on experience with AWS Lambda, S3, Athena, Kinesis Firehose, and Kafka.
- Strong database skills (SQL & NoSQL), including schema design and performance tuning.
- Experience designing scalable systems and delivering enterprise-grade software.
- Comfortable working with CI/CD pipelines and DevOps practices.
- Passion for clean code, best practices, and continuous improvement.
- Excellent communication and collaboration abilities.
- Fluent in English (written and spoken).
What's in it for you?
Join an ever-growing, market-disrupting global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!
Enjoy FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 7981
Reporting into: Tech Manager
Role Type: Individual Contributor
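As a rough illustration of the event-driven pattern this posting mentions, here is a minimal Python sketch that publishes a JSON event to a Kinesis Data Firehose delivery stream, from which it could land in S3 and be queried with Athena. The stream name, region, and event fields are hypothetical placeholders, not details from the job description.

```python
# Minimal sketch: publish an event to a Kinesis Data Firehose delivery stream.
# The stream name, region, and payload fields below are assumptions for illustration.
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")  # assumed region

event = {
    "interaction_id": "abc-123",   # hypothetical event fields
    "channel": "chat",
    "sentiment": 0.82,
}

response = firehose.put_record(
    DeliveryStreamName="cx-events-stream",  # hypothetical stream name
    Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
)
print("Record ID:", response["RecordId"])
```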
Posted 6 days ago
4.0 - 9.0 years
4 - 8 Lacs
Pune
Work from Office
Experience: 4+ years.
- Expertise in the Python language is a must.
- SQL (should be able to write complex SQL queries) is a must.
- Hands-on experience in Apache Flink Streaming or Spark Streaming is a must (see the streaming sketch below).
- Hands-on expertise in Apache Kafka is a must.
- Data lake development experience.
- Orchestration (Apache Airflow is preferred).
- Spark and Hive: optimization of Spark/PySpark and Hive apps.
- Trino/(AWS Athena) (good to have).
- Snowflake (good to have).
- Data quality (good to have).
- File storage (S3 is good to have).
Our offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs & work-life balance - integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
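For context on the streaming stack this posting asks for, below is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and writes the parsed events to S3 as Parquet. The broker address, topic name, schema, and bucket are assumptions made for illustration only.

```python
# Minimal sketch (assumes a local Spark setup and a Kafka topic named "events"):
# consume a Kafka stream with PySpark Structured Streaming and write it to S3 as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker address
       .option("subscribe", "events")                         # assumed topic name
       .load())

# Parse the Kafka value payload (JSON) into typed columns.
parsed = raw.select(from_json(col("value").cast("string"), schema).alias("data")).select("data.*")

query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/events/")                 # hypothetical bucket
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
         .start())
query.awaitTermination()
```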
Posted 6 days ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Company Description
Epsilon is an all-encompassing global marketing innovator, supporting 15 of the top 20 global brands. We provide unrivaled data intelligence and customer insights, world-class technology including loyalty, email and CRM platforms, and data-driven creative, activation and execution. Epsilon's digital media arm, Conversant, is a leader in personalized digital advertising and insights through its proprietary technology and trove of consumer marketing data, delivering digital marketing with unprecedented scale, accuracy and reach through personalized media programs and through CJ Affiliate by Conversant, one of the world's largest affiliate marketing networks. Together, we bring personalized marketing to consumers across offline and online channels, at moments of interest, that help drive business growth for brands. Recognized by Ad Age as the #1 World's Largest CRM/Direct Marketing Agency Network, #1 Largest U.S. Agency from All Disciplines, #1 Largest U.S. CRM/Direct Marketing Agency Network and #1 Largest U.S. Mobile Marketing Agency, Epsilon employs over 8,000 associates in 70 offices worldwide. Epsilon is part of Alliance Data, a Fortune 500 company and one of Fortune's 100 Best Places to Work For. For more information, visit www.epsilon.com and follow us on Twitter @EpsilonMktg.
Job Description
About BU
The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.
Why we are looking for you
We are looking for a Senior Software Engineer to work on a groundbreaking multichannel SaaS Digital Marketing Platform that focuses on uniquely identifying customer patterns, effectively interacting with customers across channels, and achieving a positive return on marketing investment (ROMI). The platform consolidates and integrates the features and functionality typically found in stand-alone services and channel-specific messaging platforms to give marketers a tightly integrated, easily orchestrated, insights-driven, cross-channel marketing capability. The primary role of the Senior Software Engineer is to envision and build internet-scale services on Cloud using Java and distributed technologies, with a 60-40 split between backend development with Java and frontend development using Angular.
What you will enjoy in this role
Tech Stack: Our integrated suite of modular products is designed to help deliver personalized experiences and drive meaningful outcomes. Our tech stack caters to a fusion of data and technology, with SaaS offerings developed using a cloud-first approach. Here, a solid understanding of software security practices, including user authentication and authorization, and being data-savvy would be key. You should also come with the ability to leverage best practices in design patterns and design algorithms for software development that focus on high quality and agility. You must also have a good understanding of Agile methodologies like Scrum.
What you will do
- Be responsible for development and maintenance of applications with technologies involving Java and distributed technologies.
- Collaborate with developers, product managers, business analysts and business users in conceptualizing, estimating and developing new software applications and enhancements.
- Assist in the development and documentation of software objectives, deliverables, and specifications in collaboration with internal users and departments.
- Collaborate with the QA team to define test cases and metrics, and resolve questions about test results.
- Assist in the design and implementation process for new products; research and create POCs for possible solutions.
- Develop components based on business and/or application requirements.
- Create unit tests in accordance with team policies and procedures.
- Advise and mentor team members in specialized technical areas, and fulfill administrative duties as defined by the support process.
- Create value-adds that contribute to cost optimization, scalability, reliability, and secure solutions.
Qualifications
- Bachelor's degree or equivalent in computer science.
- 6+ years of experience in Java/Angular/SQL/AWS/Microservices.
Preferred knowledge/experience in the following technologies:
- 2+ years of UI technologies like Angular 2 or above.
- 1+ year of experience in cloud computing like AWS, Azure, GCP, PCF or OCI.
- Experience with the following tools: Eclipse, Maven, Gradle, DB tools, Bitbucket/JIRA/Confluence.
- Can develop SOA services and has good knowledge of REST APIs and microservice architectures.
- Solid knowledge of web architectural and design patterns.
- Understands software security practices including user authentication and authorization, data validation, and common DoS and SQL injection techniques.
- Familiar with profiling, code coverage, logging, common IDEs and other development tools.
- Familiar with Agile methodologies (Scrum) and has strong communication skills (verbal and written).
- Ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
- Demonstrated verbal and written communication skills, and ability to interface with Business, Analytics and IT organizations.
- Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks.
- Ability to identify non-obvious solutions to complex problems.
Posted 1 week ago
9.0 - 13.0 years
0 Lacs
navi mumbai, maharashtra
On-site
The structured finance analytics team at Morningstar DBRS in Mumbai is seeking a Manager who will lead a team of Quant Analysts in automating data analysis processes, developing data analytics, and enhancing workflow optimization tools to support the rating, research, and surveillance process. The ideal candidate should have a strong understanding of structured finance products (RMBS, ABS, CMBS) and possess technical skills in Python, Tableau, AWS, Athena, SQL, and VBA. This role requires expertise in managing a team, fostering a collaborative work environment, and prioritizing work schedules to ensure timely and quality delivery. The Manager will also be responsible for maintaining communication with global teams, transforming data, implementing quick-fix solutions, and ensuring compliance with regulatory and company policies.
The successful candidate should have 9-11 years of experience in Credit Modeling/Model Validation roles, hold an MBA (Finance), BTech, or PhD (Math) from a Tier I college, and demonstrate strong analytical skills with experience in working with large databases/datasets. Proficiency in Python, Tableau, Microsoft Excel, MSSQL, and familiarity with AWS infrastructure will be essential for this role. The Manager should be highly organized, efficient, and capable of multitasking to meet tight deadlines while ensuring high-quality deliverables. Morningstar DBRS is an equal opportunity employer committed to empowering investor success and driving innovation in the credit ratings industry.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
The ideal candidate for this position should have previous experience in building data science/algorithm-based products, which would be a significant advantage. Experience in handling healthcare data is also desired. An educational qualification of a Bachelor's/Master's in computer science, data science, or a related subject from a reputable institution is required. With typically 7-9 years of experience in the industry, the candidate should have a strong background in developing data science models and solutions. The ability to quickly adapt to new programming languages, technologies, and frameworks is essential, as is a deep understanding of data structures and algorithms. The candidate should also have a proven track record of implementing end-to-end data science modeling projects and providing guidance and thought leadership to the team. Experience in a consulting environment with a hands-on attitude is preferred.
As a Data Science Lead, the primary responsibility will be to lead a team of analysts, data scientists, and engineers to deliver end-to-end solutions for pharmaceutical clients. The candidate is expected to participate in client proposal discussions with senior stakeholders and provide technical thought leadership. Expertise in all phases of model development is required, including exploratory data analysis, hypothesis testing, feature creation, dimension reduction, model training, selection, validation, and deployment. A deep understanding of statistical and machine learning methods such as logistic regression, SVM, decision trees, random forests, neural networks, and regression is essential. Mathematical knowledge of correlation/causation, classification, recommenders, probability, stochastic processes, and NLP, and their practical implementation to solve business problems, is necessary. The candidate should also be able to implement ML models in an optimized and sustainable framework and gain business understanding in the healthcare domain to develop relevant analytics use cases.
In terms of technical skills, the candidate should have expert-level proficiency in programming languages like Python/SQL, along with working knowledge of relational SQL and NoSQL databases such as Postgres and Redshift. Extensive knowledge of predictive and machine learning models, NLP techniques, deep learning, and unsupervised learning is required. Familiarity with data structures, pre-processing, feature engineering, sampling techniques, and statistical analysis is important. Exposure to open-source tools, cloud platforms like AWS and Azure, AI tools like LLM models, and visualization tools like Tableau and PowerBI is preferred.
If you do not meet every job requirement, the company encourages candidates to apply anyway, as they are dedicated to building a diverse, inclusive, and authentic workplace. Your excitement for the role and potential fit may make you the right candidate for this position or others within the company.
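As a loose illustration of the model-development phases listed above, here is a compact scikit-learn sketch on synthetic data covering feature scaling, a train/validation split, a logistic regression fit, and held-out evaluation. It is not the team's actual workflow, just a minimal example of the named steps.

```python
# Minimal sketch (synthetic data, not from the posting): a compact version of the
# model-development loop the role describes - feature prep, train/validation split,
# logistic regression fit, and a held-out evaluation.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a prepared feature matrix and binary outcome.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=42)

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.25, random_state=42)

# Scale features, then fit a regularized logistic regression.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Validate on the held-out split before any deployment decision.
auc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print(f"Validation AUC: {auc:.3f}")
```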
Posted 1 week ago
9.0 - 12.0 years
14 - 24 Lacs
Gurugram
Remote
We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines with a strong focus on time series forecasting and upsert-ready architectures. This role requires end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, and BI delivery. The ideal candidate must be highly proficient in AWS data services, PySpark, and versioned storage formats like Apache Hudi/Iceberg, and must understand the nuances of data quality and observability in large-scale analytics systems.
Role & responsibilities
- Design and implement data lake zoning (Raw, Clean, Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-friendly ETL pipelines using Apache Hudi or Iceberg (see the sketch after this list).
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modelling.
- Optimize Athena datasets with partitioning, CTAS queries, and metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate robust data quality checks using custom logs, AWS CloudWatch, or other DQ tooling.
- Design and manage a forecast feature registry with metrics versioning and traceability.
- Collaborate with BI and business teams to finalize schema design and deliverables for dashboard consumption.
Preferred candidate profile
- 9-12 years of experience in data engineering.
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and Glue Data Catalog.
- Strong command over PySpark, dbt-core, CTAS query optimization, and partition strategies.
- Working knowledge of Apache Hudi, Iceberg, or Delta Lake for versioned ingestion.
- Experience in S3 metadata tagging and scalable data lake design patterns.
- Expertise in feature engineering and forecasting dataset preparation (lags, trends, windows).
- Proficiency in Git-based workflows (Bitbucket), CI/CD, and deployment automation.
- Strong understanding of time series KPIs, such as revenue forecasts, occupancy trends, or demand volatility.
- Data observability best practices including field-level logging, anomaly alerts, and classification tagging.
- Experience with statistical forecasting frameworks such as Prophet, GluonTS, or related libraries.
- Familiarity with Superset or Streamlit for QA visualization and UAT reporting.
- Understanding of macroeconomic datasets (USDA, Circana) and third-party data ingestion.
- Independent, critical thinker with the ability to design for scale and evolving business logic.
- Strong communication and collaboration with BI, QA, and business stakeholders.
- High attention to detail in ensuring data accuracy, quality, and documentation.
- Comfortable interpreting business-level KPIs and transforming them into technical pipelines.
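The following is a minimal sketch of the kind of upsert-friendly write the responsibilities above describe, using Apache Hudi from PySpark. The table name, keys, and S3 path are hypothetical, and it assumes a Spark session with the Hudi bundle available.

```python
# Minimal sketch (assumes Spark with the Hudi bundle on the classpath and a
# hypothetical S3 path/table name): upsert a batch of records into a Hudi table
# so downstream Athena queries see versioned, deduplicated data.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hudi-upsert-demo")
         .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .getOrCreate())

updates = spark.createDataFrame(
    [("store-1", "2024-01-01", 1250.0), ("store-2", "2024-01-01", 980.5)],
    ["store_id", "sale_date", "revenue"],
)

hudi_options = {
    "hoodie.table.name": "sales_modeled",                    # hypothetical table name
    "hoodie.datasource.write.recordkey.field": "store_id",
    "hoodie.datasource.write.precombine.field": "sale_date",
    "hoodie.datasource.write.partitionpath.field": "sale_date",
    "hoodie.datasource.write.operation": "upsert",
}

(updates.write.format("hudi")
 .options(**hudi_options)
 .mode("append")
 .save("s3a://example-datalake/modeled/sales_modeled/"))     # hypothetical bucket
```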
Posted 1 week ago
8.0 - 13.0 years
32 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Lead AWS Data Engineer with team-handling experience. Skills: AWS, Python, SQL, Spark, Airflow, Athena, API integration. Notice period: immediate to 15 days. Location: Bangalore/Hyderabad/Coimbatore/Chennai.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
You should have experience in designing and building serverless data lake solutions using a layered components architecture. This includes expertise in the ingestion, storage, processing, security & governance, data cataloguing & search, and consumption layers. Proficiency in AWS serverless technologies like Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary. Additionally, familiarity with AWS environment setup and configuration is expected. A minimum of 6 years of relevant experience, with at least 3 years building solutions using AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
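As a small illustration of orchestrating one step of such a serverless pipeline, the sketch below starts an AWS Glue job with boto3 and polls for completion. The job name, region, and argument are assumptions for illustration, not details from the posting.

```python
# Minimal sketch (assumes AWS credentials are configured and a hypothetical Glue
# job named "clean-zone-etl" exists): start a Glue job run and poll until it finishes,
# the kind of step a Step Functions state machine would otherwise orchestrate.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # assumed region

run = glue.start_job_run(
    JobName="clean-zone-etl",                                       # hypothetical job name
    Arguments={"--source_prefix": "s3://example-raw-zone/2024/"},   # hypothetical argument
)

run_id = run["JobRunId"]
while True:
    state = glue.get_job_run(JobName="clean-zone-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue job finished with state: {state}")
        break
    time.sleep(30)  # poll every 30 seconds
```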
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
navi mumbai, maharashtra
On-site
Seekify Global is looking for an experienced and driven Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a strong background in designing and implementing metadata and data catalog solutions, specifically in AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer, you will play a crucial role in improving data discoverability, governance, and lineage across the organization's data assets.
Your key responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for diverse data assets, integrating the data catalog with AWS-based storage solutions, collaborating with various project teams to define metadata standards and processes, developing automation scripts for metadata management, working closely with other data professionals to ensure data accuracy, and implementing access controls to comply with data privacy standards.
The ideal candidate should possess at least 7-8 years of experience in data engineering or metadata management roles, with proven expertise in implementing data catalog solutions within AWS environments. Strong knowledge of AWS services such as Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation is essential. Proficiency in Python, SQL, and automation scripting for metadata pipelines is required, along with familiarity with data governance and compliance standards. Experience with BI tools and third-party catalog tools is a plus. Preferred qualifications include AWS certifications, experience with data catalog tools like Alation, Collibra, or Informatica EDC, exposure to data quality frameworks and stewardship practices, and knowledge of data migration processes. This is a full-time position that requires in-person work.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You may be assessed on key critical skills relevant for success in the role, such as risk and control, change and transformation, business acumen, strategic thinking, and digital technology, as well as job-specific skill sets.
To be successful as a Cloud Data Engineer, you should have experience with:
- AWS Cloud technology for data processing and a good understanding of AWS architecture.
- Compute and networking services like EC2, Lambda, Auto Scaling, and VPC.
- Storage and container services like ECS, S3, DynamoDB, RDS.
- Management & governance services such as KMS, IAM, CloudFormation, CloudWatch, CloudTrail.
- Analytics services such as Glue, Athena, Crawler, Lake Formation, Redshift.
- Solution delivery for data processing components in larger end-to-end projects.
Desirable skill sets/good to have:
- AWS Certified professional.
- Experience in data processing on Databricks and Unity Catalog.
- Ability to drive projects technically with right-first-time deliveries within schedule and budget.
- Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well.
- Understanding of different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs.
- Experienced with planning, estimating, organizing, and working on multiple projects.
This role will be based out of Pune.
Purpose of the role:
To build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.
Analyst Expectations:
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
surat, gujarat
On-site
At devx, we specialize in assisting some of India's most innovative brands in unlocking growth opportunities through AI-powered and cloud-native solutions developed in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to addressing real-world business challenges by leveraging cutting-edge technology.
We are currently seeking a proactive and customer-focused AWS Solutions Architect to become part of our team. In this position, you will collaborate directly with clients to craft scalable, secure, and economical cloud architectures that effectively tackle significant business obstacles. Your role will involve bridging the gap between business requirements and technical implementation, establishing yourself as a reliable advisor to our clients.
Key Responsibilities
- Engage with clients to comprehend their business goals and transform them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and collaborate closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, integrating them into solution designs.
- Collaborate with various teams, including sales, product, and engineering, to provide comprehensive solutions.
We are looking for individuals with the following qualifications:
- Minimum of 2 years of experience in designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services like EC2, S3, Lambda, RDS, API Gateway, IAM, VPC, etc.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, Kinesis, etc.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, Lambda, etc.
- Excellent written and verbal communication skills in English.
- Comfortable in client-facing capacities, with the ability to lead technical dialogues and establish credibility with stakeholders.
- Capability to strike a balance between technical intricacies and business context, effectively communicating value to decision-makers.
Location: Surat, Gujarat
Note: This is an on-site role in Surat, Gujarat. Please apply only if you are willing to relocate.
Posted 1 week ago
6.0 - 10.0 years
9 - 14 Lacs
Pune
Work from Office
Role & responsibilities
- Design, build, and maintain scalable data pipelines on AWS using services like Glue, Lambda, EMR, S3, Redshift, and Athena (a sketch of one such Glue step follows this list).
- Develop and optimize ETL/ELT processes for data ingestion, transformation, and loading from diverse sources.
- Collaborate with data analysts, scientists, and other stakeholders to understand data needs and deliver solutions.
- Implement data quality checks and monitoring solutions to ensure integrity and reliability.
- Work with structured and unstructured data from cloud and on-premise sources.
- Develop and manage infrastructure as code using tools like Terraform or CloudFormation.
- Maintain data documentation and metadata cataloging using tools like the AWS Glue Data Catalog.
- Ensure best practices in security, scalability, and performance in all data solutions.
- Troubleshoot performance and data integrity issues.
Preferred candidate profile
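Below is a minimal sketch of what one Glue ETL step in such a pipeline might look like: a PySpark Glue job script that reads a catalog table, applies a simple null-key data quality filter, and writes Parquet to S3. Database, table, and bucket names are hypothetical placeholders.

```python
# Minimal sketch of an AWS Glue ETL job script (PySpark): read from the Glue Data
# Catalog, drop rows with null keys as a simple data quality step, and write Parquet
# to S3. Database, table, and bucket names are hypothetical placeholders.
import sys
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog (hypothetical names).
orders = glue_context.create_dynamic_frame.from_catalog(database="sales_raw", table_name="orders")

# Simple data quality step: keep only rows that have an order_id.
clean = Filter.apply(frame=orders, f=lambda row: row["order_id"] is not None)

# Write the cleaned data to the curated zone as Parquet (hypothetical bucket).
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/orders/"},
    format="parquet",
)
job.commit()
```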
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Dynatrace Developer/Consultant, you will be responsible for setting up and maintaining monitoring systems to track the health and performance of data pipelines. Your role will involve configuring alerts and notifications to promptly identify and respond to issues or anomalies in data pipelines. You will develop procedures and playbooks for incident response and resolution, collaborating with data engineers to optimize data flows and processing. Your experience in working with data, ETL, data warehouse and BI projects will be invaluable as you continuously monitor and analyze pipeline performance to identify bottlenecks and areas for improvement. Implementing logging mechanisms and error handling strategies will be crucial to capture and analyze pipeline failures for quick detection and troubleshooting.
Working closely with data engineers and data analysts, you will monitor data quality metrics, detect data anomalies, and develop processes to address data quality issues. Forecasting resource requirements based on data growth and usage patterns will ensure that pipelines can handle increasing data volumes without performance degradation. Developing and maintaining dashboards and reports to visualize key pipeline performance metrics will provide stakeholders with insights into system health and data flow. Automating monitoring tasks and developing tools for streamlined management and observability of data pipelines will be part of your responsibilities.
Ensuring data pipeline observability aligns with security and compliance standards, such as data privacy regulations and industry best practices, will be crucial. You will document monitoring processes, best practices, and system configurations, sharing knowledge with team members to improve overall data pipeline reliability and efficiency. Collaborating with cross-functional teams, including data engineers, data scientists, and IT operations, you will troubleshoot issues and implement improvements. Keeping abreast of the latest developments in data pipeline monitoring and observability technologies and practices will enable you to recommend and implement advancements.
Knowledge of AWS Glue, S3, and Athena is a nice-to-have, along with experience in JIRA and knowledge of any programming language such as Python, Java, or Scala. This is a full-time position with a Monday to Friday schedule and an in-person work location.
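As one plausible example of the alert configuration this role describes, the sketch below creates a CloudWatch alarm on a Glue job failure metric and routes it to an SNS topic. The metric, dimensions, and topic ARN are assumptions for illustration; a real setup (or a Dynatrace-based one) would differ.

```python
# Minimal sketch: create a CloudWatch alarm that notifies an SNS topic when a Glue
# job's failed-task count rises. The namespace, metric, dimensions, and topic ARN are
# assumptions for illustration, not details from the posting.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # assumed region

cloudwatch.put_metric_alarm(
    AlarmName="etl-pipeline-failures",                          # hypothetical alarm name
    Namespace="Glue",                                           # assumed metric namespace
    MetricName="glue.driver.aggregate.numFailedTasks",          # assumed Glue metric
    Dimensions=[{"Name": "JobName", "Value": "nightly-load"}],  # hypothetical job
    Statistic="Sum",
    Period=300,                 # evaluate over 5-minute windows
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:pipeline-alerts"],  # hypothetical topic
)
print("Alarm configured.")
```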
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Bharatpur
Work from Office
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
Posted 1 week ago
0.0 - 2.0 years
6 - 8 Lacs
Pune
Work from Office
Job Title: Systems Engineer
Location: Pune, India
About This Role:
The Command Center team is a group of our Engineers that keep Comscore's Products and Infrastructure running by tracking and resolving issues, supporting systems, and responding to client queries. We're responsible for supporting the servers, cloud services, network, and storage devices across all of Comscore's data centers and cloud environments and responding to both internal and client queries 24/7. The Command Center team is hiring a System Engineer - Command Center to provide 24x7 support. This role requires someone with strong technical and communication skills in order to be effective. This team will monitor and troubleshoot issues within the IT Infrastructure environment. These environments exist on-premise as well as in AWS and are a mixture of physical and virtual. This position will also be responsible for working with multiple teams globally to create support and escalation procedures based on the impact on the business.
What You'll Do:
- Remote management and monitoring of 24x7 Command Center operations.
- Incident Management for all company-wide major incidents.
- Monitor the data ingestion, data processing, and delivery processes for all Comscore's products.
- Provide L1 support for servers hosted on AWS and on-premises.
- Manage Grafana and Prometheus across clouds.
- Patch management for Windows and Linux servers, including SQL servers.
- Ability to work with SOPs.
- Troubleshoot, prioritize, and escalate issues to the concerned technical teams.
- Ability to communicate severe issues.
- Provide on-call support with excellent English language communication skills, both verbally and in writing.
- User management in LDAP.
- Experience and/or comfort with communicating with employees and clients around the world, primarily in the US, at all levels of the organization.
- Help in process improvement and documentation.
What You'll Need:
- Bachelor's Degree in Computer Science or a related field.
- 2 to 4 years of infrastructure or product operations experience.
- Excellent written and verbal communication skills.
- Experience in monitoring and service management tools like Nagios, JIRA, and PagerDuty.
- Basic experience working on AWS-related services like CloudWatch, Athena, S3, EMR, etc.
- Basic experience in managing/administering observability tools like Grafana and Prometheus.
- Basic experience in working with any database and query language.
- Ability to automate different server admin tasks using PowerShell, Bash, etc.
- Experience in Microsoft-based server operating systems: 2012, 2016, 2019.
- Understanding of ITIL: Incident Management, Problem Management.
- Attention to detail.
- Ability to follow complex and detailed instructions.
- Proactive problem-solving skills.
- Certifications in the related field are a plus.
Benefits:
- Medical: Comscore offers a collective Private Medical Insurance scheme which is 100% covered by Comscore. The benefit is applicable to employees, an employee's spouse, up to two children, and parents.
- Pension: Provident Fund: Comscore bears both the employee and employer contributions.
Time Off
- Annual Leave: Comscore offers market-competitive annual leave of 26 Annual Leave Days (8 Casual and 18 Privilege), following local guidelines and practices.
- National Holidays and Festival Holidays: 10 Days.
- Sick Leave: 10 Days.
- Additional Leave: Paternity, Bereavement, Marriage, Maternity, Additional Pregnancy / Birth Related Leave.
- Christmas / New Year Paid Leave: Comscore offers a week of Company paid leave over the Christmas / New Year period.
- Summer Hours: Comscore has a culture that rewards employees for their hard work. When you work hard, you need time to recharge and refresh. Early releases on Fridays are subject to manager approval.
- Internal Career Development Opportunities (minimum of 6 months tenure in the current position and in discussion with supervisors).
- Access to hundreds of professional e-learning courses, specifically created for Comscore.
- Be creative: You don't have to follow the norm to be successful; we encourage you to think outside the box. Our culture is built on encouraging innovative ideas, communication and joint success.
- Informal Work Atmosphere: We believe in getting the job done in a comfortable, casual environment!
- The ability to become a truly global engineer, with exposure to markets across the world. With more than 30 offices around the world, many Comscore teams work together across locations.
About Comscore:
At Comscore, we're pioneering the future of cross-platform media measurement, arming organizations with the insights they need to make decisions with confidence. Central to this aim are our people, who work together to simplify the complex on behalf of our clients and partners. Though our roles and skills are varied, we're united by our commitment to five underlying values: Integrity, Velocity, Accountability, Teamwork, and Servant Leadership. If you're motivated by big challenges and interested in helping some of the largest and most important media properties and brands navigate the future of media, we'd love to hear from you.
This will be a foundational role on our Pune-based Engineering team during a time of exponential growth for Comscore in Pune. The candidate will work with Comscore teams around the world on work vital to the future of Comscore and our clients.
Comscore (NASDAQ: SCOR) is a trusted partner for planning, transacting, and evaluating media across platforms. With a data footprint that combines digital, linear TV, over-the-top, and theatrical viewership intelligence with advanced audience insights, Comscore allows media buyers and sellers to quantify their multiscreen behavior and make business decisions with confidence. A proven leader in measuring digital and set-top box audiences and advertising at scale, Comscore is the industry's emerging, third-party source for reliable and comprehensive cross-platform measurement. To learn more about Comscore, please visit Comscore.com.
Comscore is committed to creating an inclusive culture, encouraging diversity.
Posted 1 week ago
10.0 - 15.0 years
16 - 27 Lacs
Pune
Work from Office
Dear Candidate,
This is with reference to an opportunity for Lead - AWS Data Engineering professionals. Please find the job description below.
Responsibilities:
- Lead the design and implementation of AWS-based data architectures and pipelines.
- Architect and optimize data solutions using AWS services such as S3, Redshift, Glue, EMR, and Lambda.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in AWS data environments.
- Stay updated on the latest AWS technologies and industry trends.
Key Technical Skills & Responsibilities
- Experience in design and development of cloud data platforms using AWS services.
- Must have experience in design and development of data lake / data warehouse / data analytics solutions using AWS services like S3, Lake Formation, Glue, Athena, EMR, Lambda, Redshift.
- Must be aware of AWS access control and data security features like VPC, IAM, Security Groups, KMS, etc.
- Must be good with Python and PySpark for data pipeline building.
- Must have data modeling experience, including S3 data organization.
- Must have an understanding of Hadoop components, NoSQL databases, graph databases and time series databases, and the AWS services available for those technologies.
- Must have experience working with structured, semi-structured and unstructured data.
- Must have experience with streaming data collection and processing; Kafka experience is preferred.
- Experience migrating data warehouse / big data applications to AWS is preferred.
- Must be able to use GenAI services (like Amazon Q) for productivity gains.
Eligibility Criteria:
- Bachelor's degree in computer science, Data Engineering, or a related field.
- Extensive experience with AWS data services and tools.
- AWS certification (e.g., AWS Certified Data Analytics - Specialty).
- Experience with machine learning and AI integration in AWS environments.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Proven leadership experience in managing technical teams.
- Excellent problem-solving and communication skills.
Skill - Senior Tech Lead AWS Data Engineering
Location - Pune
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Nellore
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer
EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.
Values
- Focus on the user and all else will follow.
- Hire for intent and not for experience.
- Bootstrapping gives you the freedom to serve the customer and the team instead of an investor.
- Internet and technology untap the niches.
- Action oriented, integrity, freedom, strong communicators, and responsibility.
- All things equal, one with high agency wins.
EssentiallySports is a top 10 sports media platform in the U.S., generating over a billion pageviews a year and 30M+ monthly active users. This massive traffic fuels our data-driven culture, allowing us to build owned audiences at scale through organic growth, a model we take pride in, with zero CAC. The next phase of ES growth is around the newsletter initiative; in less than 9 months, we've built a robust newsletter brand with 700,000+ highly engaged readers and impressive performance metrics:
- 5 newsletter brands
- 700k+ subscribers
- Open rates of 40%-46%
The role is for a data engineer with growth and business acumen, in the "permissionless growth" team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end (see the sketch below).
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
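To make the Web-to-Athena-to-Email pipeline concrete, here is a minimal sketch of the Athena-to-Email leg: query a hypothetical events table with Athena via boto3 and send a digest through Amazon SES. Table, bucket, and addresses are placeholders, not details from the posting.

```python
# Minimal sketch of the Athena-to-Email leg of a Web -> Athena -> Email pipeline:
# pull yesterday's top articles from a hypothetical Athena table and email a digest
# via Amazon SES. Table, bucket, and email addresses are assumptions for illustration.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region
ses = boto3.client("ses", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="""
        SELECT article_title, SUM(pageviews) AS views
        FROM analytics.web_events            -- hypothetical database.table
        WHERE event_date = date_add('day', -1, current_date)
        GROUP BY article_title ORDER BY views DESC LIMIT 5
    """,
    ResultConfiguration={"OutputLocation": "s3://example-results/digests/"},  # hypothetical bucket
)
query_id = execution["QueryExecutionId"]

# Wait for the query to finish before reading results.
while athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"] \
        not in ("SUCCEEDED", "FAILED", "CANCELLED"):
    time.sleep(2)

rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"][1:]  # skip header row
digest = "\n".join(r["Data"][0]["VarCharValue"] + ": " + r["Data"][1]["VarCharValue"] for r in rows)

ses.send_email(
    Source="newsletter@example.com",                      # hypothetical verified sender
    Destination={"ToAddresses": ["subscriber@example.com"]},
    Message={"Subject": {"Data": "Daily top articles"},
             "Body": {"Text": {"Data": digest}}},
)
```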
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Ballari
Work from Office
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago
4.0 - 6.0 years
9 - 13 Lacs
Mangaluru
Work from Office
Role Overview: EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role; your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics, turning insights into actions that drive sustainable audience and revenue growth.
Key Responsibilities
- Own the entire web user journey, from page discovery to conversion to retention.
- Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
- Optimize high-traffic areas of the site (landing pages, article CTAs, newsletter modules) for conversion and time-on-page.
- Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows (see the sketch below).
- Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
- Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
- Monitor analytics pipelines from GA4 to Athena to dashboards to derive insights and drive decision-making.
- Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
- Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.
Who you are
- 4+ years of experience in product growth, web engagement, or analytics-heavy roles.
- Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
- Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
- Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
- Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
- Thrive in ambiguity and love building things from scratch.
- Passionate about AI, automation, and building sustainable growth engines.
- Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
- A team player who collaborates across engineering, growth, and editorial teams.
- Proactive and solution-oriented, always spotting opportunities for real growth.
- Thrive in a fast-moving environment, taking ownership and driving impact.
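As a small illustration of the experimentation work described above, the sketch below runs a two-proportion z-test on hypothetical A/B conversion counts using statsmodels, the kind of significance check an experimentation pipeline might apply before shipping a variant.

```python
# Minimal sketch (synthetic numbers, not from the posting): a two-proportion z-test
# for an A/B test on signup conversion, the kind of check an experimentation
# pipeline would run before shipping a variant.
from statsmodels.stats.proportion import proportions_ztest

conversions = [480, 530]   # hypothetical signups for control vs. variant
visitors = [10000, 10000]  # hypothetical traffic per arm

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Variant conversion rate differs significantly from control.")
else:
    print("No significant difference detected; keep the experiment running.")
```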
Posted 1 week ago
4.0 - 6.0 years
9 - 13 Lacs
Kolhapur
Work from Office
Role Overview: EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role; your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics, turning insights into actions that drive sustainable audience and revenue growth.
Key Responsibilities
- Own the entire web user journey, from page discovery to conversion to retention.
- Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
- Optimize high-traffic areas of the site (landing pages, article CTAs, newsletter modules) for conversion and time-on-page.
- Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
- Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
- Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
- Monitor analytics pipelines from GA4 to Athena to dashboards to derive insights and drive decision-making.
- Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
- Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.
Who you are
- 4+ years of experience in product growth, web engagement, or analytics-heavy roles.
- Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
- Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
- Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
- Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
- Thrive in ambiguity and love building things from scratch.
- Passionate about AI, automation, and building sustainable growth engines.
- Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
- A team player who collaborates across engineering, growth, and editorial teams.
- Proactive and solution-oriented, always spotting opportunities for real growth.
- Thrive in a fast-moving environment, taking ownership and driving impact.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Dindigul
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer
EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.
Values
- Focus on the user and all else will follow.
- Hire for intent and not for experience.
- Bootstrapping gives you the freedom to serve the customer and the team instead of an investor.
- Internet and technology untap the niches.
- Action oriented, integrity, freedom, strong communicators, and responsibility.
- All things equal, one with high agency wins.
The role is for a data engineer with growth and business acumen, in the permissionless growth team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Kota
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer
EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.
Values
- Focus on the user and all else will follow.
- Hire for intent and not for experience.
- Bootstrapping gives you the freedom to serve the customer and the team instead of an investor.
- Internet and technology untap the niches.
- Action oriented, integrity, freedom, strong communicators, and responsibility.
- All things equal, one with high agency wins.
The role is for a data engineer with growth and business acumen, in the permissionless growth team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Kannur
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer
EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.
Values
- Focus on the user and all else will follow.
- Hire for intent and not for experience.
- Bootstrapping gives you the freedom to serve the customer and the team instead of an investor.
- Internet and technology untap the niches.
- Action oriented, integrity, freedom, strong communicators, and responsibility.
The role is for a data engineer with growth and business acumen, in the permissionless growth team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Davangere
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer
EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.
Values
- Focus on the user and all else will follow.
- Hire for intent and not for experience.
- Bootstrapping gives you the freedom to serve the customer and the team instead of an investor.
- Internet and technology untap the niches.
- Action oriented, integrity, freedom, strong communicators, and responsibility.
The role is for a data engineer with growth and business acumen, in the permissionless growth team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Eluru
Work from Office
The role is for a data engineer with growth and business acumen, in the permissionless growth team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.
Responsibilities
- Owning the data pipeline from Web to Athena to Email, end-to-end.
- You'll make the key decisions and see them through to successful user sign-up.
- Use data science to find real insights, which translate to user engagement.
- Pushing changes every week day.
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value.
Who are you?
- 2+ years of professional data engineering experience.
- Someone who spends time thinking about business insights as much as they do on engineering.
- Is a self-starter, and drives initiatives.
- Is excited to pick up AI, and integrate it at various touch points.
- You have strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better).
- Have an awareness of Athena, Glue, Jupyter, or intent to pick them up.
- You're comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools.
- Collaborative and want to see the team succeed in its goals.
- Problem solving, proactive and solution-oriented mindset, to spot opportunities and translate them into real growth.
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity.
- Excited to join a lean team in a big company that moves quickly.
Posted 1 week ago