8.0 years
30 - 38 Lacs
Gurgaon
Remote
Role: AWS Data Engineer
Location: Gurugram
Mode: Hybrid
Type: Permanent

Job Description: We are seeking a talented and motivated Data Engineer with the requisite years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and using AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and optimize end-to-end pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or data lakes.
- Cloud Infrastructure Management: Implement and manage data processing and storage solutions on AWS using services such as S3, Redshift, Lambda, Glue, and Kinesis.
- Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis.
- Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective, scalable data workflows.
- Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages.
- Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly.
- Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to meet compliance and security standards.
- Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability.

Qualifications:

Essential Skills:
- Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets.
- AWS: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2.
- ETL Processes: Strong understanding of ETL concepts, tools, and frameworks; experience with data integration, cleansing, and transformation.
- Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java).
- Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions such as AWS Redshift, Snowflake, or similar platforms.
- Data Modeling: Experience designing data models, schemas, and data architecture for analytical systems.
- Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines.
- Problem-Solving: Strong troubleshooting skills, with the ability to optimize performance and resolve technical issues across the data pipeline.

Desirable Skills:
- Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies.
- Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies.
- Data Security: Experience implementing cloud security best practices and managing data privacy requirements.
- Data Streaming: Familiarity with streaming technologies such as AWS Kinesis or Apache Kafka.
- Business Intelligence Tools: Experience with BI tools (Tableau, QuickSight) for visualization and reporting.
- Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.).

Job Type: Permanent
Pay: ₹3,000,000.00 - ₹3,800,000.00 per year
Benefits: Work from home
Schedule: Day shift, Monday to Friday
Experience:
- AWS Glue Catalog: 4 years (Required)
- Data Engineering: 5 years (Required)
- AWS CDK, CloudFormation, Lambda, Step Functions: 5 years (Required)
- AWS Elastic MapReduce (EMR): 4 years (Required)
Work Location: In person
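The extract-transform-load responsibilities described above follow a shape that can be sketched in a few lines of Python. This is only an illustration: sqlite3 stands in for a warehouse such as Redshift, the extract step would really read from S3 (e.g. via boto3), and all table names and sample records are hypothetical.

```python
import sqlite3

def extract():
    # Stand-in for reading raw objects landed in S3; data is invented.
    return [
        {"order_id": "1", "amount": "120.50", "region": " north "},
        {"order_id": "2", "amount": "80.00", "region": "South"},
        {"order_id": "2", "amount": "80.00", "region": "South"},  # duplicate row
    ]

def transform(rows):
    # Cleanse: drop duplicate order_ids, cast amounts, normalise regions.
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append((r["order_id"], float(r["amount"]), r["region"].strip().lower()))
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
# total == (2, 200.5): one duplicate dropped, amounts summed
```

A production pipeline would add incremental loads, retries, and monitoring, but the extract/transform/load separation stays the same.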
Posted 3 weeks ago
2.0 years
10 Lacs
Gurgaon
On-site
Location: Gurgaon, India

We are seeking an Associate Consultant to join our India team based in Gurgaon. This role at Viscadia offers a unique opportunity to gain hands-on experience in the healthcare industry, with comprehensive training in core consulting skills such as critical thinking, market analysis, and executive communication. Through project work and direct mentorship, you will develop a deep understanding of healthcare business dynamics and build a strong foundation for a successful consulting career.

ROLES AND RESPONSIBILITIES

Technical Responsibilities
- Design and build full-stack forecasting and simulation platforms using modern web technologies (e.g., React, Node.js, Python) hosted on AWS infrastructure (e.g., Lambda, EC2, S3, RDS, API Gateway).
- Automate data pipelines and model workflows using Python for data preprocessing, time-series modeling (e.g., ARIMA, exponential smoothing), and backend services.
- Develop and enhance product positioning, messaging, and resources that support the differentiation of Viscadia from its competitors.
- Conduct research and focus groups to elicit key insights that strengthen positioning and messaging.
- Replace legacy Excel/VBA tools with scalable, cloud-native applications, integrating dynamic reporting features and user controls via a web UI.
- Use SQL and cloud databases (e.g., AWS RDS, Redshift) to query and transform large datasets as inputs to models and dashboards.
- Develop interactive web dashboards using frameworks like React + D3.js, or embed tools like Power BI/Tableau into web portals to communicate insights effectively.
- Implement secure, modular APIs and microservices to support modularity, scalability, and seamless data exchange across platforms.
- Ensure cost-effective and reliable deployment of solutions via AWS services, CI/CD pipelines, and infrastructure-as-code (e.g., CloudFormation, Terraform).

Business Responsibilities
- Support the development and enhancement of forecasting and analytics platforms tailored to the needs of pharmaceutical clients across various therapeutic areas.
- Build an in-depth understanding of pharma forecasting concepts, disease areas, treatment landscapes, and market dynamics to contextualize forecasting models and inform platform features.
- Partner with cross-functional teams to ensure forecast deliverables align with client objectives, timelines, and decision-making needs.
- Contribute to a culture of knowledge sharing and continuous improvement by mentoring junior team members and helping codify best practices in forecasting and business analytics.
- Grow into a client-facing role, combining an understanding of commercial strategy with forecasting expertise to lead engagements and drive value for clients.

QUALIFICATIONS
- Bachelor’s degree (B.Tech/B.E.) from a premier engineering institute, preferably in Computer Science, Information Technology, Electrical Engineering, or related disciplines
- 2+ years of experience in full-stack development, with a strong focus on designing, developing, and maintaining AWS-based applications and services

SKILLS & TECHNICAL PROFICIENCIES

Technical Skills
- Proficient in Python, with practical experience using libraries such as pandas, NumPy, matplotlib/seaborn, and statsmodels for data analysis and statistical modeling
- Strong command of SQL for data querying, transformation, and seamless integration with backend systems
- Hands-on experience designing and maintaining ETL/ELT data pipelines, ensuring efficient and scalable data workflows
- Solid understanding of and applied experience with cloud platforms, particularly AWS; working familiarity with Azure and Google Cloud Platform (GCP)
- Full-stack web development expertise, including building and deploying modern web applications, web hosting, and API integration
- Proficient in Microsoft Excel and PowerPoint, with advanced skills in data visualization and delivering professional presentations

Soft Skills
- Excellent verbal and written communication skills, with the ability to engage both technical and non-technical stakeholders
- Strong analytical thinking and problem-solving abilities, with a structured, solution-oriented mindset
- Demonstrated ability to work independently as well as collaboratively within cross-functional teams
- Adaptable and proactive, with a willingness to thrive in a dynamic, fast-growing environment
- Genuine passion for consulting, with a focus on delivering tangible business value for clients

Domain Expertise (Good to have)
- Strong understanding of pharmaceutical commercial models, including treatment journeys, market dynamics, and key therapeutic areas
- Experience working with and interpreting industry-standard datasets such as IQVIA, Symphony Health, or similar secondary data sources
- Familiarity with product lifecycle management, market access considerations, and sales performance tracking metrics used across the pharmaceutical value chain
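The time-series methods named in the responsibilities above (ARIMA, exponential smoothing) are typically driven through a library such as statsmodels. As a self-contained illustration of the idea, simple exponential smoothing can be written out directly; the series and smoothing factor here are invented for the example.

```python
def exp_smooth(series, alpha=0.5):
    """Simple exponential smoothing: level = alpha*y + (1 - alpha)*previous level.

    Returns the final smoothed level, which serves as the
    one-step-ahead forecast. `alpha` in (0, 1] controls how much
    weight recent observations get.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical monthly demand series.
forecast = exp_smooth([100, 110, 105, 115], alpha=0.5)
# forecast == 110.0
```

In practice one would fit `alpha` (and trend/seasonal terms) with something like `statsmodels.tsa.holtwinters.ExponentialSmoothing` rather than hard-coding it.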
Posted 3 weeks ago
4.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities
- Experience with Kafka, Flume and the AWS tool stack (such as Redshift and Kinesis) is preferred.
- Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
- Experience in PySpark/Spark/Scala.
- Experience using software version control tools (Git, Jenkins, Apache Subversion).
- AWS certifications or other related professional technical certifications.
- Experience with cloud or on-premise middleware and other enterprise integration technologies.
- Experience writing MapReduce and/or Spark jobs.
- Demonstrated strength in architecting data warehouse solutions and integrating technical components.
- Good analytical skills with excellent knowledge of SQL.
- 4+ years of work experience with very large data warehousing environments.
- 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools.
- 4+ years of experience with data modelling concepts.
- 3+ years of Python and/or Java development experience.
- 3+ years’ experience in big data stack environments (EMR, Hadoop, MapReduce, Hive).
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills, both written and verbal, formal and informal.
- Ability to multi-task under pressure and work independently with minimal supervision.
- Must be a team player who enjoys working in a cooperative and collaborative team environment.
- Adaptable to new technologies and standards.

Skills And Attributes For Success
- Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates.
- Strong communication, presentation and team-building skills, with experience producing high-quality reports, papers, and presentations.
- Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
- Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have
- BE/BTech/MCA/MBA
- Minimum 4+ years of hands-on experience in one or more key areas
- Minimum 7+ years of industry experience

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
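The "MapReduce and/or Spark jobs" requirement above can be illustrated with the canonical toy example, a word count, sketched here in plain Python: a map phase emits (key, 1) pairs, a sort stands in for the shuffle, and a reduce phase sums per key. The input lines are invented for the example.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Sorting by key stands in for the shuffle/sort between map and reduce;
    # the reducer then sums the counts for each distinct word.
    pairs = sorted(pairs, key=itemgetter(0))
    return {k: sum(c for _, c in grp) for k, grp in groupby(pairs, key=itemgetter(0))}

counts = reduce_phase(map_phase(["big data big", "data lake"]))
# counts == {"big": 2, "data": 2, "lake": 1}
```

On EMR the same two-phase structure runs distributed across nodes, with the framework handling partitioning, the shuffle, and fault tolerance.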
Posted 3 weeks ago
3.0 years
6 - 9 Lacs
Ahmedabad
Remote
Job Title: Power BI Developer
Location: Ahmedabad, Gujarat (Preferred)
Experience Required: 3+ Years
Employment Type: Full-time (Immediate Joiners Preferred)

About IGNEK: At IGNEK, we specialize in remote staff augmentation and custom development solutions, offering expert teams in technologies like Liferay, AEM, Java, React, and Node.js. We help global clients meet their project goals efficiently by delivering innovative and scalable digital solutions.

Job Summary: We’re looking for an experienced Power BI Developer to join our analytics team at IGNEK. The ideal candidate will be responsible for transforming complex data into visually impactful dashboards and providing actionable insights for data-driven decision-making.

Key Responsibilities:
- Develop, maintain, and optimize interactive Power BI dashboards and reports.
- Write complex SQL queries to extract, clean, and join data from multiple sources, including data warehouses and APIs.
- Understand business requirements and collaborate with cross-functional teams to deliver scalable BI solutions.
- Ensure data accuracy and integrity across all reporting outputs.
- Create robust data models and DAX measures within Power BI.
- Work with data engineers and analysts to streamline data pipelines.
- Maintain documentation for all dashboards, definitions, and processes.
- (Optional) Use Python for automation, data manipulation, or API integration.

Requirements:
- 3+ years of experience in BI or analytics roles.
- Strong expertise in Power BI, including DAX, Power Query, and data modeling.
- Advanced SQL skills and experience with relational databases or cloud data warehouses (e.g., SQL Server, Redshift, Snowflake).
- Understanding of ETL processes and data quality management.
- Ability to communicate data-driven insights effectively to stakeholders.
- Bonus: Working knowledge of Python for scripting or automation.

Preferred Qualifications:
- Hands-on experience with Power BI Service, Power BI Gateway, or Azure.
- Exposure to agile methodologies and collaborative development teams.
- Familiarity with key business metrics across functions like sales, operations, or finance.

How to Apply: Please send your resume and a cover letter detailing your experience to
Job Type: Full-time
Pay: ₹600,000.00 - ₹900,000.00 per year
Benefits:
- Flexible schedule
- Leave encashment
- Provident Fund
- Work from home
Work Location: In person
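The "extract, clean, and join" SQL work described in the responsibilities above might look like the following sketch, with sqlite3 standing in for SQL Server/Redshift/Snowflake; all tables and figures are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales(region TEXT, amount REAL);
    CREATE TABLE targets(region TEXT, target REAL);
    INSERT INTO sales VALUES ('north', 120), ('north', 80), ('south', 50);
    INSERT INTO targets VALUES ('north', 150), ('south', 100);
""")

# Join actuals to targets and compute percentage attainment per region --
# the kind of measure a Power BI report would surface as a KPI card.
rows = conn.execute("""
    SELECT s.region,
           SUM(s.amount)                              AS actual,
           t.target,
           ROUND(SUM(s.amount) * 100.0 / t.target, 1) AS attainment_pct
    FROM sales s
    JOIN targets t ON t.region = s.region
    GROUP BY s.region, t.target
    ORDER BY s.region
""").fetchall()
# rows == [('north', 200.0, 150.0, 133.3), ('south', 50.0, 100.0, 50.0)]
```

In Power BI itself the attainment ratio would more often live as a DAX measure over the model, with SQL handling the upstream extraction and joins.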
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Experience: 5-8 Years

Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls.
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
- 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.
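Row-level security, mentioned in the responsibilities above, usually boils down to joining every query against an entitlement table keyed by the viewing user. A minimal sketch of the pattern, with sqlite3 standing in for the warehouse and all tables, emails, and figures invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE revenue(region TEXT, amount REAL);
    CREATE TABLE user_regions(user_email TEXT, region TEXT);
    INSERT INTO revenue VALUES ('east', 100), ('west', 200);
    INSERT INTO user_regions VALUES ('ana@example.com', 'east');
""")

def rows_for(user_email):
    # Every dashboard query is joined against the entitlement table,
    # so each user sees only the regions they are mapped to.
    return conn.execute("""
        SELECT r.region, r.amount
        FROM revenue r
        JOIN user_regions u ON u.region = r.region
        WHERE u.user_email = ?
    """, (user_email,)).fetchall()

ana_rows = rows_for('ana@example.com')
# ana_rows == [('east', 100.0)] -- the 'west' row is filtered out
```

BI tools typically inject the current user's identity into such a filter automatically; the entitlement-join shape stays the same.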
Posted 3 weeks ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Description: Data Engineer (Entry-Level)

About Amvion Labs
Amvion Labs is on a mission to help organizations unlock the full potential of their data using cutting-edge cloud and analytics technologies. As we expand our Data Science and Engineering team, we’re looking for dynamic young professionals ready to learn, innovate, and contribute to impactful projects across industries. This is your opportunity to work with Snowflake, modern cloud platforms (AWS/Azure), and advanced data engineering tools while being mentored by experienced leaders.

Key Responsibilities
· Building and maintaining data pipelines using Python, SQL, and ETL frameworks.
· Supporting data preparation, cleaning, and transformation for analytics and reporting.
· Designing and implementing data warehouse solutions on Snowflake.
· Developing and maintaining interactive dashboards using Power BI or Tableau for business insights.
· Helping optimize data queries and models for performance and scalability.
· Integrating data from multiple sources such as MySQL, APIs, CSVs, and cloud storage systems.
· Collaborating with data scientists, analysts, and business teams to solve real-world data challenges.
· Staying up to date with emerging trends in cloud data engineering, Snowflake, and big data technologies.

Required Skills & Qualifications
· Bachelor’s or Master’s degree in Computer Science, Data Analytics, or related fields.
· Strong foundation in Python (pandas, NumPy) and SQL.
· Understanding of database concepts (schemas, tables, views, stored procedures).
· Familiarity with ETL concepts and data transformation workflows.
· Basic knowledge of visualization tools (Power BI, Tableau, or Looker Studio).
· Excellent problem-solving and analytical thinking abilities.
· Exposure to cloud environments (AWS, Azure, GCP, or Oracle Cloud).
· Knowledge of Snowflake, Redshift, BigQuery, or similar cloud data warehouses (nice to have).
· Familiarity with big data tools (PySpark, Spark) or AI/ML concepts (nice to have).
· Academic projects, internships, or certifications in data engineering or analytics (nice to have).

Why Join Us?
· Hands-on training in Snowflake and cloud data engineering.
· Opportunity to work on live client projects from Day 1.
· Mentorship from seasoned Data Science and Cloud leaders.
· Dynamic, collaborative culture focused on continuous learning and innovation.
· Competitive salary with fast growth opportunities.
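The "data preparation, cleaning, and transformation" work listed above often starts as simply as reading a CSV, dropping bad rows, and casting types. A stdlib-only sketch, with the CSV contents invented for the example (a real pipeline would read from a file or cloud storage rather than an in-memory string):

```python
import csv
import io

# Hypothetical raw extract: one row has a missing score.
raw = io.StringIO("id,score\n1, 42\n2,\n3, 17\n")

cleaned = []
for row in csv.DictReader(raw):
    if not row["score"].strip():
        continue  # drop rows with missing scores
    cleaned.append({"id": int(row["id"]), "score": int(row["score"])})
# cleaned == [{"id": 1, "score": 42}, {"id": 3, "score": 17}]
```

With pandas the same cleanup collapses to a `read_csv` plus `dropna` and `astype`, but the logic is identical.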
Posted 3 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Lead Analytics Engineer We are seeking a talented, motivated and self-driven professional to join the HH Digital, Data & Analytics (HHDDA) organization and play an active role in the Human Health transformation journey to become the premier “Data First” commercial biopharma organization. As a Lead Analytics Engineer, you will be part of the HHDDA Commercial Data Solutions team, providing technical and data expertise for the development of analytical data products to enable data science & analytics use cases. In this role, you will create and maintain data assets/domains used in the commercial/marketing analytics space – developing best-in-class data pipelines and products, working closely with data product owners to translate data product requirements and user stories into development activities throughout all phases of design, planning, execution, testing, deployment and delivery. Your specific responsibilities will include: Design and implementation of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices Enable data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts and visualization developers on how to use these data models Develop analytical data products for reusability, governance and compliance by design Align with organization strategy and implement a semantic layer for analytics data products Support data stewards and other engineers in maintaining data catalogs, data quality measures and governance frameworks Education B.Tech / B.S., M.Tech / M.S.
or PhD in Computer Science, Engineering, Pharmaceuticals, Healthcare, Data Science, Business, or a related field Required Experience 8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets) High proficiency in SQL, Python and AWS Experience creating / adopting data models to meet requirements from Marketing, Data Science, Visualization stakeholders Experience with feature engineering Experience with cloud-based (AWS / GCP / Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.) Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot and low-code tools (e.g. Dataiku) Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders Experience in analytics use cases of pharmaceutical products and vaccines Experience in market analytics and related use cases Preferred Experience Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines Experience with Agile ways of working, leading or working as part of scrum teams Certifications in AWS and/or modern data technologies Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers and business stakeholders Experience with data visualization technologies (e.g., Power BI) Our Human Health Division maintains a “patient first, profits later” ideology.
The organization comprises sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Flexible Work Arrangements: Hybrid Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model Job Posting End Date: 04/30/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R323237
Posted 3 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
This role is for one of Weekday's clients Min Experience: 4 years Location: Bengaluru JobType: full-time Requirements Key Responsibilities As a Data Engineer, you will play a crucial role in designing and maintaining scalable and high-performance data systems. Your responsibilities will include: Data Pipeline Development and Management Design, build, test, and maintain efficient data pipelines and data management systems. Develop and manage ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to integrate data from diverse sources such as databases, APIs, and real-time streams. Data Modeling and Architecture Design data models and implement schemas for data warehouses and data lakes to support analytics and business operations. Optimize data storage, access, and performance for scalability and maintainability. Data Quality and Integrity Implement validation, cleansing, and monitoring to maintain data accuracy, consistency, and reliability. Define and enforce best practices and standards for data governance and quality. Infrastructure Management Manage and monitor key data infrastructure components including databases, data lakes, and distributed computing environments. Apply data security protocols and ensure proper access controls are in place. Automation and Optimization Automate data workflows and pipelines to improve reliability and performance. Continuously monitor and fine-tune systems for operational efficiency. Collaboration and Support Partner with data scientists, analysts, software engineers, and business stakeholders to gather requirements and provide scalable data solutions. Document processes, workflows, and system designs; support cross-functional teams with technical guidance. Technology Evaluation Stay current with emerging tools and technologies in the data engineering space. Evaluate and recommend new solutions to enhance data capabilities and performance.
Education And Experience Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, Data Science, or a related field. 5 to 7 years of experience in data engineering, software development, or a similar domain. Technical & Functional Competencies Required Skills & Qualifications Technical Proficiency Programming: Strong experience in Python and SQL. Databases: Proficient in relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases. Data Warehousing & Lakes: Hands-on experience with platforms like Snowflake, Redshift, BigQuery. ETL/ELT Tools: Proficiency with tools like Apache Airflow, AWS Glue, Azure Data Factory, Talend. Big Data: Working knowledge of Apache Spark or similar big data technologies. Cloud Platforms: Experience with AWS, Azure, or GCP for data engineering workflows. Data Modeling: Strong understanding of modeling techniques and best practices. API Integration: Ability to build and consume APIs for data integration. Version Control: Experience with Git or other version control systems. Soft Skills Analytical mindset with a strong problem-solving approach. Excellent communication skills for both technical and non-technical audiences. Team player with a collaborative work ethic. Detail-oriented with a commitment to data quality. Adaptability to new technologies and changing project requirements. Key Skills: ETL, Data Modeling, Data Architecture, Cloud Data Platforms, Python, SQL, Big Data, Data Warehousing, API Integration
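The ETL/ELT responsibilities described in this posting follow a standard extract–transform–load shape. A minimal sketch in plain Python, where the order records, field names, and cleansing rules are all hypothetical, and an in-memory dict stands in for the warehouse:

```python
from datetime import date

def extract(raw_rows):
    """Simulate pulling records from an API export or database query."""
    return list(raw_rows)

def transform(rows):
    """Cleanse and normalize: drop rows missing an id, cast types, standardize dates."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # validation step: reject incomplete records
        cleaned.append({
            "order_id": row["order_id"],
            "amount": round(float(row["amount"]), 2),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return cleaned

def load(rows, warehouse):
    """Append transformed rows to a target table; returns the row count loaded."""
    warehouse.setdefault("orders", []).extend(rows)
    return len(rows)

raw = [
    {"order_id": "A1", "amount": "19.999", "order_date": "2024-01-05"},
    {"order_id": None, "amount": "5.00", "order_date": "2024-01-06"},  # fails validation
]
wh = {}
loaded = load(transform(extract(raw)), wh)
```

In practice, extract would call a database driver or API client and load would write to a warehouse table; orchestrators such as Airflow schedule exactly this kind of function chain.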
Posted 3 weeks ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
The Data Architect is responsible for defining and leading Data Architecture, Data Quality, and Data Governance while ingesting, processing, and storing millions of rows of data per day. This hands-on role helps solve real big data problems. You will be working with our product, business, and engineering stakeholders, understanding our current ecosystems, and then building consensus to design solutions, write code and automation, define standards, establish best practices across the company, and build world-class data solutions and applications that power crucial business decisions throughout the organization. We are looking for an open-minded, structured thinker passionate about building systems at scale. Role Design, implement and lead Data Architecture, Data Quality, Data Governance Defining data modeling standards and foundational best practices Develop and evangelize data quality standards and practices Establish data governance processes, procedures, policies, and guidelines to maintain the integrity and security of the data Drive the successful adoption of organizational data utilization and self-serviced data platforms Create and maintain critical data standards and metadata that allow data to be understood and leveraged as a shared asset Develop standards and write template code for sourcing, collecting, and transforming data for streaming or batch processing Design data schemas, object models, and flow diagrams to structure, store, process, and integrate data Provide architectural assessments, strategies, and roadmaps for data management Apply hands-on subject matter expertise in the architecture and administration of Big Data platforms, Data Lake technologies (AWS S3/Hive), and experience with ML and Data Science platforms Implement and manage industry best practice tools and processes such as Data Lake, Databricks, Delta Lake, S3, Spark ETL, Airflow, Hive Catalog, Redshift, Kafka, Kubernetes, Docker, CI/CD Translate big data and analytics requirements
into data models that will operate at a large scale and high performance and guide the data analytics engineers on these data models Define templates and processes for the design and analysis of data models, data flows, and integration Lead and mentor Data Analytics team members in best practices, processes, and technologies in Data platforms Qualifications B.S. or M.S. in Computer Science, or equivalent degree 10+ years of hands-on experience in Data Warehouse, ETL, Data Modeling & Reporting 7+ years of hands-on experience in productionizing and deploying Big Data platforms and applications Hands-on experience working with: relational/SQL, distributed columnar data stores/NoSQL databases, time-series databases, Spark Streaming, Kafka, Hive, Delta, Parquet, Avro, and more Extensive experience in understanding a variety of complex business use cases and modeling the data in the data warehouse Highly skilled in SQL, Python, Spark, AWS S3, Hive Data Catalog, Parquet, Redshift, Airflow, and Tableau or similar tools Proven experience in building a custom Enterprise Data Warehouse or implementing tools like Data Catalogs, Spark, Tableau, Kubernetes, and Docker Knowledge of infrastructure requirements such as Networking, Storage, and Hardware Optimization with hands-on experience in Amazon Web Services (AWS) Strong verbal and written communication skills are a must; you should work effectively across internal and external organizations and virtual teams Demonstrated industry leadership in the fields of Data Warehousing, Data Science, and Big Data related technologies Strong understanding of distributed systems and container-based development using the Docker and Kubernetes ecosystem Deep knowledge of data structures and algorithms Experience working in large teams using CI/CD and agile methodologies
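Data-quality standards of the kind this role defines and evangelizes usually start as simple, measurable rules (completeness, uniqueness) evaluated against each batch. A sketch with invented rows and rule definitions:

```python
def check_completeness(rows, required):
    """Fraction of rows in which every required column is present and non-null."""
    if not rows:
        return 0.0
    ok = sum(1 for r in rows if all(r.get(c) is not None for c in required))
    return ok / len(rows)

def check_uniqueness(rows, key):
    """True if the key column holds no duplicate values (a primary-key rule)."""
    values = [r[key] for r in rows if key in r]
    return len(values) == len(set(values))

# Hypothetical batch arriving from an upstream source.
rows = [
    {"id": 1, "country": "IN"},
    {"id": 2, "country": None},   # incomplete record
    {"id": 3, "country": "US"},
]
completeness = check_completeness(rows, ["id", "country"])
unique_ids = check_uniqueness(rows, "id")
```

A governance process would compare these scores against agreed thresholds and quarantine or alert on failing batches; tools like Great Expectations formalize the same idea.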
Posted 3 weeks ago
6.0 - 8.0 years
18 - 20 Lacs
Bengaluru
Hybrid
Hi all, We are hiring for the role C&S ETL Engineer Experience: 6 - 8 Years Location: Bangalore Notice Period: Immediate - 15 Days Skills: Mandatory Skills: AWS Glue Job Description: Minimum experience of 6 years in building, optimizing, and maintaining scalable data pipelines as an ETL Engineer. Hands-on experience in coding techniques with a proven track record. Hands-on experience in end-to-end data workflows, including pulling data from third-party and in-house tools via APIs, transforming and loading it into data warehouses, and improving performance across the ETL lifecycle. Hands-on experience with scripting (Python, shell scripting), relational databases (PostgreSQL, Redshift), REST APIs (OAuth, JWT, Basic Auth), job schedulers (cron), version control systems (Git), and AWS environments. Hands-on experience in integrating data from various data sources. Understanding of Agile processes and principles. Good communication and presentation skills. Good documentation skills. Preferred: Ability to understand business problems and customer needs and provide data solutions. Hands-on experience in working with Qualys and its APIs. Understanding of business intelligence tools such as Power BI. Knowledge of data security and privacy. Design, develop, implement, and maintain robust and scalable ETL pipelines using Python and SQL as well as AWS Glue and AWS Lambda for data ingestion, transformation, loading into various data targets (e.g., PostgreSQL, Amazon S3, Redshift, Aurora) and structured data management. If you are interested, drop your resume at mojesh.p@acesoftlabs.com Call: 9701971793
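The API-pull-then-load workflow this posting describes can be outlined with the standard library alone: a bearer-token request (standing in for OAuth/JWT authentication) plus an idempotent load into a relational table. The endpoint URL, token, and findings schema are made up, the request is prepared but never sent, and sqlite3 stands in for PostgreSQL/Redshift:

```python
import sqlite3
import urllib.request

def build_request(url, token):
    """Prepare an authenticated GET, as one would for a bearer-token (JWT/OAuth) API.
    The request object is returned unsent; a real pipeline would pass it to urlopen."""
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def load_findings(conn, findings):
    """Idempotent load: insert-or-replace keyed on the finding id, so re-running
    the same extraction (e.g. from a cron retry) does not duplicate rows."""
    conn.execute("CREATE TABLE IF NOT EXISTS findings (id TEXT PRIMARY KEY, severity INTEGER)")
    conn.executemany(
        "INSERT OR REPLACE INTO findings (id, severity) VALUES (:id, :severity)", findings
    )
    conn.commit()

req = build_request("https://api.example.com/v1/findings", "demo-token")
conn = sqlite3.connect(":memory:")
# The same finding arriving twice (updated severity) collapses to one row.
load_findings(conn, [{"id": "F-1", "severity": 3}, {"id": "F-1", "severity": 4}])
count, sev = conn.execute("SELECT COUNT(*), MAX(severity) FROM findings").fetchone()
```

With AWS Glue the transform-and-load half would run as a Glue job, but the authentication and idempotency concerns are the same.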
Posted 3 weeks ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Data Engineering: Architect, develop, and maintain highly scalable and robust data pipelines using Apache Kafka, Apache Spark, and Apache Airflow. Design and optimize data storage solutions, including Amazon Redshift, S3, or comparable platforms, to support large-scale analytics. Ensure data quality, integrity, security, and compliance across all data platforms. Data Science: Design, develop, and deploy sophisticated machine learning models to solve complex business challenges. Build and optimize end-to-end ML pipelines, including data preprocessing, feature engineering, model training, and deployment. Drive Generative AI initiatives, creating innovative products and solutions. Conduct in-depth analysis of large, complex datasets to generate actionable insights and recommendations. Work closely with cross-functional stakeholders to understand business requirements and deliver data-driven strategies. Collaboration and Mentorship: Mentor and guide junior data scientists and data engineers, fostering a culture of continuous learning and professional growth. Contribute to the development of best practices in Data Science, Data Engineering, and Generative AI. General: Write clean, efficient, and maintainable code in Python and at least one other language (e.g., C#, Go, or equivalent). Participate in code reviews, ensuring adherence to best practices and coding standards. Stay abreast of the latest industry trends, tools, and technologies in Data Engineering, Data Science, and Generative AI. Document processes, models, and workflows to ensure knowledge sharing and reproducibility.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description: Design, develop, and maintain dashboards and reports using Sigma Computing. Collaborate with business stakeholders to understand data requirements and deliver actionable insights. Write and optimize SQL queries that run directly on cloud data warehouses. Enable self-service analytics for business users via Sigma's spreadsheet interface and templates. Apply row-level security and user-level filters to ensure proper data access controls. Partner with data engineering teams to validate data accuracy and ensure model alignment. Troubleshoot performance or data issues in reports and dashboards. Train and support users on Sigma best practices, tools, and data literacy. Required Skills & Qualifications: 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.). Experience with data modeling concepts and modern data stacks. Ability to translate business requirements into technical solutions. Familiarity with data governance, security, and role-based access controls. Excellent communication and stakeholder management skills. Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight). Familiarity with dbt, Fivetran, or other ELT/ETL tools. Exposure to Agile or Scrum methodologies. Skills: Agile, Tableau, BigQuery, SQL, T-SQL, data modelling, Snowflake, Redshift, CDP Platform, Power BI, Data Visualization.
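Row-level security of the kind this posting mentions ultimately amounts to a per-user predicate bound into the dashboard query. A minimal sketch using sqlite3 in place of the cloud warehouse, with a hypothetical region-entitlement mapping (in Sigma these entitlements would come from user attributes):

```python
import sqlite3

# Hypothetical per-user region entitlements.
USER_REGIONS = {"asha": ("APAC",), "bob": ("EMEA", "APAC")}

def regional_sales(conn, user):
    """Run the dashboard query with a row-level filter bound to the user's regions."""
    regions = USER_REGIONS.get(user, ())
    if not regions:
        return {}  # no entitlements -> no rows visible
    placeholders = ",".join("?" for _ in regions)
    sql = (f"SELECT region, SUM(amount) FROM sales "
           f"WHERE region IN ({placeholders}) GROUP BY region")
    return dict(conn.execute(sql, regions).fetchall())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 100.0), ("EMEA", 50.0), ("APAC", 25.0)])
asha_view = regional_sales(conn, "asha")  # sees only APAC rows
bob_view = regional_sales(conn, "bob")    # sees EMEA and APAC
```

The same query run for different users returns different slices of the table, which is the whole point of row-level security: one dashboard, per-user data visibility.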
Posted 3 weeks ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
3-5 years of professional experience designing/building/maintaining a highly available data and analytics platform. 3+ years of experience in data engineering, with a focus on building large-scale data processing systems. Hands-on experience with AWS or a similar cloud platform building data engineering solutions for analytics and science. (2+ years) Must have experience building complex data pipelines – batch and/or real-time event-based processing (2+ years) Strong experience in designing, building and maintaining data warehouses in Redshift or similar cloud-based solutions. (2+ years) Experience in Matillion or a similar ETL/ELT tool for developing data ingestion and curation flows (2+ years) Must have strong hands-on experience in SQL. (2+ years) Strong hands-on experience in modern scripting languages using Python. (2+ years) Experience building complex ETL using Spark (Scala or Python) for event-based big data processing (1+ years) Strong hands-on experience with NoSQL DBs – MongoDB, Cassandra or DynamoDB (1+ years) Strong experience with AWS deployment using CI/CD pipelines is preferred. (1+ years) Experience in infrastructure-as-code services like Terraform preferred. (1+ years) Experience building mission-critical systems, running 24x7. Desire to work within a team of engineers at all levels of experience. Desire to mentor junior developers, maximizing their productivity. Good written and spoken communication skills.
Posted 3 weeks ago
4.0 years
0 Lacs
Mumbai, Maharashtra
Remote
Solution Engineering - Cloud & AI - Data Mumbai, Maharashtra, India Date posted Jul 16, 2025 Job number 1847893 Work site Up to 50% work from home Travel 25-50 % Role type Individual Contributor Profession Technology Sales Discipline Solution Engineering Employment type Full-Time Overview Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning. Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products including IaaS and PaaS services on the Azure Platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way. As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. 
As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. Qualifications 10+ years technical pre-sales or technical consulting experience OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience OR equivalent experience Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience OR equivalent experience 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2.
Responsibilities Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. Industry leading healthcare Educational resources Discounts on products and services Savings and investments Maternity and paternity leave Generous time away Giving programs Opportunities to network and connect Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 3 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 5+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Amazon’s Consumer Payments organization is seeking a highly quantitative, experienced Business Intelligence Engineer to drive the development of analytics and insights. You will succeed in this role if you are an organized self-starter who can learn new technologies quickly and excel in a fast-paced environment. In this position, you will be a key contributor and sparring partner, developing analytics and insights that global executive management teams and business leaders will use to define global strategies and deep dive businesses. Our team offers a unique opportunity to build a new set of analytical experiences from the ground up. You will be part of the team that is focused on acquiring new merchants from around the world for payments around the world. The position is based in India but will interact with global leaders and teams in Europe, Japan, US, and other regions. You should be highly analytical, resourceful, customer focused, team oriented, and have an ability to work independently under time constraints to meet deadlines. You will be comfortable thinking big and diving deep. A proven track record in taking on end-to-end ownership and successfully delivering results in a fast-paced, dynamic business environment is strongly preferred.
Key job responsibilities - Partnering with engineering, product, business and finance teams to create key performance indicators and new methodologies for measurement - Fluency in analytical communication to translate data into actionable insights for stakeholders both in and out of the team, create analytical insights to identify key priorities - Proactively make and justify recommendations based on advanced statistical techniques, deep familiarity with the customer or developer experiences, as well as by cross-referencing multiple data sources and comparing against the wider industry - Determine best-in-class performance reports and automate reporting for regular metrics, identify areas of opportunity to automate, scale ad-hoc analyses, build and inform BI tool improvements - Providing requirements for telemetry and data structure to improve ability to extract data efficiently and provide the team insights faster A day in the life - Analyze data and find insights to either drive strategic business decisions or to drive incremental signups or revenue. - Define and develop business critical metrics and reports across all international business levers, key performance indicators, and financials. - Own alignment and standardization of analytical initiatives across the global business teams - Drive efforts across international business leaders, BI leaders and executive management across Europe, Asia and North America. - Own key executive reports and metrics that are consumed by our VPs and Directors - Provide thought leadership in global business deep dives across a variety of key performance indicators. Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
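The qualification above about using SQL to pull data and Python to process it is a pull-then-shape pattern common to BI work. A small sketch with sqlite3 standing in for Redshift and an invented merchant-signup table and approval-rate metric:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE signups (merchant TEXT, region TEXT, approved INTEGER)")
conn.executemany("INSERT INTO signups VALUES (?, ?, ?)", [
    ("m1", "EU", 1), ("m2", "EU", 0), ("m3", "JP", 1), ("m4", "JP", 1),
])

# SQL does the heavy lifting in the warehouse (grouping, counting);
# Python then shapes the result into a reporting-friendly structure.
rows = conn.execute(
    "SELECT region, COUNT(*), SUM(approved) FROM signups GROUP BY region ORDER BY region"
).fetchall()
approval_rate = {region: approved / total for region, total, approved in rows}
```

Pushing the aggregation into SQL keeps the data transfer small; the scripting layer is reserved for the last-mile formatting, statistics, or visualization the posting describes.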
Posted 3 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. 
Position: Associate Industry: CPG & Retail Domain: Range of Analytics (Descriptive to Advanced) depending on the client problem About Acceleration Center Bangalore At PwC, we connect people with diverse backgrounds and skill sets to solve important problems together and lead with purpose—for our clients, our communities and for the world at large. It is no surprise therefore that 429 of 500 Fortune global companies engage with PwC. Acceleration Centers (ACs) are PwC’s diverse, global talent hubs focused on enabling growth for the organization and value creation for our clients. The PwC Advisory Acceleration Center in Bangalore is part of our Advisory business in the US. The team is focused on developing a broader portfolio with solutions for Risk Consulting, Management Consulting, Technology Consulting, Strategy Consulting, Forensics as well as vertical specific solutions. PwC's high-performance culture is based on passion for excellence with a focus on diversity and inclusion. You will collaborate with and receive support from a network of people to achieve your goals. We will also provide you with global leadership development frameworks and the latest in digital technologies to learn and excel in your career. At the core of our firm's philosophy is a simple construct: We care for our people. Globally PwC is ranked the 3rd most attractive employer according to Universum. Our commitment to Responsible Business Leadership, Diversity & Inclusion, work-life flexibility, career coaching and learning & development makes our firm one of the best places to work, learn and excel. We are looking for experienced professionals with a strong data science background (and overall data science experience of 4+ years) to work in our Analytics Consulting practice in Mumbai and Bangalore. 
Senior Associates will work as an integral part of business analytics teams in India alongside clients and consultants in the U.S., leading teams for high-end analytics consulting engagements and providing business recommendations to project teams.

Education: Advanced degree in a quantitative discipline such as Computer Science, Engineering, Econometrics, Statistics, Operations Research, or Information Sciences (e.g. business analytics or informatics).

Required Skills: Successful candidates will have demonstrated the following skills and characteristics.

Must-Have (Senior Associate - CPG & Retail Analytics):
- Hands-on experience with CPG & Retail data assets such as point-of-sale (POS) transactions, loyalty-program histories, syndicated panel data (NielsenIQ, Circana/IRI, Kantar), e-commerce clickstream, and supply-chain sensors/IoT.
- Deep understanding of predictive modeling for purchase propensity, demand forecasting, promotion uplift, assortment, and customer lifetime value in omnichannel retail environments.
- Proficiency in machine learning techniques—classification, regression, clustering, recommendation engines, and advanced time-series forecasting applied to sales and inventory data.
- Strong command of statistical methods including A/B & multivariate testing, price-elasticity modeling, market-basket analysis, segmentation, and causal-impact evaluation.
- Expert programming in Python or R, plus SQL with retail-specific data-wrangling libraries (pandas, dbt) across cloud warehouses (BigQuery, Redshift, Synapse).
- Hands-on BI fluency in developing interactive visualizations in Power BI, Tableau, or Looker that turn SKU-level data into actionable stories for category, trade-marketing, and supply-chain teams.
- Exposure to big data & real-time analytics stacks used in retail (Spark, Kafka, Delta Lake, Snowflake Streaming, Google Vertex AI).
- Ability to translate insights for merchandising, marketing, supply-chain, and store-operations stakeholders—communicating clearly with both technical data teams and business leaders.

Nice to Have:
- Exposure to ML engineering (Azure ML, AWS SageMaker, GCP Vertex AI)
- Experience blending retail-media impression/click logs with sales to calculate ROAS and incrementality, or to build MMM-lite dashboards for brand teams.

Roles and Responsibilities:
- Assist on analytics projects within the CPG & Retail domain, driving design, development, and delivery of data science solutions
- Develop and execute project & analysis plans under the guidance of the Project Manager
- Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and any data & use case clarifications needed to get a strong hold on the data and the business problem to be solved
- Drive and conduct analysis using advanced analytics tools and coach junior team members
- Implement the necessary quality control measures to ensure deliverable integrity, covering data quality, model robustness, and explainability for deployments
- Validate analysis outcomes and recommendations with all stakeholders, including the client team
- Build storylines and make presentations to the client team and/or the PwC project leadership team
- Contribute to knowledge-sharing and firm-building activities
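Market-basket analysis, one of the statistical methods this role calls for, can be illustrated with a minimal pure-Python sketch. The basket contents and item names below are invented for illustration; production work would typically run this over POS data at scale with a library such as mlxtend or Spark:

```python
def basket_metrics(transactions, a, b):
    """Support, confidence, and lift for the association rule a -> b."""
    n = len(transactions)
    count_a = sum(1 for t in transactions if a in t)
    count_b = sum(1 for t in transactions if b in t)
    count_ab = sum(1 for t in transactions if a in t and b in t)
    support = count_ab / n                                  # P(a and b)
    confidence = count_ab / count_a if count_a else 0.0     # P(b | a)
    lift = confidence / (count_b / n) if count_b else 0.0   # P(b | a) / P(b)
    return support, confidence, lift

# Toy POS baskets (hypothetical SKUs)
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "milk"},
    {"butter", "milk"},
]
s, c, l = basket_metrics(baskets, "bread", "butter")
```

A lift above 1.0 indicates the two SKUs co-occur more often than chance, which is the signal category and trade-marketing teams act on.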
Posted 3 weeks ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
A career in our Advisory Acceleration Centre is the natural extension of PwC’s leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process quality and delivery capability in support of client engagements.

Years of Experience: Candidates with 4+ years of hands-on experience

Position Requirements
Must Have:
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake or Data Hub in AWS
- Proficient in Lambda or Kappa Architectures
- Should be aware of Data Management concepts and Data Modelling
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies - experience in Hadoop and Spark is mandatory
- Strong experience in AWS compute services like AWS EMR and Glue, and storage services like S3, Redshift & DynamoDB
- Good experience with at least one of the AWS streaming services: AWS Kinesis, AWS SQS, or AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
- Strong understanding of the DBT ELT tool, and usage of DBT macros etc.
- Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
- Good knowledge of AWS Security and AWS Key Management
- Strong understanding of cloud data migration processes, methods, and project lifecycle
- Good analytical & problem-solving skills
- Good communication and presentation skills
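The Lambda Architecture named in the requirements above combines a batch layer (periodic full recomputation, e.g. a nightly Spark/EMR job) with a speed layer (recent increments, e.g. from Kinesis). The merge logic can be sketched in plain Python, independent of any specific AWS service; the event shape and metric names here are assumptions for illustration:

```python
from collections import defaultdict

def batch_view(events):
    """Batch layer: recompute per-key totals from the full event history."""
    view = defaultdict(int)
    for key, value in events:
        view[key] += value
    return dict(view)

def merge_views(batch, speed):
    """Serving layer: combine the batch view with speed-layer increments."""
    merged = dict(batch)
    for key, value in speed.items():
        merged[key] = merged.get(key, 0) + value
    return merged

# Full history processed by the batch layer
history = [("clicks", 10), ("views", 40), ("clicks", 5)]
# Increments observed by the speed layer since the last batch run
recent = {"clicks": 2, "views": 3}

serving = merge_views(batch_view(history), recent)
```

A Kappa Architecture, by contrast, drops the batch layer and reprocesses everything through the same streaming path.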
Posted 3 weeks ago
2.0 years
0 Lacs
Gurgaon Rural, Haryana, India
On-site
We are seeking a Full Stack Developer (Associate Consultant) to join our India team based in Gurgaon. This role at Viscadia offers a unique opportunity to gain hands-on experience in the healthcare industry, with comprehensive training in core consulting skills such as critical thinking, market analysis, and executive communication. Through project work and direct mentorship, you will develop a deep understanding of healthcare business dynamics and build a strong foundation for a successful consulting career.

ROLES AND RESPONSIBILITIES
Technical Responsibilities
- Design and build full-stack forecasting and simulation platforms using modern web technologies (e.g., React, Node.js, Python) hosted on AWS infrastructure (e.g., Lambda, EC2, S3, RDS, API Gateway).
- Automate data pipelines and model workflows using Python for data preprocessing, time-series modeling (e.g., ARIMA, Exponential Smoothing), and backend services.
- Replace legacy Excel/VBA tools with scalable, cloud-native applications, integrating dynamic reporting features and user controls via the web UI.
- Use SQL and cloud databases (e.g., AWS RDS, Redshift) to query and transform large datasets as inputs to models and dashboards.
- Develop interactive web dashboards using frameworks like React + D3.js, or embed tools like Power BI/Tableau into web portals to communicate insights effectively.
- Implement secure, modular APIs and microservices to support modularity, scalability, and seamless data exchange across platforms.
- Ensure cost-effective and reliable deployment of solutions via AWS services, CI/CD pipelines, and infrastructure-as-code (e.g., CloudFormation, Terraform).
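The time-series modeling mentioned above can be illustrated with a minimal pure-Python sketch of simple exponential smoothing; in practice a library such as statsmodels would be used, and the sales figures and smoothing factor below are invented:

```python
def simple_exponential_smoothing(series, alpha):
    """Return the one-step-ahead forecast after smoothing the series.

    alpha in (0, 1]: higher values weight recent observations more heavily.
    """
    level = series[0]  # initialise the level with the first observation
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

# Hypothetical monthly unit sales for a pharma product
sales = [100.0, 110.0, 105.0, 115.0]
forecast = simple_exponential_smoothing(sales, alpha=0.5)
```

ARIMA extends this idea by additionally modeling autoregressive and moving-average structure in the differenced series, which statsmodels exposes via `statsmodels.tsa.arima.model.ARIMA`.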
Business Responsibilities
- Support the development and enhancement of forecasting and analytics platforms tailored to the needs of pharmaceutical clients across various therapeutic areas
- Build an in-depth understanding of pharma forecasting concepts, disease areas, treatment landscapes, and market dynamics to contextualize forecasting models and inform platform features
- Partner with cross-functional teams to ensure forecast deliverables align with client objectives, timelines, and decision-making needs
- Contribute to a culture of knowledge sharing and continuous improvement by mentoring junior team members and helping codify best practices in forecasting and business analytics
- Grow into a client-facing role, combining an understanding of commercial strategy with forecasting expertise to lead engagements and drive value for clients

QUALIFICATIONS
- Bachelor’s degree (B.Tech/B.E.) from a premier engineering institute, preferably in Computer Science, Information Technology, Electrical Engineering, or related disciplines
- 2+ years of experience in full-stack development, with a strong focus on designing, developing, and maintaining AWS-based applications and services

SKILLS AND TECHNICAL PROFICIENCIES
Technical Skills
- Proficient in Python, with practical experience using libraries such as pandas, NumPy, matplotlib/seaborn, and statsmodels for data analysis and statistical modeling
- Strong command of SQL for data querying, transformation, and seamless integration with backend systems
- Hands-on experience in designing and maintaining ETL/ELT data pipelines, ensuring efficient and scalable data workflows
- Solid understanding and applied experience with cloud platforms, particularly AWS; working familiarity with Azure and Google Cloud Platform (GCP)
- Full-stack web development expertise, including building and deploying modern web applications, web hosting, and API integration
- Proficient in Microsoft Excel and PowerPoint, with advanced skills in data visualization and delivering professional presentations

Soft Skills
- Excellent verbal and written communication skills, with the ability to effectively engage both technical and non-technical stakeholders
- Strong analytical thinking and problem-solving abilities, with a structured and solution-oriented mindset
- Demonstrated ability to work independently as well as collaboratively within cross-functional teams
- Adaptable and proactive, with a willingness to thrive in a dynamic, fast-growing environment
- Genuine passion for consulting, with a focus on delivering tangible business value for clients

Domain Expertise
- Strong understanding of pharmaceutical commercial models, including treatment journeys, market dynamics, and key therapeutic areas
- Experience working with and interpreting industry-standard datasets such as IQVIA, Symphony Health, or similar secondary data sources
- Familiarity with product lifecycle management, market access considerations, and sales performance tracking metrics used across the pharmaceutical value chain
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Zimetrics is a technology services and solutions provider specializing in Data, AI, and Digital. We help enterprises leverage the economic potential and business value of data from systems, machines, connected devices, and human-generated content. Our core principles are Integrity, Intellect, and Ingenuity, guiding our value system, engineering expertise, and organizational behavior. We are problem solvers and innovators who challenge conventional wisdom and believe in possibilities.

You will be responsible for designing scalable and secure cloud-based data architecture solutions. Additionally, you will lead data modeling, integration, and migration strategies across platforms. It will be essential to engage directly with clients to understand their business needs and translate them into technical solutions. Moreover, you will support sales/pre-sales teams with solution architecture, technical presentations, and proposals. Collaboration with cross-functional teams including engineering, BI, and product will also be part of your role. Ensuring best practices in data governance, security, and performance optimization is a key responsibility.

To be successful in this role, you must have strong experience with cloud platforms such as AWS, Azure, or GCP. A deep understanding of data warehousing concepts and tools like Snowflake, Redshift, BigQuery, etc., is essential. Proven expertise in data modeling, including conceptual, logical, and physical modeling, is required. Excellent communication and client engagement skills are a must. Previous experience in pre-sales or solution consulting will be advantageous. You should also have the ability to present complex technical concepts to non-technical stakeholders effectively.
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
At Medtronic, you can embark on a life-long career dedicated to exploration and innovation, all while contributing to the cause of advancing healthcare access and equity for all. Your role will be pivotal in leading with purpose to break down barriers to innovation in a more connected and compassionate world.

As a PySpark Data Engineer at Medtronic's new MiniMed India Hub, you will play a crucial part in designing, developing, and maintaining data pipelines using PySpark. Collaborating closely with data scientists, analysts, and other stakeholders, your responsibilities will revolve around ensuring the efficient processing and analysis of large datasets, managing complex transformations and aggregations. This opportunity allows you to make a significant impact within Medtronic's Diabetes business. With the announcement of the intention to separate the Diabetes division to drive future growth and innovation, you will have the chance to operate with increased speed and agility. This move is expected to unlock potential and drive innovation to enhance the impact on patient care.

Key Responsibilities:
- Design, develop, and maintain scalable and efficient ETL pipelines using PySpark.
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets.
- Implement data quality checks, ensure data integrity, and troubleshoot data pipeline issues.
- Stay updated with the latest trends and technologies in big data and distributed computing.

Required Knowledge and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-5 years of experience in data engineering with a focus on PySpark.
- Proficiency in Python and Spark, with strong coding and debugging skills.
- Strong knowledge of SQL and experience with relational databases.
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Experience with data warehousing solutions like Redshift, Snowflake, Databricks, or Google BigQuery.
- Familiarity with data lake architectures, big data technologies, and data storage solutions.
- Excellent problem-solving skills and ability to troubleshoot complex issues.
- Strong communication and collaboration skills.

Preferred Skills:
- Experience with Databricks and orchestration tools like Apache Airflow or AWS Step Functions.
- Knowledge of machine learning workflows and data security best practices.
- Familiarity with streaming data platforms, real-time data processing, and CI/CD pipelines.

Medtronic offers a competitive salary and flexible benefits package. The company values its employees and provides resources and compensation plans to support their growth at every career stage. This position is eligible for the Medtronic Incentive Plan (MIP).

About Medtronic: Medtronic is a global healthcare technology leader committed to addressing the most challenging health problems facing humanity. With a mission to alleviate pain, restore health, and extend life, the company unites a team of over 95,000 passionate individuals who work tirelessly to generate real solutions for real people through engineering and innovation.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You are a skilled Data Engineer with a strong background in Python, Snowflake, and AWS. Your primary responsibility will involve constructing and refining scalable data pipelines, integrating various data sources, and supporting analytics and business intelligence solutions within a cloud-based setting. An essential aspect of your role will be the design and supervision of AWS Glue jobs to facilitate efficient, serverless ETL workflows.

Your key duties will revolve around designing and executing robust data pipelines using AWS Glue, Lambda, and Python. You will work extensively with Snowflake for data warehousing, modeling, and analytics assistance. Managing ETL/ELT jobs using AWS Glue to ensure consistent data reliability will be a crucial part of your responsibilities. Furthermore, you will be tasked with migrating data between CRM systems, particularly from Snowflake to Salesforce, adhering to defined business protocols and ensuring data precision. It will also be your responsibility to optimize SQL/SOQL queries, manage large data volumes, and sustain high performance levels. Additionally, implementing data normalization and quality checks will be essential to guarantee accurate, consistent, and deduplicated records.

Your required skills include strong proficiency in Python, hands-on experience with Snowflake Data Warehouse, and familiarity with AWS services such as Glue, S3, Lambda, Redshift, and CloudWatch. You should have experience in ETL/ELT pipelines and data integration using AWS Glue jobs, along with expertise in SQL and SOQL for data extraction and transformation. Moreover, an understanding of data modeling, normalization, and performance optimization is essential for this role. It would be advantageous if you have experience with Salesforce Data Loader, ETL mapping, and metadata-driven migration, as well as exposure to CI/CD tools, DevOps, and version control systems like Git.
Previous work experience in Agile/Scrum environments will also be beneficial for this position.
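The normalization and deduplication checks described above can be sketched in plain Python; the record fields, the choice of email as the dedup key, and the normalization rules are all assumptions for illustration — a production Glue job would apply the same idea over DynamicFrames or DataFrames:

```python
def normalize(record):
    """Normalize a record for comparison: trim and lowercase key fields."""
    return {
        "email": record["email"].strip().lower(),
        "name": " ".join(record["name"].split()).title(),
    }

def deduplicate(records):
    """Keep the first occurrence of each record, keyed on normalized email."""
    seen = set()
    unique = []
    for record in records:
        clean = normalize(record)
        if clean["email"] not in seen:
            seen.add(clean["email"])
            unique.append(clean)
    return unique

# Toy CRM records with inconsistent casing and whitespace
raw = [
    {"email": " Ada@Example.com ", "name": "ada  lovelace"},
    {"email": "ada@example.com", "name": "Ada Lovelace"},
    {"email": "grace@example.com", "name": "Grace Hopper"},
]
clean_records = deduplicate(raw)
```

Normalizing before comparing is what makes the duplicate pair above collapse into one record despite the differing whitespace and case.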
Posted 3 weeks ago
10.0 years
0 Lacs
Delhi, India
On-site
We are hiring for a Senior Data Architect
Experience Required: 10 - 15 Years
Location: Delhi
Education: Graduation from Tier-1 colleges, IIT preferred
Notice Period: Up to 30 Days
Industry: B2B Product Companies with High data-traffic
Client Name: Wingify

Job Description:
Role & Responsibilities:
- Lead and mentor a team of data engineers, ensuring high performance and career growth.
- Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
- Drive the development and implementation of data governance frameworks and best practices.
- Work closely with cross-functional teams to define and execute a data roadmap.
- Optimize data processing workflows for performance and cost efficiency.
- Ensure data security, compliance, and quality across all data platforms.
- Foster a culture of innovation and technical excellence within the data team.

Ideal Candidate:
- 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role.
- Expertise in backend development with programming languages such as Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS.
- Proficiency in SQL, Python, and Scala for data processing and analytics.
- Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
- Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
- Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
- Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
- Deep knowledge of data governance, security, and compliance (GDPR, SOC2, etc.).
- Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
- Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
- Proven ability to drive technical strategy and align it with business objectives.
- Strong leadership, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience in machine learning infrastructure or MLOps is a plus.
- Exposure to real-time data processing and analytics.
- Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
- Prior experience in a SaaS or high-growth tech company.

Important Note: Kindly read the JOB DESCRIPTION carefully. Candidates below 10 years of experience are not eligible for this role. Candidates who are not from Tier-1 colleges, kindly DO NOT APPLY. Candidates who are not from B2B Product Companies with High data-traffic, kindly DO NOT APPLY.
Posted 3 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Manager - Data Engineering
Career Level: E

Introduction To Role
Join our Commercial IT Data Analytics & AI (DAAI) team as a Product Quality Leader, where you will play a pivotal role in ensuring the quality and stability of our data platforms built on AWS services, Databricks, and SnapLogic. Based in Chennai GITC, you will drive the quality engineering strategy, lead a team of quality engineers, and contribute to the overall success of our data platform.

Accountabilities
As the Product Quality Team Leader for data platforms, your key accountabilities will include leadership and mentorship, quality engineering standards, collaboration, technical expertise, and innovation and process improvement. You will lead the design, development, and maintenance of scalable and secure data infrastructure and tools to support the data analytics and data science teams. You will also develop and implement data and data engineering quality assurance strategies and plans tailored to data product build and operations.

Essential Skills/Experience
- Bachelor’s degree or equivalent in Computer Engineering, Computer Science, or a related field
- Proven experience in a product quality engineering or similar role, with at least 3 years of experience in managing and leading a team
- Experience of working within a quality and compliance environment and applying policies, procedures, and guidelines
- A broad understanding of cloud architecture (preferably in AWS)
- Strong experience in Databricks, PySpark, and the AWS suite of applications (like S3, Redshift, Lambda, Glue, EMR)
- Proficiency in programming languages such as Python
- Experienced in Agile development techniques and methodologies
- Solid understanding of data modelling, ETL processes, and data warehousing concepts
- Excellent communication and leadership skills, with the ability to collaborate effectively with technical and non-technical stakeholders
- Experience with big data technologies such as Hadoop or Spark
- Certification in AWS or Databricks
- Prior significant experience working in a Pharmaceutical or Healthcare industry IT environment

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, we are committed to disrupting an industry and changing lives. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak and lead a new way of working, combining cutting-edge science with leading digital technology platforms and data. We dare to lead, applying our problem-solving mindset to identify and tackle opportunities across the whole enterprise. Our spirit of experimentation is lived every day through events like hackathons. We enable AstraZeneca to perform at its peak by delivering world-class technology and data solutions. Are you ready to be part of a team that has the backing to innovate, disrupt an industry and change lives? Apply now to join us on this exciting journey!

Date Posted: 15-Jul-2025
Closing Date: 20-Jul-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics.
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 3 weeks ago