
6639 Databricks Jobs - Page 8

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

As an Angular FullStack Developer with 4 to 6 years of experience in Mumbai, working in a hybrid mode, your daily responsibilities will involve collaborating with fellow developers, business analysts, and Product Owners. You will research, design, implement, test, and assess both new and existing software solutions. You will also maintain and enhance current systems, develop monitoring tools, and document application processes for future reference. Exploring new technologies and contributing to the development and upkeep of automated deployment infrastructure will also be part of your role, and attending internal and external training sessions to deepen your technical knowledge and skills is vital for your professional growth.

To excel in this position, you should have a solid understanding of, and at least 2 years of practical experience in, middleware development using Java and databases. Prior exposure to scripting/Python is advantageous, and familiarity with UI technologies such as Angular and analytics tools such as Power BI is beneficial. Proficiency in generic testing frameworks, adherence to good programming practices, and a preference for clean, maintainable, well-documented, and reusable code are essential. You should also be well versed in CI/CD practices and keen to ensure rapid and secure code deployment through a stable, thoroughly tested, and risk-aware approach. A curious mindset and openness to learning new languages and technologies are key attributes for success in this role.

The ideal candidate will have hands-on experience in Java, Angular, and Databricks, along with familiarity with Spark for managing large-scale data. Proficiency in writing and maintaining SQL queries, using Git for version control, and writing tests with appropriate testing tools is required, as is experience with CI/CD pipelines for efficient and safe code deployment, comfort in Agile/Scrum environments, and the ability to produce clean, maintainable code. A passion for learning new technologies and enhancing your skills will set you apart as a valuable team member.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Technical Manager, you will lead and manage a team of software engineers to ensure high performance and quality delivery. Your responsibilities will include designing, developing, and maintaining scalable and robust Python applications. You will architect and implement cloud solutions on AWS and Azure, adhering to best practices in security, scalability, and cost-efficiency. Collaborating with cross-functional teams, you will define, design, and ship new features while mentoring and guiding team members in their technical and professional growth.

In this role, you will implement DevOps practices to streamline CI/CD pipelines and automate deployment processes. Developing and maintaining APIs using FastAPI and GraphQL will be part of your tasks, ensuring that the team follows best practices in coding, testing, and documentation. You will also oversee database design, optimization, and maintenance, driving productivity within the team by implementing efficient workflows and leveraging code-assist tools.

To be successful in this position, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as a Technical Manager or in a similar role, leading large teams, is required. Strong proficiency in Python programming, extensive experience with AWS or Azure cloud services, and a solid understanding of DevOps practices and tools are essential. Experience with FastAPI and GraphQL, database systems (SQL and NoSQL), design patterns, and microservices architecture is also necessary. Additionally, familiarity with MLOps for deploying and managing machine learning models, LLMOps for large language model operations, and Databricks for big data processing and analytics is desired, along with excellent problem-solving skills and attention to detail. Strong communication and leadership skills, and the ability to drive productivity and enhance team efficiency using code-assist tools, will be valuable assets in this role.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Company Description
NielsenIQ is a consumer intelligence company that delivers the Full View™, the world’s most complete and clear understanding of consumer buying behavior that reveals new pathways to growth. Since 1923, NIQ has moved measurement forward for industries and economies across the globe. We are putting the brightest and most dedicated minds together to accelerate progress. Our diversity brings out the best in each other so we can leave a lasting legacy on the work that we do and the people that we do it with. NielsenIQ offers a range of products and services that leverage Machine Learning and Artificial Intelligence to provide insights into consumer behavior and market trends. This position opens the opportunity to apply the latest state of the art in AI/ML and data science to global and key strategic projects.

Job Description
NielsenIQ’s Innovation Team is growing our AI capabilities and is now looking to hire an AI/ML Data Scientist in India (Pune) for the Core Models team, a multidisciplinary team of researchers working on different areas of AI such as recommender systems, extreme classifiers, and Large Language Models (LLMs). As part of this team, you will stay up to date with the latest research in AI (with a special focus on NLP, but also on Computer Vision and other AI-related fields), implement current state-of-the-art algorithms on real-world, large-scale problems, and propose novel ideas. Your main focus will be creating high-quality datasets for training and fine-tuning the company's custom models, LLMs, and recommender systems, and training them to analyze the impact of different versions of the data on model performance. The selected candidate will be responsible for designing and implementing scalable data pipelines and strategies to support all stages of the R&D process, e.g., fine-tuning or alignment through reinforcement learning. The results of this work will be critical to ensuring the robustness, safety, and alignment of our AI models. You will also have the opportunity to produce scientific content such as patents or conference/journal papers.

Job Responsibilities:
- Investigate, develop, and apply data pipelines with minimal technical supervision, always ensuring a combination of simplicity, scalability, reproducibility, and maintainability within the ML solutions and source code (a minimal sketch follows this posting).
- Train Deep Learning models (Transformer models) and analyze the impact of different versions of the data.
- Perform feasibility studies and analyze data to determine the most appropriate solution.
- Drive innovation and proactively contribute to our work on custom Large Language Models.
- Communicate results to technical and non-technical audiences.
- Work as a member of a team, encouraging team building, motivation, and effective team relations.

Qualifications
Required education, skills, and experience:
- Master's degree in computer science or an equivalent numerate discipline.
- At least 5 years' experience with evidence in a related field.
- Strong background in computer science, linear algebra, and probability.
- Solid experience in Machine Learning and Deep Learning (special focus on Transformers).
- Proven experience in Natural Language Processing and Large Language Models.
- Proven experience building scalable data pipelines and ETLs.
- Able to understand scientific papers and develop ideas into executable code.
- Proven track record of innovation in creating novel algorithms and publishing the results in AI conferences/journals.
- Languages and technologies: Python, SQL, PySpark, Databricks, Pandas/Polars, PyArrow, PyTorch, Hugging Face, Git.
- Proactive attitude, constructive, intellectual curiosity, and persistence in finding answers to questions.
- A proficient level of interpersonal and communication skills (English level B2 minimum).
- Keen to work as part of a diverse team of international colleagues in a global, inclusive culture.

Preferred education, skills, and experience:
- PhD in science (NLP/Data Science preferred) and specialized courses in one of the above-mentioned fields.
- Experience working with large real-world datasets and scalable ML solutions.
- Previous experience in e-commerce, retail, and/or FMCG/Consumer Insight business.
- Agile development methodologies (Scrum or Scaled Agile).

Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
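To give a concrete flavor of the dataset-preparation work this posting describes, here is a minimal PySpark sketch: deduplicating raw text records and writing a versioned fine-tuning dataset. All paths and column names (raw_documents, text) are hypothetical assumptions; this is an illustration, not NielsenIQ's actual pipeline.

```python
# Minimal sketch: building a deduplicated fine-tuning dataset with PySpark.
# Paths and column names (raw_documents, text) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ft-dataset-prep").getOrCreate()

raw = spark.read.parquet("/data/raw_documents")          # hypothetical path

dataset = (
    raw
    .filter(F.length("text") > 50)                       # drop near-empty records
    .withColumn("text", F.trim(F.lower("text")))         # simple normalization
    .dropDuplicates(["text"])                            # exact-match dedup
    .withColumn("split",                                 # 95/5 train/eval split
                F.when(F.rand(seed=42) < 0.95, "train").otherwise("eval"))
)

# Versioned output so model runs can be traced back to the exact data build.
dataset.write.mode("overwrite").partitionBy("split").parquet("/data/ft_dataset/v1")
```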

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Forward Deployed Engineer at Salesforce, you will play a crucial role in delivering transformative AI solutions to our strategic clients. Your responsibilities will include leading the design, development, and implementation of bespoke solutions using cutting-edge technologies like the Agentforce platform. You will be at the forefront of driving technical vision, mentoring team members, and ensuring the successful delivery of mission-critical AI applications in real-world environments.

Your impact will be significant as you lead the architectural design of scalable production systems, strategize complex data ecosystems, drive innovation on the Agentforce platform, and operate with a proactive and strategic mindset. Building strong relationships with senior client teams, ensuring seamless deployment, and optimizing solutions for long-term reliability will be key aspects of your role. Additionally, you will act as a bridge between customer needs and product evolution, providing valuable feedback to shape future enhancements.

To excel in this role, you are required to have a Bachelor's degree in Computer Science or a related field, with 5+ years of experience in delivering scalable production solutions. Proficiency in programming languages like JavaScript, Java, and Python, and expertise in AI technologies, are essential. Strong communication skills, a proactive attitude, and the ability to travel as needed are also important qualifications.

Preferred qualifications include expert-level experience with Salesforce Data Cloud and the Agentforce platform, as well as knowledge of Salesforce CRM across various clouds. Experience in developing complex conversational AI solutions and Salesforce platform certifications would be advantageous. If you are passionate about leveraging AI to drive business transformation and have a track record of impactful delivery in agile environments, this role offers a unique opportunity to make a difference.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Embark on a transformative journey as a Data Scientist AI/ML - AVP at Barclays in the Group Control Quantitative Analytics team, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

Group Control Quantitative Analytics (GCQA) is a global organization of highly specialized data scientists working on Artificial Intelligence, Machine Learning, and GenAI model development and model management, including governance and monitoring. GCQA is led by Remi Cuchillo under Lee Gregory, Chief Data and Analytics Officer (CDAO) in Group Control. GCQA is responsible for developing and managing AI/ML/GenAI models (including governance and regular model monitoring) and providing analytical support across different areas within Barclays, including Fraud, Financial Crime, Customer Due Diligence, Controls, and Security.

The Data Scientist position provides project-specific leadership in building targeting solutions that integrate effectively into existing systems and processes while delivering strong and consistent performance. Working with the GC CDAO team, the Quantitative Analytics Data Scientist role provides expertise in project design, predictive model development, validation, monitoring, tracking, and implementation.

To be successful in this role, you should possess the following skillsets:
- Python programming.
- Knowledge of Artificial Intelligence and Machine Learning algorithms, including NLP.
- SQL.
- Spark/PySpark.
- Predictive model development.
- Model lifecycle and model management, including monitoring, governance, and implementation (a monitoring sketch follows this posting).
- DevOps tools such as Git/Bitbucket.
- Project management using JIRA.

Some other highly valued skills include:
- DevOps tools such as TeamCity and Jenkins.
- Knowledge of the financial/banking domain.
- Working knowledge of GenAI tools.
- AWS.
- Databricks.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in our Noida office.

Purpose of the role: to design, develop, implement, and support mathematical, statistical, and machine learning models and analytics used in business decision-making.

Accountabilities:
- Design analytics and modeling solutions to complex business problems using domain expertise.
- Collaborate with technology to specify any dependencies required for analytical solutions, such as data, development environments, and tools.
- Develop high-performing, comprehensively documented analytics and modeling solutions, demonstrating their efficacy to business users and independent validation teams.
- Implement analytics and models in accurate, stable, well-tested software and work with technology to operationalize them.
- Provide ongoing support for the continued effectiveness of analytics and modeling solutions to users.
- Demonstrate conformance to all Barclays Enterprise Risk Management Policies, particularly Model Risk Policy.
- Ensure all development activities are undertaken within the defined control environment.

Assistant Vice President Expectations: To advise and influence decision-making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialization to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external (such as procedures and practices in other areas, teams, and companies), to solve problems creatively and effectively. Communicate complex information; "complex" information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge, and Drive, the operating manual for how we behave.
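As a flavor of the model-monitoring work in the skill list above, here is a hedged sketch of a population stability index (PSI) check, a common score-drift metric in model governance. The bin count, the 0.2 alert threshold, and the synthetic data are illustrative assumptions, not Barclays methodology.

```python
# Sketch: population stability index (PSI) for score-drift monitoring.
# The 0.2 alert threshold is a common rule of thumb, not a Barclays standard.
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Compare the score distribution at development time vs. in production."""
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf           # catch out-of-range scores
    e_pct = np.histogram(expected, edges)[0] / len(expected)
    a_pct = np.histogram(actual, edges)[0] / len(actual)
    e_pct = np.clip(e_pct, 1e-6, None)              # avoid log(0)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
dev_scores = rng.beta(2, 5, 10_000)                 # scores at model build time
prod_scores = rng.beta(2.3, 5, 10_000)              # scores observed this month
drift = psi(dev_scores, prod_scores)
print(f"PSI = {drift:.4f} -> {'ALERT' if drift > 0.2 else 'stable'}")
```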

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

Alphanext is a global talent solutions company with offices in London, Pune, and Indore. We connect top-tier technical talent with forward-thinking organizations to drive innovation and transformation through technology.

We are seeking a Senior Data Integration Engineer to take charge of designing, building, and governing scalable, high-performance data pipelines across enterprise systems. The ideal candidate will have extensive experience in data engineering and integration, particularly within manufacturing, retail, and supply chain ecosystems. This role plays a crucial part in ensuring near-real-time data flows, robust data quality, and seamless integration among ERP, WMS, commerce, and finance platforms, thereby enabling AI and analytics capabilities throughout the enterprise.

Key Responsibilities:
- Design and maintain ELT/ETL pipelines that integrate systems such as BlueCherry ERP, Manhattan WMS, and Shopify Plus.
- Develop event-driven architectures utilizing Azure Service Bus, Kafka, or Event Hubs for real-time data streaming.
- Define and publish data contracts and schemas (JSON/Avro) in the enterprise Data Catalog to ensure lineage and governance.
- Automate reconciliation processes with workflows that detect discrepancies, raise alerts, and monitor data-quality SLAs (see the sketch below).
- Lead code reviews, establish integration playbooks, and provide guidance to onshore/offshore engineering teams.
- Collaborate with the Cybersecurity team to implement encryption, PII masking, and audit-compliant data flows.
- Facilitate AI and analytics pipelines, including feeds for feature stores and streaming ingestion to support demand forecasting and GenAI use cases.

Year-One Deliverables:
- Replace the existing nightly CSV-based exchange between BlueCherry and WMS with a near-real-time event bus integration.
- Launch a unified product master API that feeds PLM, OMS, and e-commerce within 6 months.
- Automate three-way reconciliation of purchase orders, packing lists, and warehouse receipts to support traceability audits (e.g., BCI cotton).
- Deploy a data-quality dashboard with rule-based alerts and SLA tracking metrics.

Must-Have Technical Skills:
- 5+ years of experience in data engineering or integration-focused roles.
- Proficiency with at least two of the following: Azure Data Factory, Databricks, Kafka/Event Hubs, DBT, SQL Server, Logic Apps, Python.
- Strong SQL skills and experience with a compiled or scripting language (Python, C#, or Java).
- Proven track record of integrating ERP, WMS, PLM, or similar retail/manufacturing systems.
- Expertise in data modeling, schema design (JSON/Avro), and schema versioning.
- Working knowledge of CI/CD pipelines and infrastructure-as-code using tools like GitHub Actions and Azure DevOps.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (preferred).
- Exceptional problem-solving skills, analytical mindset, and attention to data governance.
- Strong communication and leadership abilities, with a history of mentoring and collaborating with teams.
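For illustration, here is a minimal sketch of the automated three-way reconciliation named in the deliverables, done with pandas merges; all column names (po_id, qty_*) and the sample rows are hypothetical.

```python
# Sketch: three-way reconciliation of purchase orders, packing lists, and
# warehouse receipts. Column names and sample values are hypothetical.
import pandas as pd

po = pd.DataFrame({"po_id": [1, 2, 3], "qty_ordered": [100, 50, 80]})
packing = pd.DataFrame({"po_id": [1, 2, 3], "qty_shipped": [100, 48, 80]})
receipt = pd.DataFrame({"po_id": [1, 2], "qty_received": [100, 48]})

recon = (
    po.merge(packing, on="po_id", how="outer")
      .merge(receipt, on="po_id", how="outer")
)

# Flag any leg of the chain that disagrees or is missing entirely.
recon["discrepancy"] = (
    (recon["qty_ordered"] != recon["qty_shipped"])
    | (recon["qty_shipped"] != recon["qty_received"])
    | recon[["qty_ordered", "qty_shipped", "qty_received"]].isna().any(axis=1)
)

alerts = recon[recon["discrepancy"]]
print(alerts)   # would feed the alerting workflow / data-quality dashboard
```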

Posted 2 days ago

Apply

4.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Risk and Compliance professional at PwC, your primary focus will be on maintaining regulatory compliance and managing risks for clients. You will provide valuable advice and solutions to help organizations navigate complex regulatory landscapes and enhance their internal controls effectively. In the realm of enterprise risk management, your role will involve identifying and mitigating potential risks that could impact an organization's operations and objectives. Your responsibilities will include developing business strategies to manage risks efficiently within a rapidly changing business environment.

Joining PwC's Acceleration Centers (ACs) presents an exciting opportunity to play a pivotal role in supporting various services offered by the Acceleration Center, including Advisory, Assurance, Tax, and Business Services. Within our innovative hubs, you will engage in challenging projects and deliver distinctive services to enhance client engagements through quality and innovation. Additionally, you will participate in dynamic training programs designed to enhance both your technical and professional skills.

As part of the Power Platform team, you will take the lead in designing and developing innovative applications and automation solutions. In the role of a Senior Associate, you will mentor junior team members, ensuring adherence to established practices while leveraging your technical expertise to deliver exceptional solutions. This position offers you the chance to collaborate with cross-functional teams, enhance your problem-solving abilities, and contribute to the strategic direction of our Power Platform initiatives.

Responsibilities:
- Lead the design and implementation of innovative applications and automation solutions
- Mentor junior team members to ensure adherence to established practices
- Utilize technical knowledge to deliver exceptional solutions
- Collaborate with cross-functional teams to enhance project outcomes
- Apply critical thinking to solve complex problems
- Contribute to the strategic direction of Power Platform initiatives
- Foster a culture of continuous improvement and knowledge sharing
- Maintain rigorous standards of quality and compliance in deliverables

Requirements:
- Bachelor's Degree
- 4-9 years of experience
- Oral and written proficiency in English required

Desirable Skills:
- Proven experience in Power Platform solutions
- Advanced SQL knowledge
- Designing and developing interactive Power BI dashboards
- Building and maintaining semantic data models
- Integrating data from various sources
- Writing reusable Python scripts for automation (see the sketch below)
- Collaborating effectively with cross-functional teams
- Participation in Agile ceremonies and sprint planning
- Familiarity with Azure ecosystems and Databricks

Key Responsibilities:

Solution Design & Delivery
- Act as the technical lead in designing and developing applications and automation solutions using Power Apps, Power Automate, and related tools
- Translate business requirements into scalable technical designs and solutions
- Guide junior team members and ensure adherence to best practices in architecture and delivery
- Provide expert consultation on Power Platform strategy, governance, and adoption
- Hands-on experience with Power BI

Technical Excellence
- Ensure optimal performance, scalability, and maintainability of applications
- Leverage advanced features such as custom connectors, Power Platform CLI, and Azure integrations
- Stay updated with the Power Platform roadmap and innovations
- Strong SQL knowledge

Collaboration and Communication
- Work closely with cross-functional teams to maintain quality throughout the software development lifecycle
- Provide regular status updates and test results to stakeholders
- Participate in daily stand-ups, sprint planning, and Agile ceremonies
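As one possible reading of the "reusable Python scripts for automation" item, here is a hedged sketch of a parameterized SQL-to-CSV extract that a Power BI dataset could refresh from. The connection string, query, and table are placeholders, not a PwC standard.

```python
# Sketch: reusable extract script feeding a Power BI refresh.
# Connection string, query, and output path are placeholders.
import pandas as pd
from sqlalchemy import create_engine

def extract_to_csv(conn_str: str, query: str, out_path: str) -> int:
    """Run a SQL query and land the result as CSV.

    Returns the number of rows written so callers can log/alert on it.
    """
    engine = create_engine(conn_str)
    df = pd.read_sql(query, engine)
    df.to_csv(out_path, index=False)
    return len(df)

if __name__ == "__main__":
    rows = extract_to_csv(
        "mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server",
        "SELECT region, SUM(amount) AS total FROM sales GROUP BY region",
        "sales_by_region.csv",
    )
    print(f"wrote {rows} rows")
```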

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Senior Data Engineering Architect at Iris Software, you will play a crucial role in leading enterprise-level data engineering projects on public cloud platforms like AWS, Azure, or GCP. Your responsibilities will include engaging with client managers to understand their business needs, conceptualizing solution options, and finalizing strategies with stakeholders. You will also be involved in team building, delivering Proofs of Concept (PoCs), and enhancing competencies within the organization.

Your role will focus on building competencies in Data & Analytics, including Data Engineering, Analytics, Data Science, AI/ML, and Data Governance. Staying updated with the latest tools, best practices, and trends in the Data and Analytics field will be essential to drive innovation and excellence in your work.

To excel in this position, you should hold a Bachelor's or Master's degree in a software discipline and have extensive experience in data architecture and in implementing large-scale Data Lake/Data Warehousing solutions. Your background in Data Engineering should demonstrate leadership in solutioning, architecture, and successful project delivery. Strong communication skills in English, both written and verbal, are essential for effective collaboration with clients and team members. Proficiency in tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, and Snowflake, and in databases, along with programming skills in Spark, Spark SQL, PySpark, and Python, are mandatory competencies for this role.

Joining Iris Software offers a range of perks and benefits designed to support your financial, health, and overall well-being. From comprehensive health insurance and competitive salaries to flexible work arrangements and continuous learning opportunities, we are dedicated to providing a supportive and rewarding work environment where your success and happiness are valued. If you are inspired to grow your career in Data Engineering and thrive in a culture that values talent and personal growth, Iris Software is the place for you. Be part of a dynamic team where you can be valued, inspired, and encouraged to be your best professional and personal self.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

Adient is a leading global automotive seating supplier, supporting all major automakers in the differentiation of their vehicles through superior quality, technology, and performance. We are seeking a Sr. Data Analytics Lead to help build Adient's data and analytics foundation, directly benefiting our internal business units and our consumers. You are self-motivated and data-curious, especially about how data can be used to optimize business opportunities.

In this role you will own projects end-to-end, from conception to operationalization, demonstrating your comprehensive understanding of the full data product development lifecycle. You will employ various analytical techniques to solve complex problems, drive scalable cloud data architectures, and deliver data products that enhance decision making across the organization. You will also own the technical support for released applications being used by internal Adient teams, including the daily triage of problem tickets and change requests, and you will have 2-3 developer direct reports to accommodate this support as well as new development. The successful candidate can lead medium- to large-scale analytics projects requiring minimal direction, is highly proficient in SQL and cloud-based technologies, has good communication skills, takes the initiative to explore and tackle problems, and is an effective people leader.

The ideal candidate will be working within Adient's Advanced Analytics team. You will be part of an empowered, highly capable team collaborating with Business Relationship Managers, Product Owners, Data Engineers, Production Support, and Visualization Developers within multiple business units to understand the data analytics needs and translate those requirements into world-class solution architectures. You will lead and mentor a team of solution architects to research, analyze, implement, and support scalable data product solutions that power Adient's analytics across the enterprise and deliver on business priorities.

Key responsibilities include: owning technical support for released internal analytics applications, including the daily triage of problem tickets and change requests; leading development and execution of reporting and analytics products to enable data-driven business decisions that drive performance and lead to the accomplishment of annual goals; leading, hiring, developing, and evolving the Analytics team and providing technical direction with the support of other leads and architects; understanding the road ahead and ensuring the team has the skills and tools necessary to succeed; driving the team to develop operationally efficient analytic solutions; managing resources/budget and partnering with functional and business teams; and advocating sound software development practices to help develop and evangelize great engineering and organizational practices. You will lead the team that designs and builds highly scalable data pipelines using new-generation tools and technologies like Azure, Snowflake, Spark, Databricks, SQL, and Python to ingest data from various systems (a minimal pipeline sketch follows this posting). You will work with product owners to ensure priorities are understood and direct the team to support the vision of the larger Analytics organization, translate complex business problem statements into analysis requirements, and work with internal customers to define data product details based on expressed partner needs.

You will work closely with business and technical teams to deliver enterprise-grade datasets that are reliable, flexible, scalable, and provide low cost of ownership. You will develop SQL queries and data visualizations to fulfill internal customer application reporting requirements, as well as ad hoc analysis requests, using tools such as Power BI. You will thoroughly document business requirements, data architecture solutions, and processes for business and technical audiences; serve as a domain specialist on data and business processes within your area of focus; find solutions to operational or data issues in the data pipelines; and grow the technical ability of the team.

QUALIFICATIONS
- Bachelor's degree or equivalent with 8+ years of experience in data engineering, computer science, or statistics, with at least 2+ years of experience in leadership/management.
- Experience in developing Big Data cloud-based applications using the following technologies: SQL, Azure, Snowflake, Power BI.
- Experience building complex ADF data pipelines and Data Flows to ingest data from on-prem sources, transform, and sink into Snowflake; good understanding of ADF pipelining Activities.
- Familiar with various Azure connectors to establish on-prem data-source connectivity, as well as Snowflake data-warehouse connectivity over a private network.
- Lead/work with hybrid teams and communicate effectively, both written and verbal, with technical and non-technical multi-functional teams.
- Translate complex business requirements into scalable technical solutions meeting data warehousing design standards, with a solid understanding of analytics needs and the proactiveness to build generic solutions that improve efficiency.
- Experience with data visualization and dashboarding techniques to make complex data more accessible, understandable, and usable to drive business decisions and outcomes; efficient in Power BI.
- Extensive experience in data architecture, defining and maintaining data assets, and developing data architecture strategies to support reporting and data visualization tools.
- Understands common analytical data models like Kimball and ensures physical data models align with best practice and requirements.
- Thrives in a dynamic environment, keeping composure and a positive attitude.
- A plus if your experience was in distribution or manufacturing organizations.

PREFERRED
- Experience with the Snowflake cloud data warehouse.
- Experience with Azure PaaS services.
- Experience with T-SQL, SQL Server, Azure SQL, Snowflake SQL, Oracle SQL.
- Experience with Azure Storage account connectivity.
- Experience developing visualizations with Power BI and BusinessObjects.
- Experience with Databricks.
- Experience with ADLS Gen2.
- Experience with Azure VNet private endpoints on a private network.
- Proficient with Spark and Python.
- Advanced proficiency in SQL: joining multiple data sets across different data grains, query optimization, pivoting data.
- MS Azure certifications.
- Snowflake certifications.
- Experience with other leading commercial cloud platforms like AWS.
- Experience with installing and configuring ODBC and JDBC drivers on Windows.
- Candidate resides in the Plymouth, MI area.

PRIMARY LOCATION
Pune Tech Center.
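To make the pipeline work above concrete, here is a minimal sketch of a Databricks PySpark job writing a transformed dataset to Snowflake via the Spark-Snowflake connector. All paths, table names, and connection values are placeholders; in practice credentials would come from a secret scope.

```python
# Sketch: landing a transformed DataFrame in Snowflake from Databricks using
# the Spark-Snowflake connector. All connection values are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adient-pipeline-sketch").getOrCreate()

orders = spark.read.format("delta").load("/mnt/raw/orders")   # hypothetical path

daily = (
    orders.groupBy("plant_id", F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("quantity").alias("units"))
)

sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<secret>",        # in practice: pull from a secret scope
    "sfDatabase": "ANALYTICS",
    "sfSchema": "SUPPLY",
    "sfWarehouse": "ETL_WH",
}

(daily.write.format("snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_PLANT_UNITS")
      .mode("overwrite")
      .save())
```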

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Analytics Lead at Cummins Inc., you will be responsible for facilitating data, compliance, and environment governance processes for the assigned domain. Your role includes leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles.

You will coach team members, business teams, and stakeholders to find necessary and relevant data, contribute to communities of practice promoting responsible analytics use, and develop the capability of peers and team members within the Analytics Ecosystem. Additionally, you will mentor and review the work of less experienced team members, integrate data from various source systems to build models for business use, and cleanse data to ensure accuracy and reduce redundancy (a minimal cleansing sketch follows this posting).

Your responsibilities will also involve leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps. You will manage version control and collaboration using GitLab, utilize SharePoint for project management and data collaboration, and provide regular updates on work progress via Jira/Meets to stakeholders.

Qualifications:
- College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required.
- This position may require licensing for compliance with export controls or sanctions regulations.

Competencies:
- Balancing stakeholders
- Collaborating effectively
- Communicating clearly and effectively
- Customer focus
- Managing ambiguity
- Organizational savvy
- Data Analytics
- Data Mining
- Data Modeling
- Data Communication and Visualization
- Data Literacy
- Data Profiling
- Data Quality
- Project Management
- Valuing differences

Technical Skills:
- Advanced Python
- Databricks, PySpark
- Advanced SQL, ETL tools
- Power Automate
- Power Apps
- SharePoint
- GitLab
- Power BI
- Jira
- Mendix
- Statistics

Soft Skills:
- Strong problem-solving and analytical abilities
- Excellent communication and stakeholder management skills
- Proven ability to lead a team
- Strategic thinking
- Advanced project management

Experience:
- Intermediate level of relevant work experience required.
- This is a hybrid role.

Join Cummins Inc. and be part of a dynamic team where you can utilize your technical and soft skills to make a significant impact in the field of data analytics.
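Here is a minimal sketch of the cleansing/redundancy-reduction step mentioned above, in PySpark; the source path and column names are hypothetical.

```python
# Sketch: a cleansing step that standardizes values and reduces redundancy.
# Source path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse-sketch").getOrCreate()

src = spark.read.parquet("/mnt/landing/suppliers")

clean = (
    src.withColumn("supplier_name", F.trim(F.initcap("supplier_name")))
       .withColumn("country", F.upper(F.col("country")))
       .na.drop(subset=["supplier_id"])                 # key must be present
       .dropDuplicates(["supplier_id"])                 # reduce redundancy
)

clean.write.mode("overwrite").parquet("/mnt/curated/suppliers")
```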

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and ETL processes (a minimal sketch follows this posting).
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
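For illustration, here is a minimal sketch of a Databricks ETL step of the kind this role involves: read a raw file, apply simple transformations, and land a Delta table. Paths and names are hypothetical.

```python
# Sketch: a minimal Databricks ETL cell. Paths and names are hypothetical.
# `spark` is provided automatically in a Databricks notebook.
from pyspark.sql import functions as F

raw = (spark.read.option("header", True)
            .csv("/mnt/raw/transactions.csv"))

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)                 # basic validation rule
       .withColumn("ingest_date", F.current_date())
)

# Delta is the default table format on Databricks.
curated.write.format("delta").mode("append").saveAsTable("curated.transactions")
```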

Posted 2 days ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Program Analyst
Job #: req33911
Organization: World Bank
Sector: General Services
Grade: GE
Term Duration: 2 years 0 months
Recruitment Type: Local Recruitment
Location: Chennai, India
Required Language(s): English
Preferred Language(s):
Closing Date: 8/5/2025 (MM/DD/YYYY) at 11:59pm UTC

Description
Do you want to build a truly worthwhile career? Working at the World Bank Group provides a unique opportunity for you to help our clients solve their greatest development challenges. The World Bank Group is one of the largest sources of funding and knowledge for developing countries; a unique global partnership of five institutions dedicated to ending extreme poverty, increasing shared prosperity, and promoting sustainable development. With 189 member countries and more than 120 offices worldwide, we work with public and private sector partners, investing in groundbreaking projects and using data, research, and technology to develop solutions to the most urgent global challenges. For more information, visit www.worldbank.org.

Global Corporate Solutions
Reporting to the Managing Director and World Bank Group Chief Administrative Officer, Global Corporate Solutions (GCS) brings together the functions of Corporate Security, Corporate Real Estate, and Corporate Services.

About The Unit
The Corporate Services (GCSCS) division within GCS provides services to the WBG in the areas of Travel and Visa Services; Food and Conference Services; Staff Services, including Commuter Services, Child Care, and Fitness Center; Mail and Shipping Services; the Art Program; Translation and Interpretation; Customer Service; Design and Publications; Printing; and Interactive Media. GCSCS also provides administrative oversight to the WBG Family Network and 1818 Society and is responsible for setting the policy framework and service standards, and for delivering services through a combination of staff and vendors at WBG headquarters (HQ) in Washington, DC and in Country Offices. To achieve its purpose, GCSCS is structured into three main units: (i) Travel and Client Services (GCSTC), (ii) Business Services (GCSBA), and (iii) Innovation and Client Solutions (GCSIS). GCSIS includes the GCS Service Desk and Processing & Analytics team in Chennai, India.

Job Summary
We are seeking a skilled and motivated Program Analyst to join our team in Chennai, India. Reporting to the Senior Program Manager, GCSIS, this role will support a small but dynamic data analytics team dedicated to supporting GCS and its clients. The ideal candidate will have expertise in analyzing large datasets, transforming complex data, and building insightful dashboards. This role will focus on data analysis, automation, and dashboard development using Power BI, Tableau, Power Automate, and other AI/ML tools. Strong analytical skills, attention to detail, and the ability to effectively communicate findings are essential for success in this position. If you’re a data-driven professional with a passion for problem-solving, we’d love to hear from you!

Key Responsibilities
- Collaborate with stakeholders to understand reporting and analytical needs, translating business requirements into technical solutions.
- Extract, clean, and prepare data from multiple sources for analysis and reporting using Power Query and Tableau Prep Builder (a pandas analogue is sketched after this posting).
- Ensure data integrity, accuracy, and consistency through effective governance and quality checks.
- Analyze large datasets to identify trends, extract insights, and support business decision-making.
- Design, develop, and maintain interactive dashboards and reports using Power BI and Tableau.
- Present insights to stakeholders through clear and compelling visualizations and reports.
- Create and maintain documentation for dashboards, data sources, and automation workflows.
- Optimize and streamline reporting processes for efficiency and scalability.
- Automate workflows using Power Automate, enhancing efficiency across data-related processes.
- Work with Natural Language Processing (NLP) models to analyze unstructured text data.
- Build custom business applications using Power Apps.
- Apply Generative AI tools to support data analysis, automation, and reporting.
- Stay up to date with industry trends and best practices in data analytics and business intelligence.

Selection Criteria
- Bachelor’s degree in Data Science, Computer Science, Business Analytics, Statistics, or a related field.
- At least 3 years of experience in data analysis, reporting, or business intelligence roles.
- Proven expertise building dashboards and reports in Power BI and Tableau.
- Proficiency in M code and DAX for data modeling and calculations.
- Advanced Excel skills, including Power Query, Power Pivot, complex formulas, and VBA (preferred).
- Hands-on experience with Power Automate or Zapier for workflow automation.
- Understanding of Generative AI and its applications in data analysis.
- Excellent problem-solving, analytical, and critical-thinking skills.
- Meticulous attention to detail and accuracy.
- Ability to work independently and take initiative.
- High level of personal motivation and eagerness to learn.
- Strong organizational skills with the ability to manage multiple tasks and deadlines.
- Excellent oral and written communication skills, capable of conveying complex issues concisely.
- Willingness to work in a schedule that overlaps with Washington, DC business hours.

Preferred Qualifications
- Background in business intelligence, finance, or operations analytics.
- Experience with Power Apps.
- Experience applying Natural Language Processing (NLP) techniques to analyze unstructured text data (e.g., survey responses, emails, customer reviews).
- Familiarity with data warehousing platforms (e.g., Azure, AWS, Databricks, Snowflake).
- Proficiency with Python and R for data analysis and modeling.
- Knowledge of machine learning and AI-driven analytics.
- Prior experience working with cross-functional teams in a corporate setting.

General Competencies
- Initiative: volunteers to undertake tasks that stretch his or her capability.
- Flexibility: demonstrates the ability to adapt plans, tasks, and resources to meet objectives and/or work with others.
- Analytical Research and Writing: able to undertake analytical research on topics requested by others; shares findings with colleagues and other relevant parties.
- Client Orientation: takes personal responsibility and accountability for timely response to client queries, requests, or needs, working to remove obstacles that may impede execution or overall success.
- Drive for Results: takes personal ownership and accountability to meet deadlines and achieve agreed-upon results, and has the personal organization to do so.
- Teamwork, Collaboration and Inclusion: collaborates with other team members and colleagues across units and contributes productively to the work and outputs of the team, as well as those of partners and stakeholders, demonstrating respect for different points of view.
- Growth-mindset and Agile: proactively action-oriented and outcome-focused; proposes and implements strategic and practical adjustments to ensure optimal client service and maximum impact.
- Knowledge, Learning and Communication: actively seeks the knowledge needed to complete assignments and shares knowledge with others, communicating and presenting information in a clear, accurate, and organized manner with exceptional attention to detail.
- Business Judgment and Analytical Decision Making: analyzes facts and data to support sound, logical decisions regarding own and others' work.

WBG Culture Attributes
- Sense of Urgency: anticipating and quickly reacting to the needs of internal and external stakeholders.
- Thoughtful Risk Taking: taking informed and thoughtful risks and making courageous decisions to push boundaries for greater impact.
- Empowerment and Accountability: engaging with others in an empowered and accountable manner for impactful results.

World Bank Group Core Competencies
The World Bank Group offers comprehensive benefits, including a retirement plan; medical, life and disability insurance; and paid leave, including parental leave, as well as reasonable accommodations for individuals with disabilities. We are proud to be an equal opportunity and inclusive employer with a dedicated and committed workforce, and do not discriminate based on gender, gender identity, religion, race, ethnicity, sexual orientation, or disability. Learn more about working at the World Bank and IFC, including our values and inspiring stories.
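As a hedged illustration of the extract-clean-prepare responsibility, here is a pandas sketch, the Python analogue of the Power Query steps above; the file and column names are hypothetical.

```python
# Sketch: extract-clean-prepare in pandas, feeding a dashboard trend table.
# File name and column names are hypothetical.
import pandas as pd

tickets = pd.read_excel("service_desk_tickets.xlsx")

prepared = (
    tickets
    .dropna(subset=["ticket_id"])                       # integrity check
    .assign(
        opened=lambda d: pd.to_datetime(d["opened"]),
        category=lambda d: d["category"].str.strip().str.title(),
    )
    .drop_duplicates("ticket_id")
)

# Monthly trend table a Power BI / Tableau dashboard could sit on top of.
monthly = (prepared
           .groupby([prepared["opened"].dt.to_period("M"), "category"])
           .size()
           .rename("tickets")
           .reset_index())
print(monthly.head())
```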

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.

Roles & Responsibilities: The Offshore Data Engineer plays a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. Working closely with onshore data architects and analysts, this role ensures high data quality, performance, and reliability across distributed systems. The engineer is expected to demonstrate technical proficiency, proactive problem-solving, and strong collaboration in a remote environment.
- Design and develop robust ETL/ELT pipelines to ingest, transform, and load data from diverse sources.
- Collaborate with onshore teams to understand business requirements and translate them into scalable data solutions.
- Optimize data workflows through automation, parallel processing, and performance tuning (an orchestration sketch follows this posting).
- Maintain and enhance data infrastructure including data lakes, data warehouses, and cloud platforms (AWS, Azure, GCP).
- Ensure data integrity and consistency through validation, monitoring, and exception handling.
- Contribute to data modeling efforts for both transactional and analytical use cases.
- Deliver clean, well-documented datasets for reporting, analytics, and machine learning.
- Proactively identify opportunities for cost optimization, governance, and process automation.

Professional & Technical Skills:
- Programming & Scripting: proficiency in Databricks with SQL and Python for data manipulation and pipeline development.
- Big Data Technologies: experience with Spark, Hadoop, or similar distributed processing frameworks.
- Workflow Orchestration: hands-on experience with Airflow or equivalent scheduling tools.
- Cloud Platforms: strong working knowledge of cloud-native services (AWS Glue, Azure Data Factory, GCP Dataflow).
- Data Modeling: ability to design normalized and denormalized schemas for various use cases.
- ETL/ELT Development: proven experience in building scalable and maintainable data pipelines.
- Monitoring & Validation: familiarity with data quality frameworks and exception handling mechanisms.

Good-to-have skills:
- DevOps & CI/CD: exposure to containerization (Docker), version control (Git), and deployment pipelines.
- Data Governance: understanding of metadata management, lineage tracking, and compliance standards.
- Visualization Tools: basic knowledge of BI tools like Power BI, Tableau, or Looker.
- Machine Learning Support: experience preparing datasets for ML models and feature engineering.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
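Here is a minimal orchestration sketch for the Airflow experience called out above: a three-task extract-transform-load DAG. Task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Sketch: a minimal Airflow DAG orchestrating an extract-transform-load flow.
# Task bodies are placeholders; `schedule` requires Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source system")

def transform():
    print("apply business rules / validation")

def load():
    print("write to the lakehouse")

with DAG(
    dag_id="daily_ingest_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```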

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Roles and responsibilities:
- Design and implement data pipelines for supply chain data (e.g., inventory, shipping, procurement); a minimal feature-engineering sketch follows this posting.
- Develop and maintain data warehouses and data lakes.
- Ensure data quality, integrity, and security.
- Collaborate with supply chain stakeholders to identify analytics requirements.
- Develop data models and algorithms for predictive analytics (e.g., demand forecasting, supply chain optimization).
- Implement data visualization tools (e.g., Tableau, Power BI).
- Integrate data from various sources (e.g., ERP, PLMs, other data sources).
- Develop APIs for data exchange.
- Work with cross-functional teams (e.g., supply chain, logistics, IT).
- Communicate technical concepts to non-technical stakeholders.
- Experience with machine learning algorithms and concepts.
- Knowledge of data governance and compliance.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment.

Technical Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6-8 years of experience in data engineering.
- Proficiency in:
  - Programming languages: Python, Java, SQL, Spark SQL.
  - Data technologies: Hadoop, PySpark, NoSQL databases.
  - Data visualization tools: Qlik Sense, Tableau, Power BI.
  - Cloud platforms: Azure Data Factory, Azure Databricks, AWS.
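For flavor, here is a minimal PySpark sketch of demand-forecasting feature engineering of the kind described above (lag and moving-average features per SKU); paths and column names are hypothetical.

```python
# Sketch: building demand-forecasting features from shipment history.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("demand-features").getOrCreate()

shipments = spark.read.parquet("/lake/shipments")

w = Window.partitionBy("sku").orderBy("week")

features = (
    shipments.groupBy("sku", "week").agg(F.sum("units").alias("demand"))
             .withColumn("demand_lag_1", F.lag("demand", 1).over(w))
             .withColumn("demand_ma_4",                      # 4-week moving average
                         F.avg("demand").over(w.rowsBetween(-3, 0)))
)

features.write.mode("overwrite").parquet("/lake/features/demand")
```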

Posted 2 days ago

Apply

2.0 - 10.0 years

0 Lacs

India

Remote

Pay Range: ₹400-500/hour
Location: Remote (India)
Mode: One-to-One Sessions Only (No batch teaching)

We are hiring a Part-Time PySpark/Databricks Tutor who can deliver personalized, one-on-one online sessions to college- and university-level students. The ideal candidate should have hands-on experience in big data technologies, particularly PySpark and Databricks, and should be comfortable teaching tools and techniques commonly used in computer science and data engineering.

Key Responsibilities:
- Deliver engaging one-to-one remote tutoring sessions focused on PySpark, Apache Spark, Databricks, and related tools (a sample lesson snippet follows this posting).
- Teach practical use cases, project implementation techniques, and hands-on coding for real-world applications.
- Adapt teaching style to individual student levels, from beginner to advanced.
- Provide support with assignments, project work, and interview preparation.
- Ensure clarity in communication and foster an interactive learning environment.

Required Skills & Qualifications:
- Experience: 2 to 10 years in big data, data engineering, or related roles using PySpark and Databricks.
- Education: Bachelor's or Master's degree in Computer Science, Data Science, or a relevant field.
- Strong English communication skills, both verbal and written.
- Familiarity with Spark SQL, Delta Lake, notebooks, and data pipelines.
- Ability to teach technical concepts with simplicity and clarity.

Job Requirements:
- Freshers with strong knowledge and teaching ability may also apply.
- Must have a personal laptop and a stable Wi-Fi connection.
- Must be serious and committed to long-term part-time work.
- Candidates who have applied before should not reapply.

💡 Note: This is a remote, part-time opportunity, and sessions will be conducted one-to-one, not in batch format. This role is ideal for professionals, freelancers, or educators passionate about sharing knowledge.

📩 Apply now only if you agree with the pay rate (₹400-500/hr) and meet the listed criteria. Let's inspire the next generation of data engineers!
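As an example of the material such sessions might open with, here is a first-lesson PySpark snippet covering DataFrame creation, a transformation, and an aggregation; the sample data is invented for the lesson.

```python
# Sketch: a first one-to-one PySpark lesson: create a DataFrame,
# transform it, and aggregate it. Sample data is invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lesson-1").getOrCreate()

df = spark.createDataFrame(
    [("math", 88), ("math", 72), ("physics", 95), ("physics", 61)],
    ["subject", "score"],
)

df_passed = df.filter(F.col("score") >= 70)          # transformation (lazy)
summary = df_passed.groupBy("subject").agg(          # computed at show()
    F.count("*").alias("n_passed"),
    F.avg("score").alias("avg_score"),
)
summary.show()
```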

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
Data Engineer, Chennai

We’re seeking a highly motivated Data Engineer to join our agile, cross-functional team and drive end-to-end data pipeline development in a cloud-native, big data ecosystem. You’ll leverage ETL/ELT best practices and data lakehouse paradigms to deliver scalable solutions. Proficiency in SQL, Python, Spark, and modern data orchestration tools (e.g. Airflow) is essential, along with experience in CI/CD, DevOps, and containerized environments like Docker and Kubernetes. This is your opportunity to make an impact in a fast-paced, data-driven culture.

Responsibilities
- Responsible for data pipeline development and maintenance.
- Contribute to development, maintenance, testing strategy, design discussions, and operations of the team (a unit-testing sketch follows this posting).
- Participate in all aspects of agile software development including design, implementation, and deployment.
- Responsible for the end-to-end lifecycle of new product features/components.
- Ensure application performance, uptime, and scale, maintaining high standards of code quality and thoughtful application design.
- Work with a small, cross-functional team on products and features to drive growth.
- Learn new tools, languages, workflows, and philosophies to grow.
- Research and suggest new technologies for boosting the product.
- Have an impact on product development by making important technical decisions, influencing the system architecture, development practices, and more.

Qualifications
- Excellent team player with strong communication skills.
- B.Sc. in Computer Sciences or similar.
- 3-5 years of experience in data pipeline development.
- 3-5 years of experience in PySpark/Databricks.
- 3-5 years of experience in Python/Airflow.
- Knowledge of OOP and design patterns.
- Knowledge of server-side technologies such as Java and Spring.
- Experience with Docker containers, Kubernetes, and cloud environments.
- Expertise in testing methodologies (unit testing, TDD, mocking).
- Fluent with large-scale SQL databases.
- Good problem-solving and analysis abilities.

Requirements - Advantage
- Experience with Azure cloud services.
- Experience with Agile development methodologies.
- Experience with Git.

Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us.
We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
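To illustrate the orchestration stack this role names, here is a minimal sketch of an Airflow DAG chaining an ingest step and a Spark job. The DAG id, script paths, and schedule are hypothetical, and it assumes Airflow 2.4+ (older versions use `schedule_interval` instead of `schedule`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal daily pipeline: ingest raw data, then transform it with Spark.
with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/jobs/ingest_orders.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="transform_orders",
        bash_command="spark-submit /opt/jobs/transform_orders.py --date {{ ds }}",
    )

    ingest >> transform  # transform runs only after ingestion succeeds
```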

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas—Oncology, Inflammation, General Medicine, and Rare Disease—we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications and Experience:
Master’s or Bachelor’s degree and 5 to 9 years of Computer Science, IT, or related field experience

Functional Skills:
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL); workflow orchestration; performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
Excellent problem-solving skills and the ability to work with large, complex datasets
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
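As a rough illustration of the pipeline-plus-quality-gate pattern this posting describes, here is a minimal PySpark sketch. The paths, column names, and quality rules are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("patients-etl").getOrCreate()

# Extract from the landing zone (path hypothetical)
raw = spark.read.parquet("/mnt/landing/patients/")

# Quality gate: fail fast on duplicate or null business keys
dupes = raw.groupBy("patient_id").count().filter("count > 1").count()
nulls = raw.filter(F.col("patient_id").isNull()).count()
if dupes or nulls:
    raise ValueError(f"Quality check failed: {dupes} duplicate keys, {nulls} null keys")

# Transform and load into the curated zone as Delta
curated = raw.withColumn("load_ts", F.current_timestamp())
curated.write.format("delta").mode("append").save("/mnt/curated/patients/")
```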

Posted 2 days ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description:
We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing, and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect From You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Bachelor’s degree and 2 to 4 years of Computer Science, IT, or related field experience OR Diploma and 4 to 7 years of Computer Science, IT, or related field experience

Preferred Qualifications:
Functional Skills:
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, and Snowflake; workflow orchestration; performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
Proficiency in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development

Good-to-Have Skills:
Experience with data modeling and performance tuning on relational and graph databases (e.g., Marklogic, Allegrograph, Stardog, RDF Triplestore)
Understanding of data modeling, data warehousing, and data integration concepts
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing

Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certification preferred

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting

As an Associate Data Engineer at Amgen, you will be involved in the development and maintenance of data infrastructure and solutions. You will collaborate with a team of data engineers to design and implement data pipelines, perform data analysis, and ensure data quality. Your strong technical skills, problem-solving abilities, and attention to detail will contribute to the effective management and utilization of data for insights and decision-making.
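For a sense of the SQL-on-Spark work described above, a common curation pattern is keeping only the latest record per business key. A minimal sketch follows; it assumes the tables are already registered in the catalog, and the table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-dedup").getOrCreate()

# Keep only the most recent version of each claim
latest = spark.sql("""
    SELECT *
    FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (PARTITION BY claim_id
                                  ORDER BY updated_at DESC) AS rn
        FROM raw.claims c
    )
    WHERE rn = 1
""").drop("rn")

latest.write.mode("overwrite").saveAsTable("curated.claims_latest")
```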

Posted 2 days ago

Apply

15.0 years

0 Lacs

India

On-site

Job Summary
As part of the data leadership team, the Capability Lead – Databricks will be responsible for building, scaling, and delivering Databricks-based data and AI capabilities across the organization. This leadership role involves technical vision, solution architecture, team building, partnership development, and delivery excellence using the Databricks Unified Analytics Platform across industries. The individual will collaborate with clients, alliance partners (Databricks, Azure, AWS), internal stakeholders, and sales teams to drive adoption of lakehouse architectures, data engineering best practices, and AI/ML modernization.

Areas of Responsibility
1. Offering and Capability Development:
Develop and enhance Databricks-based data platform offerings and accelerators
Define best practices, architectural standards, and reusable frameworks for Databricks
Collaborate with alliance teams to strengthen the partnership with Databricks

2. Technical Leadership:
Provide architectural guidance for Databricks solution design and implementation
Lead solutioning efforts for proposals, RFIs, and RFPs involving Databricks
Conduct technical reviews and ensure adherence to design standards
Act as a technical escalation point for complex project challenges

3. Delivery Oversight:
Support delivery teams with technical expertise across Databricks projects
Drive quality assurance, performance optimization, and project risk mitigation
Review project artifacts and ensure alignment with Databricks best practices
Foster a culture of continuous improvement and delivery excellence

4. Talent Development:
Build and grow a high-performing Databricks capability team
Define skill development pathways and certification goals for team members
Mentor architects, developers, and consultants on Databricks technologies
Drive community-of-practice initiatives to share knowledge and innovations

5. Business Development Support:
Engage with sales and pre-sales teams to position Databricks capabilities
Contribute to account growth by identifying new Databricks opportunities
Participate in client presentations, workshops, and technical discussions

6. Thought Leadership and Innovation:
Build thought leadership through whitepapers, blogs, and webinars
Stay updated with Databricks product enhancements and industry trends

This role is highly collaborative and will work closely with cross-functional teams to fulfill the above responsibilities.

Job Requirements:
12–15 years of experience in data engineering, analytics, and AI/ML
3–5 years of strong hands-on experience with Databricks (on Azure, AWS, or GCP)
Expertise in Spark (PySpark/Scala), Delta Lake, Unity Catalog, MLflow, and Databricks notebooks
Experience designing and implementing lakehouse architectures at scale
Familiarity with data governance, security, and compliance frameworks (GDPR, HIPAA, etc.)
Experience with real-time and batch data pipelines (Structured Streaming, Auto Loader, Kafka, etc.)
Strong understanding of MLOps and AI/ML lifecycle management
Certifications in Databricks (e.g., Databricks Certified Data Engineer Professional, ML Engineer Associate) are preferred
Experience with hyperscaler ecosystems (Azure Data Lake, AWS S3, GCP GCS, ADF, Glue, etc.)
Experience managing large, distributed teams and working with CXO-level stakeholders
Strong problem-solving, analytical, and decision-making skills
Excellent verbal, written, and client-facing communication skills
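As context for the Auto Loader and Structured Streaming experience listed above, a minimal incremental-ingestion sketch might look like the following. The `cloudFiles` source is Databricks-specific, `availableNow` triggers require a recent runtime, and all paths are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("autoloader-bronze").getOrCreate()

# Incrementally ingest files as they land, using the Databricks-only
# "cloudFiles" (Auto Loader) source.
stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/_schemas/events")
    .load("/mnt/landing/events/")
)

(
    stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/_checkpoints/events")
    .trigger(availableNow=True)   # incremental, batch-style run
    .start("/mnt/bronze/events/")
)
```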

Posted 2 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Tech Data Engineer
Location: Hyderabad/Pune
Experience: 6 years

Role Description
This is a contract role for a Tech Data Engineer with 6 years of experience. The position is on-site and located in Hyderabad. The Tech Data Engineer will be responsible for managing data center operations, troubleshooting issues, cabling, and analyzing data. Daily tasks include ensuring data integrity, performing maintenance on data systems, and supporting the team with clear communication and problem-solving skills.

• Transform data into valuable insights that inform business decisions, making use of our internal data platforms and applying appropriate analytical techniques
• Design, model, develop, and improve data pipelines and data products
• Engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using data platform infrastructure effectively
• Develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems
• Ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements
• Build observability into our solutions, monitor production health, help to resolve incidents, and remediate the root cause of risks and issues
• Understand, represent, and advocate for client needs

6+ years of experience, including:
• A comprehensive understanding of, and the ability to apply, data engineering techniques, from event streaming and real-time analytics to computational grids and graph processing engines
• Curiosity to learn new technologies and practices, reuse strategic platforms and standards, evaluate options, and make decisions with long-term sustainability in mind
• Strong command of at least one language among Python, Java, and Golang
• Understanding of data management and database technologies, including SQL/NoSQL
• Understanding of data products, data structures, and data manipulation techniques, including classification, parsing, and pattern matching
• Experience with Databricks, ADLS, Delta Lake/Tables, and ETL tools would be an asset
• Good understanding of engineering practices and the software development lifecycle
• Enthusiastic, self-motivated, and client-focused
• Strong communicator, from making presentations to technical writing
• Bachelor’s degree in a relevant discipline or equivalent experience

Qualifications
Strong analytical skills and troubleshooting abilities
Experience in cabling and data center operations
Excellent communication skills
Ability to work effectively on-site in Hyderabad
Relevant certifications such as Cisco Certified Network Associate (CCNA) or similar are a plus
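To ground the parsing and pattern-matching skills mentioned above, here is a minimal PySpark sketch that extracts structured fields from raw log lines. The log format and paths are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("log-parsing").getOrCreate()

# Raw lines like: "2024-05-01T10:00:00Z ERROR payment gateway timeout"
LOG_PATTERN = r"^(\S+)\s+(\w+)\s+(.*)$"

logs = spark.read.text("/data/app-logs/")
parsed = logs.select(
    F.regexp_extract("value", LOG_PATTERN, 1).alias("ts"),
    F.regexp_extract("value", LOG_PATTERN, 2).alias("level"),
    F.regexp_extract("value", LOG_PATTERN, 3).alias("message"),
)

# Classify volume by severity as a quick health signal
parsed.groupBy("level").count().orderBy(F.desc("count")).show()
```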

Posted 2 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We’re hiring passionate builders to shape the future of industrial intelligence.

Azure Cloud Engineer
Experience: 5+ years managing cloud infrastructure, preferably in Azure
Location: Indore/Pune

Job Description
We are seeking an experienced and proactive Azure Cloud Engineer to join our cloud infrastructure team. The ideal candidate will be responsible for designing, implementing, managing, and optimizing Azure cloud solutions, ensuring high availability, security, and performance of our cloud-based systems. This role will involve close collaboration with DevOps, security, application development, and operations teams.

Key Duties and Tasks
Design, deploy, and manage Azure infrastructure using best practices (IaaS, PaaS, containers, serverless)
Implement and maintain Azure services such as VMs, VNets, Azure AD, Storage, AKS, App Services, Functions, Event Grid, Logic Apps, etc.
Automate infrastructure provisioning using ARM templates, Bicep, or Terraform
Develop and manage CI/CD pipelines using Azure DevOps, GitHub Actions, or other DevOps tools
Ensure cloud security posture by implementing RBAC, NSGs, firewalls, policies, and identity protection
Monitor system performance, health, and costs using Azure Monitor, Log Analytics, and Cost Management
Troubleshoot and resolve issues related to cloud infrastructure and deployments
Stay current with Azure features and best practices and propose improvements or migrations as needed

Qualifications and Skills Required
5+ years of experience managing cloud infrastructure, preferably in Azure
Strong hands-on experience with:
Azure Compute (VMs, Scale Sets, Functions)
Azure Networking (VNet, Load Balancers, VPN Gateway, ExpressRoute)
Azure Identity (Azure AD, RBAC, Managed Identities)
Azure Storage and Databases
Azure Kubernetes Service (AKS) or containers (Docker)
Experience with infrastructure as code (Terraform, Bicep, or ARM templates)
Knowledge of CI/CD and DevOps principles
Scripting in PowerShell, Bash, or Python
Familiarity with monitoring/logging tools like Azure Monitor, Application Insights, or Prometheus/Grafana
Experience with Git-based version control systems

Technical Skills
Proven experience in security architecture and in designing, building, and deploying secure cloud workloads
Expertise in IaC with Terraform, plus scripting and tooling (Git, PowerShell, Jenkins, Python, Bash)
Experience in a DevOps environment with knowledge of continuous integration, containers, and DAST/SAST tools
Strong knowledge of security technologies, identity and access management, and containerized security models
Experience with monitoring and alerting solutions for critical infrastructure

Good to have: Experience with distributed systems, Linux, CDNs, HTTP, TCP/IP basics, database and SQL skills, REST APIs, microservices-based development, and automation experience with Kubernetes and Docker. Experience with hybrid cloud setups or migrations from on-prem to Azure. Familiarity with governance tools like Azure Policy, Blueprints, and Cost Management. Exposure to Microsoft Defender for Cloud or Sentinel for security monitoring. Experience with Databricks, Glue, Athena, EMR, Data Lake, and related solutions and services.
Certifications/Licenses
Azure certifications such as AZ-104 (Azure Administrator), AZ-305 (Solutions Architect), or AZ-400 (DevOps Engineer)

Education
Bachelor's degree in Computer Science, Information Technology, or a related field
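While provisioning here is standardized on ARM/Bicep/Terraform, the same operations are scriptable from Python, which the posting also lists. A minimal sketch with the Azure SDK follows; the subscription ID, resource group name, and tags are hypothetical.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Create (or update) a resource group programmatically.
credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, "<subscription-id>")

rg = client.resource_groups.create_or_update(
    "rg-data-platform-dev",
    {"location": "centralindia", "tags": {"env": "dev", "owner": "platform"}},
)
print(rg.name, rg.properties.provisioning_state)
```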

Posted 2 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We’re hiring passionate builders to shape the future of industrial intelligence.

Cloud Engineer
Experience: 5 years
Location: Indore/Pune

Job Description (Summary of Responsibilities)
Seeking a Cloud Engineer to design, deploy, and manage cloud infrastructure while supporting development teams with scalable solutions. Primary experience with AWS is required; additional experience with GCP/Azure is preferred.

Key Duties
Architectural Design: Lead the design and implementation process for AWS architectures, ensuring alignment with business goals and compliance with security standards. Collaborate with cross-functional teams to provide architectural guidance.
Security Architecture: Take a security-first approach to designing and implementing robust security architectures for AWS solutions. Mitigate security risks and ensure the confidentiality, integrity, and availability of confidential data.
Collaboration: Work closely with cross-functional teams, contributing to the security, development, and optimization of cloud platforms. Collaborate on strategic initiatives, ensuring alignment with cloud strategy and best practices.
Infrastructure as Code (IaC): Design, develop, and maintain scalable, resilient cloud-based infrastructure using an IaC approach.
Terraform/CloudFormation Expertise: Enhance and extend Terraform/CloudFormation configurations for efficient management of AWS resources.
Scripting and Automation: Use expertise in Git, PowerShell, Terraform, Jenkins, Python, and Bash scripting to automate processes and improve efficiency.
DevOps Environment: Work within a DevOps environment, leveraging knowledge of continuous integration, containers, and DAST/SAST tools.
Security Technologies: Apply broad knowledge of the security technology landscape, emphasizing identity and access management, application and data security, and containerized security models.
Monitoring and Alerting Solutions: Implement and optimize monitoring and alerting solutions for critical infrastructure.
Contribution to Platform Architecture: Actively contribute to platform architecture, design discussions, and security initiatives.

Qualifications and Skills Required
5 years of multi-cloud experience with core services
Kubernetes/Docker and networking knowledge and experience
Proficiency in Terraform and scripting (Python/Bash)
Experience with CI/CD tools and cloud migrations
Experience with GitHub

Education
Bachelor's degree in Computer Science, Information Technology, or a related field

Certifications/Licenses
AWS Certified Solutions Architect

Technical Skills
Proven experience in security architecture and a minimum of 5 years designing, building, and deploying secure cloud workloads
Expertise in IaC (Terraform/CloudFormation), plus scripting and tooling (Git, PowerShell, Jenkins, Python, Bash)
Experience in a DevOps environment with knowledge of continuous integration, containers, and DAST/SAST tools
Strong knowledge of security technologies, identity and access management, and containerized security models
Experience with monitoring and alerting solutions for critical infrastructure
Good to have: Experience with distributed systems, Linux, CDNs, HTTP, TCP/IP basics, database and SQL skills, REST APIs, microservices-based development, and automation experience with Kubernetes and Docker. Experience with Databricks, Glue, Athena, EMR, Data Lake, and related solutions and services.
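As a small illustration of the scripting-and-automation work described above, here is a boto3 sketch that audits EC2 instances for a required tag. The region and the tag key are hypothetical.

```python
import boto3

# Audit EC2 instances for a required "owner" tag.
ec2 = boto3.client("ec2", region_name="ap-south-1")

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if "owner" not in tags:
                print(f"{instance['InstanceId']} is missing an 'owner' tag")
```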

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We’re hiring passionate builders to shape the future of industrial intelligence.

Cloud Engineer
Experience: 5 years
Location: Indore/Pune

Job Description (Summary of Responsibilities)
Seeking a Cloud Engineer to design, deploy, and manage cloud infrastructure while supporting development teams with scalable solutions. Primary experience with AWS is required; additional experience with GCP/Azure is preferred.

Key Duties
Architectural Design: Lead the design and implementation process for AWS architectures, ensuring alignment with business goals and compliance with security standards. Collaborate with cross-functional teams to provide architectural guidance.
Security Architecture: Take a security-first approach to designing and implementing robust security architectures for AWS solutions. Mitigate security risks and ensure the confidentiality, integrity, and availability of confidential data.
Collaboration: Work closely with cross-functional teams, contributing to the security, development, and optimization of cloud platforms. Collaborate on strategic initiatives, ensuring alignment with cloud strategy and best practices.
Infrastructure as Code (IaC): Design, develop, and maintain scalable, resilient cloud-based infrastructure using an IaC approach.
Terraform/CloudFormation Expertise: Enhance and extend Terraform/CloudFormation configurations for efficient management of AWS resources.
Scripting and Automation: Use expertise in Git, PowerShell, Terraform, Jenkins, Python, and Bash scripting to automate processes and improve efficiency.
DevOps Environment: Work within a DevOps environment, leveraging knowledge of continuous integration, containers, and DAST/SAST tools.
Security Technologies: Apply broad knowledge of the security technology landscape, emphasizing identity and access management, application and data security, and containerized security models.
Monitoring and Alerting Solutions: Implement and optimize monitoring and alerting solutions for critical infrastructure.
Contribution to Platform Architecture: Actively contribute to platform architecture, design discussions, and security initiatives.

Qualifications and Skills Required
5 years of multi-cloud experience with core services
Kubernetes/Docker and networking knowledge and experience
Proficiency in Terraform and scripting (Python/Bash)
Experience with CI/CD tools and cloud migrations
Experience with GitHub

Education
Bachelor's degree in Computer Science, Information Technology, or a related field

Certifications/Licenses
AWS Certified Solutions Architect

Technical Skills
Proven experience in security architecture and a minimum of 5 years designing, building, and deploying secure cloud workloads
Expertise in IaC (Terraform/CloudFormation), plus scripting and tooling (Git, PowerShell, Jenkins, Python, Bash)
Experience in a DevOps environment with knowledge of continuous integration, containers, and DAST/SAST tools
Strong knowledge of security technologies, identity and access management, and containerized security models
Experience with monitoring and alerting solutions for critical infrastructure
Good to have: Experience with distributed systems, Linux, CDNs, HTTP, TCP/IP basics, database and SQL skills, REST APIs, microservices-based development, and automation experience with Kubernetes and Docker. Experience with Databricks, Glue, Athena, EMR, Data Lake, and related solutions and services.

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next…

From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role
This position will join the Enterprise Data and AI team that supports all brands in the Warner Bros umbrella, including WB films in theatrical and home entertainment, DC Studios, Consumer Products, games, and more. The ideal candidate is a subject matter expert in data science with exposure to predictive modeling, forecasting, recommendation engines, and data analytics. This person will build data pipelines, apply statistical modeling and machine learning, and deliver meaningful insights about customers, products, and business strategy at WBD to drive data-based decisions.

Responsibilities
As a Staff Data Scientist, you will play a critical role in advancing data-driven solutions to complex business challenges, influencing data strategy efforts for WBD businesses. The responsibilities include:
Analyze complex, high volumes of data from various sources using various tools and data analytics techniques.
Partner with stakeholders to understand business questions and provide answers using the most appropriate mathematical techniques.
Model Development and Implementation: Design, develop, and implement statistical models, predictive models, and machine learning algorithms that inform strategic decisions across various business units.
Exploratory Data Analysis: Utilize exploratory data analysis techniques to identify and investigate new opportunities through innovative analytical and engineering methods.
Advanced Analytics Solutions: Collaborate with Product and Business stakeholders to understand business challenges and develop sophisticated analytical solutions.
Data Automation: Advance automation initiatives that reduce the time spent on data preparation, enabling more focus on strategic analysis.
Innovative Frameworks Construction: Develop and enhance frameworks that improve productivity and are intuitive for other data teams to adopt, and stay abreast of innovative machine learning techniques (e.g., deep learning, reinforcement learning, ensemble methods) and emerging AI technologies to stay ahead of industry trends.
Collaborate with data engineering teams to architect and scale robust, efficient data pipelines capable of handling large, complex datasets, ensuring the smooth and automated flow of data from raw collection to insights generation.
Deploy machine learning models into production environments, collaborating with DevOps and engineering teams to ensure smooth integration and scalability.
Quality Assurance: Implement robust systems to detect, alert, and rectify data anomalies.

Qualifications & Experiences
Bachelor’s degree, MS, or greater in Computer/Data Science, Engineering, Mathematics, Statistics, or a related quantitative discipline.
8+ years of relevant experience in Data Science.
Expertise in a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, random forests, deep learning, etc.) and experience with applications of these techniques.
Expertise in advanced statistical techniques and concepts (regressions, statistical tests, etc.) and experience with application of these tools.
A demonstrated track record of utilizing data science to solve business problems in a professional environment.
Expertise in SQL and either Python or R, including experience with application deployment packages like Streamlit or R Shiny.
Experience with database technologies such as Databricks, Snowflake, and others.
Familiarity with BI tools (Power BI, Looker, Tableau) and experience managing workflows in an Agile environment.
Strong analytical and problem-solving abilities.
Excellent communication skills to effectively convey complex data-driven insights to stakeholders.
High attention to detail and capability to work independently in managing multiple priorities under tight deadlines.
Proficiency in big data technologies (e.g., Spark, Kafka, Hive).
Experience working in a cloud environment (AWS, Azure, GCP) to facilitate data solutions.
Ability to collaborate effectively with business partners and develop and maintain productive professional relationships.
Experience with adhering to established data management practices and standards.
Ability to communicate to all levels of business, prioritize and manage assignments to meet deadlines, and establish strong relationships.
Interest in movies, games, and comics is a plus.

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law.

If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
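For a concrete flavor of the predictive-modeling work described, here is a minimal scikit-learn sketch of the train/evaluate loop. The data is synthetic; in practice the feature set and target would come from the pipelines above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a propensity-modeling task (all data hypothetical)
rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the hold-out set
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```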

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary
8-10 years of experience as an Azure Data Engineer with expertise in Databricks and Azure Data Factory
Programming expertise in SQL, Spark, and Python is mandatory
2+ years of experience with medical claims in healthcare and/or managed care is required
Expertise in developing ETL/ELT pipelines for BI/data visualization
Familiarity with normalized, dimensional, star-schema, and snowflake schematic models is mandatory
Prior experience using version control to manage code changes
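As a rough sketch of the dimensional-modeling work this summary calls for, a claims fact table might be conformed against a member dimension like this in PySpark. The paths, keys, and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-star-load").getOrCreate()

# Read the silver-layer claims and the member dimension (paths hypothetical)
claims = spark.read.format("delta").load("/mnt/silver/claims")
dim_member = spark.read.format("delta").load("/mnt/gold/dim_member")

# Star-schema style load: swap the natural key for the dimension's surrogate key
fact_claims = (
    claims.join(dim_member.select("member_id", "member_sk"), "member_id", "left")
          .select(
              "claim_id",
              "member_sk",
              "service_date",
              F.col("billed_amount").cast("decimal(18,2)").alias("billed_amount"),
          )
)

fact_claims.write.format("delta").mode("overwrite").save("/mnt/gold/fact_claims")
```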

Posted 2 days ago

Apply