
3051 Databricks Jobs - Page 14

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 6.0 years

6 - 9 Lacs

Hyderabad

On-site

Source: Glassdoor

Senior Analyst – Data Engineer - Deloitte Technology - Deloitte Support Services India Private Limited

Do you thrive on developing creative and innovative insights to solve complex challenges? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with premier thought leaders in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture?

Work you'll do
We are seeking a candidate with extensive experience in designing, delivering, and maintaining cloud solutions, specifically on Microsoft Azure. This candidate should also possess strong cross-discipline communication skills, strong analytical aptitude and critical thinking, and a solid understanding of how data translates into reporting and dashboarding capabilities, along with the tools and platforms that support them.

Responsibilities
Role specific:
- Design well-structured data models using established methodologies (e.g., Kimball or Inmon) that accurately represent the business requirements, ensure data integrity, and minimize redundancy.
- Develop and implement data pipelines to extract, transform, and load (ETL) data from various sources into Azure data services, using Azure Data Factory, Azure Databricks, or other tools to orchestrate data workflows and data movement.
- Build, test, and run data assets tied to tasks and user stories from the Azure DevOps instance of Enterprise Data & Analytics.
- Bring technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader Enterprise Data & Analytics engineering community.
- Actively participate in regularly scheduled contact calls to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous deliveries from Enterprise Data & Analytics with the Data Insights team.
- Handle break fixes and participate in a rotational on-call schedule; on-call includes monitoring of scheduled jobs and ETL pipelines.
- Actively participate in team meetings to transparently review the status of in-flight projects and their progress.
- Follow standard practice and frameworks on each project, from development through testing to productionizing, each within the appropriate environment laid out by Data Architecture.
- Challenge self and others to make an impact that matters, and help the team connect their contributions with the broader purpose.
- Set expectations for the team, align work to strengths and competencies, and challenge team members to raise the bar while providing support.
- Apply extensive knowledge of multiple technologies, tools, and processes to improve the design and architecture of the assigned applications.

Knowledge sharing / documentation:
- Contribute to, produce, and maintain processes, procedures, and operational and architectural documentation.
- Change control: ensure compliance with processes and adherence to standards and documentation.
- Work with Deloitte Technology leadership and service teams in reviewing documentation and aligning KPIs to critical steps in our service operations.
- Actively participate in ongoing training within the BI space.

The team
At Deloitte, we're all about collaboration.
And nowhere is this more apparent than among our 2,000-strong internal services team. With our combined specialist skills, we provide all the essential support and advice our client-facing colleagues need, right across the firm. This enables them to focus all of their efforts on delivering the best service possible to their clients. Covering seven distinct areas: Human Resources; Clients & Industries; Finance & Legal; Practice Support Services; Quality & Risk Services; IT Services; and Workplace Services & Real Estate, together we live, breathe, and deliver the Deloitte experience.

Location: Hyderabad
Work shift timings: 11 AM to 8 PM

Qualifications
- Bachelor of Engineering / Bachelor of Technology
- 3-6 years of broad-based IT experience with technical knowledge of Microsoft SQL Server, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Data Factory
- Demonstrated experience with the Apache framework (Spark, Scala, etc.)
- Well versed in SQL and comfortable scripting in Python or a similar language

First-month critical outcomes:
- Absorb strategic projects from the backlog and complete the related Azure SQL Data Warehouse development work.
- Inspect existing run-state SQL Server databases and Azure SQL Data Warehouses and identify optimizations for potential development.
- Deliver new databases as assigned.
- Integrate into the on-call rotation (first 90 days).
- Contribute to legacy content and architecture migration to the data lake (first 90 days).
- Deliver the first two data ingestion pipelines, including ingestion, QA, and automation using Azure Big Data tools (first 90 days).
- Document all work following the standard documentation practices set forth by Data Governance (first 90 days).

How you'll grow
At Deloitte, we've invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way, so we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte's culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters.
This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.

#EAG-Technology

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 304653
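The pipeline work this role describes (ADF/Databricks ETL into Azure data services) typically reduces to extract-transform-load steps like the minimal PySpark sketch below; the storage path, column names, and Delta target table are hypothetical, and the Delta write assumes a Databricks/Delta Lake runtime.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch of the kind described above; all names are illustrative.
spark = SparkSession.builder.appName("sales-etl").getOrCreate()

# Extract: read raw CSV files from a (hypothetical) landing zone.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@account.dfs.core.windows.net/sales/"))

# Transform: basic integrity checks and redundancy removal.
clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("amount").isNotNull()))

# Load: write to a Delta table that reporting tools can query.
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.sales_orders"))
```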

Posted 4 days ago

Apply

7.0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

Job Purpose
Lead client calls and guide clients toward optimized, cloud-native architectures, shaping the future state of their data platform through strategic recommendations and Microsoft Fabric integration.

Desired skills and experience
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 7+ years of experience in data and cloud architecture working with client stakeholders
- Azure data platform expertise: Synapse, Databricks, Azure Data Factory (ADF), Azure SQL (DW/DB), Power BI (PBI)
- Ability to define modernization roadmaps and target architecture
- Strong understanding of data governance best practices for data quality, cataloguing, and lineage
- Proven ability to lead client engagements and present complex findings
- Excellent communication skills, both written and verbal
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment
- Experience in project management and team management

Key responsibilities include:
- Lead all interviews and workshops to capture current and future needs.
- Direct the technical review of Azure infrastructure (Databricks, Synapse Analytics, Power BI) and critical on-premises systems.
- Produce architecture designs focused on refined processing strategies and Microsoft Fabric.
- Understand and refine the data governance roadmap, including data cataloguing, lineage, and quality.
- Lead project deliverables, ensuring actionable and strategic outputs.
- Evaluate and ensure the quality of deliverables within project timelines.
- Develop a strong understanding of the equity market domain.
- Collaborate with domain experts and business stakeholders to understand business rules and logic.
- Ensure effective, efficient, and continuous communication, written and verbal, with global stakeholders.
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
- Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries.
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).

Posted 4 days ago

Apply

8.0 years

0 Lacs

Pune

On-site

Source: Glassdoor

Company Description
About Hitachi Solutions India Pvt Ltd: Hitachi Solutions, Ltd., headquartered in Tokyo, Japan, is a core member of the Information & Telecommunication Systems Company of Hitachi Group and a recognized leader in delivering proven business and IT strategies and solutions to companies across many industries. The company provides value-driven services throughout the IT life cycle, from systems planning to systems integration, operation, and maintenance. Hitachi Solutions delivers products and services of superior value to customers worldwide through key subsidiaries in the United States, Europe, China, and India. The flagship company in the Hitachi Group's information and communication system solutions business, Hitachi Solutions also offers solutions for social innovation such as smart cities.

Our Competitive Edge
We work together in a dynamic and rewarding work environment. We have an experienced leadership team, excellent technology and product expertise, and strong relationships with a broad base of customers and partners. We offer a competitive compensation and benefits package, regular performance reviews, performance bonuses, and regular training.

What is it like working here?
We pride ourselves on being industry leaders and providing an enjoyable work environment where our people can grow personally and professionally. Hitachi is the place where people can develop skills they're excited about. The following are our commitments to employees:
- We recognize that our profitability and project success come from our team: great people doing great things. As such, we pursue profitable growth and expanded opportunities for our team.
- We offer challenging and diverse work across multiple industries and reward creativity and entrepreneurial innovation.
- We respect, encourage, and support each individual's need to continually learn and grow personally and professionally. We are committed to fostering our people.
- We listen. Every employee has something important to say that can contribute to enriching our environment.
- We compensate fairly. And while employees might come for the paycheck, they stay for the people. Our people are the reason we are exceptional. This is something we never forget.

Job Description
Power BI Architects are experts in data modeling and analysis and are responsible for developing high-quality datasets and visually stunning reports. They design and develop data models that effectively support business requirements, ensuring the accuracy and reliability of the data presented in dashboards and reports. They are proficient in Power BI Desktop and expert in SQL and DAX. Projects may range from short-term individual client engagements to multiyear delivery engagements with large, blended teams.

Requirements:
- A minimum of 8 years of full-time experience using Power BI Desktop, with extensive knowledge of Power Query, Power Pivot, and Power View
- Able to quickly write SQL for database querying and DAX for creating custom calculations
- Good knowledge of M and VertiPaq Analyzer
- Understanding of data modeling concepts and the ability to create effective data models to support reporting needs
- Ability to perform data ETL processes to ensure that datasets are clean, accurate, and ready for analysis
- Works closely with stakeholders to understand requirements, deliver solutions that meet those needs, and bridge the gap between technical and non-technical audiences
- Unwavering ability to quickly propose solutions by recalling the latest best practices learned from MVP and product team articles, Microsoft documentation, whitepapers, and community publications
- Excellent communication, presentation, influencing, and reasoning skills
- Familiarity with the Azure data platform, e.g., ADLS, SQL Server, ADF, Databricks, etc.

We would like to see a blend of the following technical skills:
- Power BI Desktop, Power BI Dataflows, Tabular Editor, DAX Studio, and VertiPaq Analyzer
- T-SQL, DAX, M, and PowerShell
- Power BI Service architecture design and administration
- Understanding business requirements, designing the data model, and developing visualizations that provide actionable insights
- VertiPaq and Mashup engine knowledge
- Data modeling using the Kimball methodology

Qualifications
Good verbal and written communication. Educational qualification: BE/MCA/any graduation.

Additional Information
Beware of scams: our recruiting team may communicate with candidates via our @hitachisolutions.com domain email address and/or via our SmartRecruiters (applicant tracking system) notification@smartrecruiters.com domain email address regarding your application and interview requests. All offers will originate from our @hitachisolutions.com domain email address. If you receive an offer or information from someone purporting to be an employee of Hitachi Solutions from any other domain, it may not be legitimate.

Posted 4 days ago

Apply

4.0 - 15.0 years

29 - 30 Lacs

Bengaluru

On-site

Source: Glassdoor

Position: Azure Engineer
Location: Bangalore (Hybrid)
Experience: 4 to 15 years
No. of positions: 8
Type: FTE
Budget: 30 LPA
Notice period: immediate to 30-day joiners

Job Description:
- Strong Azure platform knowledge of the Data & AI service portfolio.
- Knowledge and skills to understand how services work and what level of permissions they require; able to design access patterns for services, users, and managed identities to integrate a service into a complex architecture and optimize the access model without redundant permissions.
- Knowledge of Azure identity & access management and security.
- Knowledge of Azure's resource management and quota system.
- Azure landing zone design concepts.
- Working knowledge of ARM, the Azure CLI, and SDKs using Bash/PowerShell/Python.
- Knowledge of Azure CI/CD and DevOps.
- Knowledge of MS Purview and Databricks Unity Catalog.

Job Types: Full-time, Permanent
Pay: ₹2,900,000.00 - ₹3,000,000.00 per year
Benefits: Provident Fund
Schedule: Day shift
Work Location: In person
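On the access-pattern point above, a common approach is to authenticate services with a managed identity instead of embedded secrets. Below is a minimal sketch using the azure-identity and azure-storage-blob libraries; the storage account URL and container name are made up for illustration.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential resolves a managed identity when running on Azure,
# so no connection strings or keys are embedded in the service.
credential = DefaultAzureCredential()

# Hypothetical storage account; RBAC on the container governs what this
# identity may read, which is the "no redundant permissions" idea above.
client = BlobServiceClient(
    account_url="https://examplestorage.blob.core.windows.net",
    credential=credential,
)

container = client.get_container_client("landing")
for blob in container.list_blobs():
    print(blob.name)
```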

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Source: Glassdoor

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: Databricks Engineers

Requirements:
- Total experience: 5-8 years, with 4+ years of relevant experience
- Proficiency on the Databricks platform
- Strong hands-on experience with PySpark, SQL, and Python
- Any cloud: Azure, AWS, GCP
- Certifications (any of the following): Databricks Certified Associate Developer for Apache Spark 3.0 (preferred), Databricks Certified Data Engineer Associate, Databricks Certified Data Engineer Professional
- Location: Bangalore

Mandatory Skill Sets: Databricks, PySpark, SQL, Python, any cloud (Azure, AWS, GCP)
Preferred Skill Sets: related certifications: Databricks Certified Associate Developer for Apache Spark 3.0 (preferred), Databricks Certified Data Engineer Associate, Databricks Certified Data Engineer Professional
Years of Experience Required: 5 to 8 years
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified):
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
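As a rough illustration of the PySpark-plus-SQL proficiency the role asks for, the sketch below registers a DataFrame as a temporary view and aggregates it with Spark SQL; the table and column names are invented, and the setup assumes a Databricks-style environment where the source table already exists.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a (hypothetical) bronze-layer table and expose it to SQL.
orders = spark.read.table("bronze.orders")
orders.createOrReplaceTempView("orders")

# The same transformation logic can then be expressed in plain SQL,
# the PySpark/SQL blend this posting emphasizes.
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS n_orders, SUM(amount) AS revenue
    FROM orders
    WHERE amount IS NOT NULL
    GROUP BY order_date
""")
daily.show()
```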

Posted 4 days ago

Apply

2.0 years

10 Lacs

Bengaluru

On-site

Source: Glassdoor

Location(s): Quay Building, 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: Data Estate (DE)
Job Category: Engineering & Technology
Experience Level: Experienced Hire

At Moody's, we unite the brightest minds to turn today's risks into tomorrow's opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are, with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Skills and Competencies
- Proficiency in Kubernetes and Amazon EKS (2+ years, required): essential for managing containerized applications and ensuring high availability and security in cloud-native environments.
- Strong expertise in AWS serverless technologies (required), including Lambda, API Gateway, EventBridge, and Step Functions, to build scalable and cost-efficient solutions.
- Hands-on experience with Terraform (2+ years, required): critical for managing Infrastructure as Code (IaC) across multiple environments, ensuring consistency and repeatability.
- CI/CD pipeline development using GitHub Actions (required): necessary for automating deployments and supporting agile development practices.
- Scripting skills in Python, Bash, or PowerShell (required): enables automation of operational tasks and enhances infrastructure management capabilities.
- Experience with Databricks and Apache Kafka (preferred): valuable for teams working with data pipelines, MLOps workflows, and event-driven architectures.

Education
Bachelor's degree in Computer Science or equivalent experience.

Responsibilities
- Design, automate, and manage scalable cloud infrastructure using Kubernetes, AWS, Terraform, and CI/CD pipelines.
- Design and manage cloud-native infrastructure using container orchestration platforms, ensuring high availability, scalability, and security across environments.
- Implement and maintain Infrastructure as Code (IaC) using tools like Terraform to provision and manage multi-environment cloud resources consistently and efficiently.
- Develop and optimize continuous integration and delivery (CI/CD) pipelines to automate application and infrastructure deployments, supporting agile development cycles.
- Monitor system performance and reliability by configuring observability tools for logging, alerting, and metrics collection, and proactively address operational issues.
- Collaborate with cross-functional teams to align infrastructure solutions with application requirements, ensuring seamless deployment and performance optimization.
- Document technical processes and architectural decisions through runbooks, diagrams, and knowledge-sharing resources to support operational continuity and team onboarding.

About the team
Our Data Estate DevOps team is responsible for enabling the scalable, secure, and automated infrastructure that powers Moody's enterprise data platform. We ensure the seamless deployment, monitoring, and performance of data pipelines and services that deliver curated, high-quality data to internal and external consumers.

We contribute to Moody's by:
- Accelerating data delivery and operational efficiency through automation, observability, and infrastructure-as-code practices that support near real-time data processing and remediation.
- Supporting data integrity and governance by enabling traceable, auditable, and resilient systems that align with regulatory compliance and GenAI readiness.
- Empowering innovation and analytics by maintaining a modular, interoperable platform that integrates internal and third-party data sources for downstream research models, client workflows, and product applications.

By joining our team, you will be part of exciting work in cloud-native DevOps, data engineering, and platform automation, supporting global data operations across 29 countries and contributing to Moody's mission of delivering integrated perspectives on risk and growth.

Moody's is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law. Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody's Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary.
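To make the serverless stack above concrete, here is a minimal sketch of a Python Lambda handler reacting to an S3 "Object Created" event delivered via EventBridge; the bucket contents and the processing step are illustrative, and the event field names follow the standard S3 EventBridge detail shape.

```python
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # EventBridge delivers S3 object-created notifications with the
    # bucket name and object key under event["detail"].
    detail = event["detail"]
    bucket = detail["bucket"]["name"]
    key = detail["object"]["key"]

    # Illustrative processing: log the object's size from its metadata.
    head = s3.head_object(Bucket=bucket, Key=key)
    print(json.dumps({"key": key, "size": head["ContentLength"]}))
    return {"status": "ok"}
```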

Posted 4 days ago

Apply

3.0 years

7 - 9 Lacs

Bengaluru

On-site

Source: Glassdoor

By clicking the "Apply" button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda's Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description
The Future Begins Here
At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, India's epicenter of innovation, has been selected as home to Takeda's recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement.

At Takeda's ICC We Unite in Diversity
Takeda is committed to creating an inclusive and collaborative workplace where individuals are recognized for their backgrounds and the abilities they bring to our company. We are continuously improving our collaborators' journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity:
As a Data Engineer, you will build and maintain data systems and construct datasets that are easy to analyze and that support business intelligence requirements as well as downstream systems.

Responsibilities:
- Develops and maintains scalable data pipelines and builds out new integrations using AWS-native technologies to support continuing increases in data source, volume, and complexity.
- Collaborates with analytics and business teams to improve data models that feed business intelligence tools and dashboards, increasing data accessibility and fostering data-driven decision making across the organization.
- Implements processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes that depend on it.
- Writes unit/integration/performance test scripts, contributes to the engineering wiki, and documents work.
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
- Works closely with a team of frontend and backend engineers, product managers, and analysts.
- Works with DevOps and the Cloud Center of Excellence to deploy data pipeline solutions in Takeda AWS environments, meeting security and performance requirements.

Skills and Qualifications
- Bachelor's degree from an accredited institution in Engineering, Computer Science, or a related field.
- 3+ years of experience in software, data, data warehouse, data lake, and analytics reporting development.
- Ability to build and fine-tune GenAI-powered solutions using LLMs, and to develop retrieval-augmented generation (RAG) pipelines integrating vector stores.
- Strong experience in data/Big Data, data integration, data modeling, modern database (Graph, SQL, NoSQL, etc.) query languages, and AWS cloud technologies including DMS, Lambda, Databricks, SQS, Step Functions, data streaming, visualization, etc.
- Solid experience in DBA, dimensional modeling, and SQL optimization; Aurora is preferred.
- Experience designing, building, and maintaining data integrations using SOAP/REST web services and APIs, as well as schema design and dimensional data modeling.
- Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams.

What Takeda ICC India Can Offer You:
Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training, and a diverse and inclusive network of colleagues who will support your career growth.

Benefits:
It is our priority to provide competitive compensation and a benefit package that bridges your personal life with your professional career. Amongst our benefits are:
- Competitive salary + annual performance bonus
- Flexible work environment, including hybrid working
- Comprehensive healthcare insurance plans for self, spouse, and children
- Group term life insurance and group accident insurance programs
- Employee assistance program
- Broad variety of learning platforms
- Diversity, equity, and inclusion programs
- Reimbursements: home internet & mobile phone
- Employee referral program
- Leaves: paternity leave (4 weeks), maternity leave (up to 26 weeks), bereavement leave (5 days)

About ICC in Takeda:
Takeda is leading a digital revolution. We're not just transforming our company; we're improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.

#Li-Hybrid
Locations: IND - Bengaluru
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time
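A toy sketch of the retrieval step in a RAG pipeline of the kind mentioned above. A real system would use an embedding model and a vector store; here the document vectors are random stand-ins so the example stays self-contained.

```python
import numpy as np

# Hypothetical corpus; in practice these would be chunked documents
# embedded by a model and persisted in a vector store.
docs = ["refund policy", "shipping times", "warranty terms"]
doc_vecs = np.random.default_rng(0).normal(size=(len(docs), 8))

def retrieve(query_vec: np.ndarray, k: int = 2) -> list[str]:
    # Cosine similarity between the query vector and every document vector;
    # the top-k documents would be passed to the LLM as grounding context.
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    top = np.argsort(sims)[::-1][:k]
    return [docs[i] for i in top]

print(retrieve(np.random.default_rng(1).normal(size=8)))
```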

Posted 4 days ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Chennai

On-site

Source: Glassdoor

CDM Smith is seeking a Data Engineer to join our Digital Engineering Solutions team. This individual will be part of the Data Technology group within the Digital Engineering Solutions team, helping to drive strategic Architecture, Engineering and Construction (AEC) initiatives using cutting-edge data technologies and analytics to deliver actionable business insights and robust solutions for AEC professionals and client outcomes. The Data Technology group leads the firm in AEC-focused business intelligence and data services by providing architectural guidance, technological vision, and solution development. The group specifically utilizes advanced analytics, data science, and AI/ML to give our business and our products a competitive advantage, which includes understanding and managing the data, how it interconnects, and architecting and engineering data for self-serve BI and BA opportunities. This position is for a person who has demonstrated excellence in data engineering, is experienced with data technology and processes, and enjoys framing a problem, shaping and creating solutions, and helping to lead and champion implementation. As a member of the Digital Engineering Solutions team, the Data Technology group also engages in research and development, provides guidance and oversight to the AEC practices at CDM Smith, and pursues new product research, testing, and the incubation of data technology-related ideas that arise from around the company.

Key Responsibilities:
- Assists in the design, development, and maintenance of scalable data pipelines and workflows to extract, transform, and load (ETL/ELT) data from various sources into target systems.
- Contributes to automating workflows to ensure efficiency, scalability, and error reduction in data integration processes.
- Tests data quality and integrity by implementing processes to validate the completeness, accuracy, and consistency of data.
- Monitors and troubleshoots data pipeline performance and reliability.
- Documents data engineering processes and workflows.
- Collaborates with data scientists, analytics engineers, and stakeholders to understand business requirements and deliver high-quality data solutions.
- Stays abreast of the latest developments and advancements, including new and emerging technologies, best practices, and tools, and how they could impact CDM Smith.
- Assists with the development of documentation, standards, best practices, and workflows for data technology hardware and software in use across the business.
- Performs other duties as required.

Skills and Abilities:
- Good foundation in the Software Development Life Cycle (SDLC) and Agile development methodologies.
- Good foundation in cloud ETL/ELT tools and deployment, including Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Good knowledge of data modeling and designing scalable ETL/ELT processes.
- Familiarity with CI/CD pipelines and DevOps practices for data solutions.
- Knowledge of monitoring tools and techniques for ensuring pipeline observability and reliability.
- Excellent problem-solving and critical-thinking skills to identify and address technical challenges effectively.
- Strong critical-thinking skills to generate innovative solutions and improve business processes.
- Ability to communicate complex technical concepts effectively to both technical and non-technical audiences.
- Detail-oriented, with the ability to assist with executing highly complex or specialized projects.

Minimum Qualifications
- Bachelor's degree.
- 0-2 years of related experience.
- Equivalent additional directly related experience will be considered in lieu of a degree.

Amount of Travel Required: 0%

Background Check and Drug Testing Information
CDM Smith Inc. and its divisions and subsidiaries (hereafter collectively referred to as "CDM Smith") reserves the right to require background checks including criminal, employment, education, licensure, etc., as well as credit and motor vehicle checks when applicable for certain positions. In addition, CDM Smith may conduct drug testing for designated positions. Background checks are conducted after an offer of employment has been made in the United States. The timing of when background checks will be conducted on candidates for positions outside the United States will vary based on country statutory law, but in no case will the background check precede an interview. CDM Smith will conduct interviews of qualified individuals prior to requesting a criminal background check, and no job application submitted prior to such interview shall inquire into an applicant's criminal history. If this position is subject to a background check for any convictions related to its responsibilities and requirements, employment will be contingent upon successful completion of a background investigation including criminal history. Criminal history will not automatically disqualify a candidate. In addition, during employment individuals may be required by CDM Smith or a CDM Smith client to successfully complete additional background checks, including motor vehicle record checks, as well as drug testing.

Agency Disclaimer
All vendors must have a signed CDM Smith Placement Agreement from the CDM Smith Recruitment Center Manager to receive payment for your placement. Verbal or written commitments from any other member of the CDM Smith staff will not be considered binding terms. All unsolicited resumes sent to CDM Smith and any resume submitted to any employee outside of the CDM Smith Recruiting Center Team (RCT) will be considered property of CDM Smith. CDM Smith will not be held liable to pay a placement fee.

Business Unit: COR
Group: COR
Assignment Category: Fulltime-Regular
Employment Type: Regular
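The data-quality responsibility above (validating completeness, accuracy, and consistency) can be pictured as a small validation gate in a pipeline. A minimal pandas sketch, with hypothetical column rules:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of rule violations; an empty list means the batch passes."""
    errors = []
    if df["id"].isna().any():
        errors.append("completeness: null ids")       # every record must have an id
    if df["id"].duplicated().any():
        errors.append("consistency: duplicate ids")   # ids must be unique
    if (df["amount"] < 0).any():
        errors.append("accuracy: negative amounts")   # amounts must be non-negative
    return errors

frame = pd.DataFrame({"id": [1, 2, 2], "amount": [10.0, -5.0, 3.0]})
print(validate(frame))  # flags the duplicate id and the negative amount
```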

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As a Data Engineer, you are required to:
- Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline.
- Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
- Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying.
- Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
- Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies; evaluate and implement new tools to improve data engineering processes.

Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level: at least 3-5 years of hands-on experience in data engineering and ETL.

Desired knowledge & experience (a testing sketch follows this list):
- Spark: Spark 3.x, RDD/DataFrames/SQL, batch/Structured Streaming; knowledge of Spark internals (Catalyst/Tungsten/Photon)
- Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity Catalog, Autoloader
- IDE and tooling: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
- Testing: pytest, Great Expectations
- CI/CD: YAML Azure Pipelines, continuous delivery, acceptance testing
- Big data design: Lakehouse/medallion architecture, Parquet/Delta, partitioning, distribution, data skew, compaction
- Languages: Python / functional programming (FP)
- SQL: T-SQL / Spark SQL / HiveQL
- Storage: data lake and big data storage design

Additionally, it is helpful to know the basics of:
- Data pipelines: ADF / Synapse Pipelines / Oozie / Airflow
- Languages: Scala, Java
- NoSQL: Cosmos, Mongo, Cassandra
- Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
- SQL Server: T-SQL, stored procedures
- Hadoop: HDInsight / MapReduce / HDFS / YARN / Oozie / Hive / HBase / Ambari / Ranger / Atlas / Kafka
- Data catalog: Azure Purview, Apache Atlas, Informatica

Required soft skills & other capabilities:
- Great attention to detail and good analytical abilities.
- Good planning and organizational skills.
- Collaborative approach to sharing ideas and finding solutions.
- Ability to work independently as well as in a global team environment.
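For the pytest item in the testing list above, here is a minimal sketch of a unit test for a Spark transformation; the dedupe_orders transform and its columns are hypothetical.

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local single-threaded session keeps the test fast and self-contained.
    return SparkSession.builder.master("local[1]").getOrCreate()

def dedupe_orders(df):
    # The (hypothetical) transformation under test.
    return df.dropDuplicates(["order_id"])

def test_dedupe_orders(spark):
    df = spark.createDataFrame(
        [(1, "a"), (1, "a"), (2, "b")], ["order_id", "sku"]
    )
    assert dedupe_orders(df).count() == 2
```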

Posted 4 days ago

Apply

0 years

5 - 8 Lacs

Indore

On-site

Source: Glassdoor

AV-230749
Indore, Madhya Pradesh, India
Full-time, Permanent
Global Business Services
DHL INFORMATION SERVICES (INDIA) LLP

Your IT Future, Delivered
Senior Software Engineer (Azure BI)
Open to all PAN India candidates.

With a global team of 5,800 IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the biggest logistics company in the world. Our offices in Cyberjaya, Prague, and Chennai have earned #GreatPlaceToWork certification, reflecting our commitment to exceptional employee experiences.

Digitalization. Simply delivered.
At IT Services, we are passionate about Azure Databricks and PySpark. Our PnP BI Solutions team is continuously expanding. No matter your level of Azure BI software engineering proficiency, you can always grow within our diverse environment.

#DHL #DHLITServices #GreatPlace #pyspark #azuredatabricks #snowflakedatabase

Grow together
Timely delivery of DHL packages around the globe, in a way that keeps customer data secure, is at the core of what we do. You will provide project deliverables and day-to-day operations support, and help investigate and resolve incidents. Sometimes, requirements or issues might get tricky, and this is where your expertise in development, or cooperation on troubleshooting with other IT support teams and specialists, will come into play. For any requirements regarding BI use cases in an Azure environment, you are our superhero. The same applies when it comes to production and incidents that need to be fixed.

Ready to embark on the journey? Here's what we are looking for:
- Practical experience programming with SQL, PySpark (Python), Azure Databricks, and Azure Data Factory
- Experience in the administration and configuration of Databricks clusters
- Experience with Snowflake Database
- Knowledge of Data Vault data modeling (if not, high motivation to learn the modeling approach)
- Experience with streaming APIs like Kafka, CI/CD, XML/JSON, ADLS2
- A comprehensive understanding of public cloud platforms, with a preference for Microsoft Azure
- Proven ability to work in a multi-cultural environment

An array of benefits for you:
- Flexible work guidelines.
- Flexible compensation structure.
- Global work culture and opportunities across geographies.
- Insurance benefits: health insurance for family, parents, and in-laws; term insurance (life cover); accidental insurance.
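As an illustration of the streaming-API experience mentioned above, here is a minimal Spark Structured Streaming sketch that reads a Kafka topic; the broker address and topic name are placeholders, and it assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Subscribe to a (hypothetical) Kafka topic as a streaming source.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "parcel-events")
          .load())

# Kafka values arrive as bytes; cast to string before parsing downstream.
query = (events.selectExpr("CAST(value AS STRING) AS json")
         .writeStream
         .format("console")   # console sink for demonstration only
         .start())
query.awaitTermination()
```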

Posted 4 days ago

Apply

8.0 - 10.0 years

0 Lacs

Andhra Pradesh

On-site

Source: Glassdoor

Software Engineering Advisor - HIH - Evernorth

About Evernorth:
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary:
Data Engineer on the Data Integration team.

Job Description & Responsibilities:
- Work with business and technical leadership to understand requirements; design to the requirements and document the designs.
- Write product-grade, performant code for data extraction, transformation, and loading using Spark and PySpark.
- Perform data modeling as needed for the requirements.
- Write performant queries using Teradata SQL, Hive SQL, and Spark SQL against Teradata and Hive.
- Implement DevOps pipelines to deploy code artifacts onto the designated platforms/servers, such as AWS, Azure, or GCP.
- Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
- Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos, and retrospectives.

Experience Required:
Overall 8-10 years of experience.

Experience Desired:
- Strong development experience in Spark, PySpark, shell scripting, and Teradata.
- Strong experience writing complex and effective SQL (Teradata SQL, Hive SQL, and Spark SQL) and stored procedures.
- Health care domain knowledge is a plus.

Education and Training Required:

Primary Skills:
- Excellent work experience with Databricks for data lake implementations.
- Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory).
- Experience in AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch), GCP, or Azure.
- Databricks: Delta Lake, notebooks, pipelines, cluster management, and Azure/AWS integration.

Additional Skills:
- Experience with Jira and Confluence.
- Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.

Location & Hours of Work: Hybrid, Hyderabad (11:30 AM to 8:30 PM)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

What You'll Do:
We're searching for an experienced full-stack developer to join our product teams, someone who is passionate about developing products that are simple, intuitive, and beautiful. We build big-data systems utilizing cutting-edge technologies and solutions that allow our developers to learn and develop while shipping amazing products. As part of our team, you'll be working on our Big Data Insights Platform products and elevating user experiences. Our stack is based on Elixir, React, and GraphQL APIs. We are a growing company with small, focused engineering teams that are delivering innovative features in a fast-growing market.

What You'll Be Responsible For:
- Working on our product engineering teams
- Designing and enhancing core business-driving services and products
- Developing features in our databases, backend apps, and frontend UI
- Helping architect and design service-driven UI via RESTful and GraphQL APIs
- Working on ideas from different team members as well as your own
- Participating in our on-call rotation, fixing bugs rapidly, and investigating and resolving production problems
- Attending daily stand-up meetings and planning sessions, encouraging others, and collaborating at a rapid pace

What You'll Need:
- BS/MS in Computer Science or a related field, or equivalent on-the-job experience
- 5+ years of designing and programming in a work setting
- Proficiency in backend (Elixir) and frontend (JavaScript/TypeScript) development, with real-world experience applying system and code design patterns
- Strength in React or a similar framework
- Experience building RESTful or GraphQL APIs
- Good knowledge of SQL
- Experience with Amazon Web Services (EC2, S3, RDS, Lambdas, EKS, etc.)
- Comfort with CI/CD and automation tools: Docker, Kubernetes, Terraform, or similar
- Good DevOps skills (automate everything, infrastructure as code)
- Comfort in an agile development environment
- A self-learner, hacker, and technology advocate who can work on anything
- The ability to thrive in a fast-growing environment
- A proven track record of successful project delivery
- Excellent written and spoken English communication

Nice-to-haves:
- Experience with enterprise-grade SaaS applications
- Experience leading a project or team
- Familiarity with Elasticsearch, Snowflake/Databricks, ClickHouse, or similar big data technology
- Love of startup culture, where everyone's contributions are felt and loved

Posted 4 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Responsibilities:
- Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions.
- Create proofs of concept (POCs) / minimum viable products (MVPs), then guide them through to production deployment and operationalization.
- Influence machine learning strategy for digital programs and projects.
- Make solution recommendations that appropriately balance speed to market and analytical soundness.
- Explore design options to assess efficiency and impact; develop approaches to improve robustness and rigor.
- Develop analytical/modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow).
- Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations.
- Design, adapt, and visualize solutions based on evolving requirements, and communicate them through presentations, scenarios, and stories.
- Create algorithms to extract information from large, multiparametric data sets.
- Deploy algorithms to production to identify actionable insights from large databases.
- Compare results from various methodologies and recommend optimal techniques.
- Develop and embed automated processes for predictive model validation, deployment, and implementation.
- Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science.
- Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment.
- Lead discussions at peer review and use interpersonal skills to positively influence decision making.
- Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices.
- Facilitate cross-geography sharing of new ideas, learnings, and best practices.

Requirements
- Bachelor of Science or Bachelor of Engineering at a minimum.
- 4+ years of work experience as a Data Scientist.
- A combination of business focus, strong analytical and problem-solving skills, and programming knowledge sufficient to quickly cycle hypotheses through the discovery phase of a project.
- Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala).
- Good hands-on skills in both feature engineering and hyperparameter optimization (see the sketch after this list).
- Experience producing high-quality code, tests, and documentation.
- Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks.
- Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and/or deep learning methodologies.
- Proficiency in statistical concepts and ML algorithms.
- Good knowledge of Agile principles and process.
- Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team.
- Ability to share ideas in a compelling manner and to clearly summarize and communicate data analysis assumptions and results.
- Self-motivated and a proactive problem solver who can work independently and in teams.
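As a small illustration of the hyperparameter-optimization skill listed above, here is a scikit-learn grid search over a synthetic dataset; the model choice and parameter grid are arbitrary, not a prescribed setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic data keeps the example self-contained.
X, y = make_classification(n_samples=200, random_state=0)

# Exhaustive search over a tiny (illustrative) hyperparameter grid,
# scored by cross-validated ROC AUC.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, None]},
    cv=3,
    scoring="roc_auc",
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```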

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Title: Senior Software Engineer
Department: IDP

About Us
HG Insights is the global leader in technology intelligence, delivering actionable AI-driven insights through advanced data science and scalable big data solutions. Our Big Data Insights Platform processes billions of unstructured documents and powers a vast data lake, enabling enterprises to make strategic, data-driven decisions. Join our team to solve complex data challenges at scale and shape the future of B2B intelligence.

What You'll Do:
- Design, build, and optimize large-scale distributed data pipelines for processing billions of unstructured documents using Databricks, Apache Spark, and cloud-native big data tools.
- Architect and scale enterprise-grade big data systems, including data lakes, ETL/ELT workflows, and syndication platforms for customer-facing Insights-as-a-Service (InaaS) products.
- Collaborate with product teams to develop features across databases, backend services, and frontend UIs that expose actionable intelligence from complex datasets.
- Implement cutting-edge solutions for data ingestion, transformation, and analytics using Hadoop/Spark ecosystems, Elasticsearch, and cloud services (AWS EC2, S3, EMR).
- Drive system reliability through automation, CI/CD pipelines (Docker, Kubernetes, Terraform), and infrastructure-as-code practices.

What You'll Be Responsible For
- Leading the development of our Big Data Insights Platform, ensuring scalability, performance, and cost-efficiency across distributed systems.
- Mentoring engineers, conducting code reviews, and establishing best practices for Spark optimization, data modeling, and cluster resource management.
- Building and troubleshooting complex data pipelines, including performance tuning of Spark jobs, query optimization, and data quality enforcement.
- Collaborating in agile workflows (daily stand-ups, sprint planning) to deliver features rapidly while maintaining system stability.
- Ensuring security and compliance across data workflows, including access controls, encryption, and governance policies.

What You'll Need
- BS/MS/Ph.D. in Computer Science or a related field, with 5+ years of experience building production-grade big data systems.
- Expertise in Scala/Java for Spark development, including optimization of batch/streaming jobs and debugging distributed workflows.
- A proven track record with Databricks, Hadoop/Spark ecosystems, and SQL/NoSQL databases (MySQL, Elasticsearch); cloud platforms (AWS EC2, S3, EMR) and infrastructure-as-code tools (Terraform, Kubernetes); and RESTful APIs, microservices architectures, and CI/CD automation.
- Leadership experience as a technical lead, including mentoring engineers and driving architectural decisions.
- A strong understanding of agile practices, distributed computing principles, and data lake architectures.
- Airflow orchestration (DAGs, operators, sensors) and integration with Spark/Databricks (see the sketch after this posting).
- 7+ years of designing, modeling, and building big data pipelines in an enterprise work setting.

Nice-to-Haves
- Experience with machine learning pipelines (Spark MLlib, Databricks ML) for predictive analytics.
- Knowledge of data governance frameworks and compliance standards (GDPR, CCPA).
- Contributions to open-source big data projects or published technical blogs/papers.
- DevOps proficiency in monitoring tools (Prometheus, Grafana) and serverless architectures.
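For the Airflow orchestration requirement above, a minimal DAG sketch in the Airflow 2.x style; the task bodies, DAG id, and schedule are placeholders rather than a real pipeline.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw documents")   # placeholder for an ingestion step

def transform():
    print("run Spark job")        # placeholder for a Spark/Databricks step

with DAG(
    dag_id="insights_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # ingest runs before transform
```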

Posted 4 days ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

About The Role Grade Level (for internal use): 12 The Team You will be an expert contributor and part of the Rating Organization’s Data Services Product Engineering Team. This team, who has a broad and expert knowledge on Ratings organization’s critical data domains, technology stacks and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings next gen analytics platform. Responsibilities Responsibilities: Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform. Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving. Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals. Manage and improve existing software solutions, ensuring high performance and scalability. Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes. Produce comprehensive technical design documents and conduct technical walkthroughs. Experience & Qualifications Bachelor’s degree in computer science, Information Systems, Engineering, equivalent or more is required Proficient with software development lifecycle (SDLC) methodologies like Agile, Test-driven development Total 12+ years of experience with 8+ years designing enterprise products, modern data stacks and analytics platforms 6+ years of hands-on experience contributing to application architecture & designs, proven software/enterprise integration design patterns and full-stack knowledge including modern distributed front end and back-end technology stacks 5+ years full stack development experience in modern web development technologies, Java/J2EE, UI frameworks like Angular, React, SQL, Oracle, NoSQL Databases like MongoDB Exp. with Delta Lake systems like Databricks using AWS cloud technologies and PySpark is a plus Experience designing transactional/data warehouse/data lake and data integrations with Big data eco system leveraging AWS cloud technologies Thorough understanding of distributed computing Passionate, smart, and articulate developer Quality first mindset with a strong background and experience with developing products for a global audience at scale Excellent analytical thinking, interpersonal, oral and written communication skills with strong ability to influence both IT and business partners Superior knowledge of system architecture, object-oriented design, and design patterns. Good work ethic, self-starter, and results-oriented Excellent communication skills are essential, with strong verbal and writing proficiencies Additional Preferred Qualifications Experience working AWS Experience with SAFe Agile Framework Bachelor's/PG degree in Computer Science, Information Systems or equivalent. 
Hands-on experience contributing to application architecture & designs, proven software/enterprise integration design principles. Ability to prioritize and manage work to critical project timelines in a fast-paced environment. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Ability to train and mentor. About S&P Global Ratings At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction. By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/ratings What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family-Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Inclusive Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. S&P Global has a Securities Disclosure and Trading Policy (“the Policy”) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy’s requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 312491 Posted On: 2025-04-07 Location: Mumbai, Maharashtra, India

Posted 4 days ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About The Role Grade Level (for internal use): 11 The Team You will be an expert contributor and part of the Rating Organization’s Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization’s critical data domains, technology stacks, and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings’ next-gen analytics platform. Responsibilities: Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform. Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving. Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals. Manage and improve existing software solutions, ensuring high performance and scalability. Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes. Produce comprehensive technical design documents and conduct technical walkthroughs. Experience & Qualifications Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent is required. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. 10+ years of experience, with 4+ years designing/developing enterprise products, modern tech stacks, and data platforms. 4+ years of hands-on experience contributing to application architecture and designs, proven software/enterprise integration design patterns, and full-stack knowledge including modern distributed front-end and back-end technology stacks. 5+ years of full-stack development experience in modern web development technologies: Java/J2EE, UI frameworks like Angular and React, SQL, Oracle, and NoSQL databases like MongoDB. Experience designing transactional/data warehouse/data lake systems and data integrations with the Big Data ecosystem leveraging AWS cloud technologies. Thorough understanding of distributed computing. Passionate, smart, and articulate developer. Quality-first mindset with a strong background in developing products for a global audience at scale. Excellent analytical thinking, interpersonal, oral, and written communication skills, with a strong ability to influence both IT and business partners. Superior knowledge of system architecture, object-oriented design, and design patterns. Good work ethic, self-starter, and results-oriented. Excellent communication skills are essential, with strong verbal and writing proficiencies. Experience with Delta Lake systems like Databricks using AWS cloud technologies and PySpark is a plus. Additional Preferred Qualifications Experience working with AWS. Experience with the SAFe Agile Framework. Bachelor's/PG degree in Computer Science, Information Systems, or equivalent.
Hands-on experience contributing to application architecture & designs, proven software/enterprise integration design principles. Ability to prioritize and manage work to critical project timelines in a fast-paced environment. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Ability to train and mentor. About S&P Global Ratings At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction. By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/ratings What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family-Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. S&P Global has a Securities Disclosure and Trading Policy (“the Policy”) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy’s requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 312489 Posted On: 2025-05-14 Location: Mumbai, Maharashtra, India

Posted 4 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions. Create proofs of concept (POCs) / Minimum Viable Products (MVPs), then guide them through to production deployment and operationalization. Influence machine learning strategy for Digital programs and projects. Make solution recommendations that appropriately balance speed to market and analytical soundness. Explore design options to assess efficiency and impact; develop approaches to improve robustness and rigor. Develop analytical/modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow). Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations. Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories. Create algorithms to extract information from large, multiparametric data sets. Deploy algorithms to production to identify actionable insights from large databases. Compare results from various methodologies and recommend optimal techniques. Develop and embed automated processes for predictive model validation, deployment, and implementation. Work on multiple pillars of AI including cognitive engineering, conversational bots, and data science. Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment. Lead discussions at peer review and use interpersonal skills to positively influence decision making. Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices. Facilitate cross-geography sharing of new ideas, learnings, and best practices. Requirements Bachelor of Science or Bachelor of Engineering at a minimum. 4+ years of work experience as a Data Scientist. A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project. Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala). Good hands-on skills in both feature engineering and hyperparameter optimization. Experience producing high-quality code, tests, and documentation. Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks. Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization & forecasting techniques, and/or deep learning methodologies. Proficiency in statistical concepts and ML algorithms. Good knowledge of Agile principles and process. Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team. Ability to share ideas in a compelling manner and to clearly summarize and communicate data analysis assumptions and results. Self-motivated and a proactive problem solver who can work independently and in teams.
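To illustrate the feature-engineering and hyperparameter-optimization skills this role asks for, here is a minimal scikit-learn sketch; the synthetic dataset and the small parameter grid are placeholders for a real business problem.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data standing in for a real business dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

pipe = Pipeline([
    ("scale", StandardScaler()),  # simple feature engineering step
    ("model", GradientBoostingClassifier(random_state=42)),
])

# Deliberately small grid for illustration; real searches are broader.
grid = GridSearchCV(
    pipe,
    param_grid={
        "model__n_estimators": [100, 300],
        "model__learning_rate": [0.05, 0.1],
    },
    cv=5,
    scoring="roc_auc",
)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```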

Posted 4 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description Are you a skilled data professional with a passion for transforming raw data into actionable insights, and a demonstrated history of learning and implementing new technologies? The Finance Data & Insights Team is an agile product team responsible for the development, production, and transformation of financial data and reporting across Consumer & Community Banking (CCB). Our vision is to improve the lives of our people and increase value to the firm by leveraging the power of our data and the best tools to analyze data, generate insights, save time, improve processes & control, and lead the organization in developing skills of the future. Job Summary As an Analytical Solutions Sr Associate within the Consumer & Community Banking (CCB) Finance Data & Insights Team, you will join an agile product team responsible for the development, production, and transformation of financial data and reporting. Through your ability and passion to think beyond raw and disparate data, you will create data visualizations and intelligence solutions that will be utilized by the organization’s top leaders to reach key strategic imperatives. You will help identify and assess opportunities to eliminate manual processes and utilize automation tools such as ThoughtSpot, Alteryx & Tableau to bring automated solutions to life. You will extract, analyze, and summarize data for stakeholder requests, and play a role in transforming our analytics and data environment to a modernized cloud platform. Job Responsibilities Develop data visualization solutions utilizing ThoughtSpot/Tableau that provide intuitive insights to our key stakeholders. Develop and enhance Alteryx workflows, collecting data from disparate data sources and summarizing the data as defined in requirements gathering with the stakeholders.
Follow best practices to ensure that data is sourced from authoritative data sources. Conduct thorough control testing of each component of the intelligence solution, providing evidence that all data and visualizations deliver accurate insights and evidence in the control process. Seek to understand the stakeholder use cases, empowering you to anticipate stakeholders’ requirements, questions, and objections. Become a subject matter expert in these responsibilities and support team members in becoming more proficient themselves. Lead intelligence solution requirements gathering sessions with varying levels of leadership, and complete detailed project planning utilizing JIRA to record planned project execution steps. Required Qualifications, Skills And Capabilities Bachelor’s degree in MIS or Computer Science, Mathematics, Engineering, Statistics or other quantitative or financial subject areas. 8+ years’ experience working on data analytics projects related to the financial services domain. 5+ years’ experience developing advanced data visualization and presentations with Tableau. Experience with business intelligence analytics and data wrangling tools such as Alteryx, SAS, or Python. Experience with relational databases, optimizing SQL to pull and summarize large datasets, report creation and ad-hoc analyses. Experience in reporting development and testing, and ability to interpret unstructured data and draw objective inferences given known limitations of the data. Demonstrated ability to think beyond raw data and to understand the underlying business context and sense business opportunities hidden in data. Strong written and oral communication skills; ability to communicate effectively with all levels of management and partners from a variety of business functions. Preferred Qualifications, Skills And Capabilities Experience with Hive, Spark SQL, Impala or other big-data query tools. Experience with ThoughtSpot or similar tools empowering stakeholders to better understand their data. Highly motivated, self-directed, curious to learn new technologies. AWS, Databricks, Snowflake, or other Cloud Data Warehouse experience. About Us JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction. The CCB Data & Analytics team responsibly leverages data across Chase to build competitive advantages for the businesses while providing value and protection for customers. The team encompasses a variety of disciplines from data governance and strategy to reporting, data science and machine learning. We have a strong partnership with Technology, which provides cutting-edge data and analytics infrastructure. The team powers Chase with insights to create the best customer and business outcomes.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description Manager, Data Scientist The Opportunity Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview We are seeking a Senior Data Scientist to join our Enterprise IT Advanced Analytics team. This role offers an exciting opportunity to work on initiatives that support our Finance, Procurement, Human Resources, and related business partners. If you have a strong background in machine learning, data science, and optionally large language models, and are passionate about leveraging technology to drive business success, we would love to hear from you. What Will You Do In This Role Understand the needs of the business and translate those needs into technical/DS/ML language. Assess the applicability of, and develop, machine learning and data science solutions to given problems with a focus on AI/ML methods, including large language models. Leverage software engineering methods, including coding, testing, and documentation. Apply data engineering and data modeling skills as needed to deliver the AI/ML solutions. Use visual analytics techniques during data and business exploration and model building, and, if needed, as a part of the client-facing solution. Define, document, and execute small projects or sub-projects, managing risks and stakeholder involvement. Manage junior staff as a part of the delivery. Within a project delivery, manage stakeholder relationships, including problem resolution and feedback collection.
Collaborate cross-functionally to deliver more complex IT or business solutions. What Should You Have 3+ years’ experience as a Data Scientist. Required: Advanced knowledge of statistical modelling and machine learning, in particular regression, classification, and forecasting models. Proficiency in R and/or Python, and Git. Experience in software development and data engineering with a focus on ML solution build and deployment. Ability to apply data engineering and data modelling at a level supporting DS delivery. Ability to apply visual analytics techniques and tools at a level supporting DS delivery. Experience in project management, including risk management and stakeholder management. Working proficiency in the English language. Preferred: Knowledge of large language models (prompt engineering, RAG, agents...), not limited to GPT. Knowledge of mathematical optimization techniques. Advanced MLOps skills – DS/AI/ML solution build, deployment, and monitoring. Experience in one or multiple of the supported business functions and the types of data (structured and unstructured) existing across those functions. Knowledge of cloud services, in particular the AWS stack. Knowledge of data science platforms such as Dataiku and Databricks. Experience in a similar role in a global, research-oriented pharmaceutical company. Proven track record of implementing machine learning and data science solutions. Experience in managing stakeholder relationships. Demonstrated experience in software development and data engineering. Proven experience in project management. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Who We Are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYDIT2025 Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific.
Please, no phone calls or emails. Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Business Intelligence (BI), Database Design, Data Engineering, Data Modeling, Data Science, Data Visualization, Machine Learning, Software Development, Stakeholder Relationship Management, Waterfall Model. Job Posting End Date: 07/14/2025. A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R342325

Posted 4 days ago

Apply

7.0 years

0 Lacs

India

Remote


POSITION REPORTS TO: Senior Manager Application Development DEPARTMENT: Information Technology POSITION LOCATION: INDIA (REMOTE) COMPANY BUDGET: 18,00,000 - 23,00,000 Job Summary We are seeking a highly skilled Senior Data Warehouse Engineer to manage a single version of the truth of data, convert business entities into facts and dimensions, and integrate data from various sources. The role involves developing ETL processes, optimizing performance, and collaborating with cross-functional teams to support business analysis and decision-making. Key Responsibilities • Data Warehouse Development: Responsible for managing a single version of the truth and turning data into critical information and knowledge that can be used to make sound business decisions. • Dimensional Modeling: Convert business entities into facts and dimensions to provide a structured and efficient data model that supports accurate and insightful business analysis. • Data Integration: Collaborate with cross-functional teams to integrate data from various source systems such as Oracle NetSuite, Salesforce, Ragic, SQL Server, MySQL, and Excel files. • Data Transformation: Develop and maintain ETL processes using Microsoft SSIS, Python, or similar ETL tools to load data into the data warehouse. • Performance Optimization: Optimize queries, indexes, and database structures to improve efficiency. • Requirement Analysis: Work closely with key users and business stakeholders to define requirements. • Documentation: Maintain comprehensive technical documentation for data warehouse processes, data integration, and configurations. • Team Collaboration: Mentor and guide junior team members, fostering a collaborative environment. Required Skills & Qualifications • Must have: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. 7+ years of experience with Microsoft SQL Server. Expertise in building a data warehouse using SQL Server. Hands-on experience with dimensional modeling using facts and dimensions. Expertise in SSIS and Python for ETL development. Strong experience in Power BI for reporting and data visualization. Strong understanding of relational database design, indexing, and performance tuning. Ability to write complex SQL scripts, stored procedures, and views. Experience with Git and JIRA. Problem-solving mindset and analytical skills. Excellent communication and documentation abilities. • Nice to have: Experience with cloud-based SQL databases (e.g., Azure SQL, Azure Synapse). Experience with cloud-based ETL solutions (Azure Data Factory, Azure Databricks). Familiarity with CI/CD for database deployments and automation tools. Knowledge of big data technologies like Snowflake.
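As a toy illustration of the facts-and-dimensions approach this posting describes, the sketch below splits a flat order extract into a customer dimension with surrogate keys and a sales fact table. It uses pandas purely for readability; all column names and values are hypothetical, and a production load would run through SSIS or a SQL-based ETL job instead.

```python
import pandas as pd

# Hypothetical flat extract; in practice this would come from NetSuite,
# Salesforce, SQL Server, etc. via an SSIS or Python ETL job.
orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_name": ["Acme", "Globex", "Acme"],
    "amount": [250.0, 125.5, 90.0],
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-06"]),
})

# Dimension: one row per distinct customer, keyed by a surrogate key.
dim_customer = orders[["customer_name"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus foreign keys into the dimension tables.
fact_sales = (
    orders.merge(dim_customer, on="customer_name")
          [["order_id", "customer_key", "order_date", "amount"]]
)
print(fact_sales)
```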

Posted 4 days ago

Apply

4.0 years

0 Lacs

India

Remote


CSQ326R35 The Machine Learning (ML) Practice team is a highly specialized customer-facing ML team at Databricks that is seeing increasing demand for Large Language Model (LLM)-based solutions. We deliver professional services engagements to help our customers build, scale, and optimize ML pipelines, as well as put those pipelines into production. We work cross-functionally to shape long-term strategic priorities and initiatives alongside engineering, product, and developer relations, as well as support internal subject matter expert (SME) teams. We view our team as an ensemble: we look for individuals with strong, unique specializations to improve the overall strength of the team. This team is the right fit for you if you love working with customers and teammates, and fueling your curiosity for the latest trends in LLMs, MLOps, and ML more broadly. This role can be remote. The Impact You Will Have Develop LLM solutions on customer data, such as RAG architectures on enterprise knowledge repos, querying structured data with natural language, and content generation. Help customers solve tough problems across industries like Health and Life Sciences, Finance, Retail, Startups, and many others. Build, scale, and optimize customer data science workloads across industries and apply best-in-class MLOps to productionize these workloads. Advise data teams on data science architecture, tooling, and best practices. Provide thought leadership by presenting at conferences such as Data+AI Summit and mentoring the larger ML SME community in Databricks. Collaborate cross-functionally with the product and engineering teams to define priorities and influence the product roadmap. What We Look For Experience in building Generative AI applications, including RAG, agents, Text2SQL, fine-tuning, and deploying LLMs, using tools such as HuggingFace, Langchain, and OpenAI. 4-10 years of hands-on industry data science experience, leveraging typical machine learning and data science tools including pandas, MLflow, scikit-learn, and PyTorch. Experience in building production-grade ML or GenAI deployments on AWS, Azure, or GCP. Graduate degree in a quantitative discipline (Computer Science, Engineering, Statistics, Operations Research, etc.) or equivalent practical experience. Experience in communicating and teaching technical concepts to both non-technical and technical audiences. Passion for collaboration, life-long learning, and driving business value through ML. [Preferred] Experience working with Databricks and Apache Spark™ About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks. Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.
Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
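As a small illustration of the MLOps practices this posting mentions, here is a minimal MLflow tracking sketch that logs a parameter, a metric, and a versioned model artifact. The experiment name, model, and values are hypothetical, not taken from the posting.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Toy training run; experiment name and model choice are illustrative only.
X, y = load_diabetes(return_X_y=True)

mlflow.set_experiment("demo-experiment")
with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X, y)
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("train_r2", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # versioned model artifact
```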

Posted 4 days ago

Apply

3.0 years

0 Lacs

New Delhi, Delhi, India

On-site


Company Profile Our client is a global IT services company that helps businesses with digital transformation, with offices in India and the United States. It provides IT collaboration and uses technology, innovation, and enterprise to have a positive impact on the world of business. With expertise in the fields of Data, IoT, AI, Cloud Infrastructure, and SAP, it helps accelerate digital transformation through key practice areas - IT staffing on demand, innovation, and growth by focusing on cost and problem solving. Location & work – New Delhi (On-Site), WFO Employment Type - Full Time Profile – AI/ML Engineer Preferred experience – 3-5 Years The Role: We are seeking a highly skilled AI/ML Engineer with strong expertise in traditional statistical modeling using R and end-to-end ML pipeline configuration on Databricks. The ideal candidate will play a key role in designing, developing, and deploying advanced machine learning models, optimizing performance, and ensuring scalability across large datasets on the Databricks platform. Responsibilities: Design and implement traditional ML models using R (e.g., regression, classification, clustering, time-series). Develop and maintain scalable machine learning pipelines on Databricks. Configure and manage Databricks workspaces, clusters, and MLflow integrations for model versioning and deployment. Collaborate with data engineers, analysts, and domain experts to collect, clean, and prepare data. Optimize models for performance, interpretability, and business impact. Automate data workflows and model retraining pipelines using Databricks notebooks and job scheduling. Monitor model performance in production and implement enhancements as needed. Ensure model explainability, compliance, and reproducibility in production environments. Must-Have Qualifications Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Minimum 3+ years of experience in machine learning and data science roles. Strong proficiency in R for statistical modeling and traditional ML techniques. Hands-on experience with Databricks: cluster configuration, workspace management, notebook workflows, and performance tuning. Experience with MLflow, Delta Lake, and PySpark (optional but preferred). Strong understanding of MLOps practices, model lifecycle management, and CI/CD for ML. Familiarity with cloud platforms such as Azure Databricks, AWS, or GCP. Preferred Qualification: Certification in Databricks or relevant ML/AI platforms is a plus. Excellent problem-solving and communication skills. Application Method Apply online on this portal or via email at careers@speedmart.co.in
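To illustrate the kind of pipeline configuration this role involves, here is a hedged sketch that creates a nightly retraining job through the Databricks Jobs API (version 2.1). The workspace host, token, notebook path, and cluster settings are all placeholders; a real setup would typically pin these in Terraform or a jobs definition file rather than an ad-hoc script.

```python
import requests

# All identifiers below are placeholders, not real workspace details.
HOST = "https://adb-1234567890.12.azuredatabricks.net"
TOKEN = "dapi-..."  # personal access token; keep it in a secret store

job_spec = {
    "name": "retrain-model-nightly",
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",  # 02:00 daily
        "timezone_id": "UTC",
    },
    "tasks": [{
        "task_key": "retrain",
        "notebook_task": {"notebook_path": "/Repos/ml/retrain"},
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",  # assumed LTS runtime
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
    }],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # the API returns the new job_id
```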

Posted 4 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Colan Infotech!! Role - Data Scientist Experience - 3+ Years Job Location - Chennai/Bangalore Notice Period - Immediate to 30 Days Primary Skills Needed: AI/ML, TensorFlow, Django, PyTorch, NLP, Image Processing, Gen AI, LLM Secondary Skills Needed: Keras, OpenCV, Azure or AWS Job Description: Practical knowledge and working experience in Statistics and Operations Research methods. Practical knowledge and working experience in tools and frameworks like Flask, PySpark, PyTorch, TensorFlow, Keras, Databricks, OpenCV, Pillow/PIL, Streamlit, D3.js, Dash/Plotly, and Neo4j. Good understanding of how to apply predictive and machine learning techniques like regression models, XGBoost, random forest, GBM, neural nets, SVM, etc. Proficient with NLP techniques like RNN, LSTM, and attention-based models, and able to effectively handle readily available Stanford, IBM, Azure, and OpenAI NLP models. Good understanding of SQL from the perspective of how to write efficient queries for pulling data from a database. Hands-on experience with any version control tool (GitHub, Bitbucket). Experience deploying ML models into production (MLOps) on any one of the cloud platforms like Azure and AWS. Comprehend business issues and propose valuable business solutions. Design statistical or AI/deep learning models to address business issues. Design statistical/ML/DL models and deploy them to production. Formulate what information is accessible from where and how to augment it. Develop innovative graphs for data comprehension using D3.js, Dash/Plotly, and Neo4j.
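As a toy illustration of the LSTM-based NLP modeling listed above, here is a minimal Keras sketch of a binary text classifier; the random integer sequences stand in for a real tokenized corpus, and every size below is an arbitrary placeholder.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Random integer sequences stand in for a tokenized text corpus.
vocab_size, seq_len = 10_000, 100
X = np.random.randint(1, vocab_size, size=(500, seq_len))
y = np.random.randint(0, 2, size=(500,))

model = keras.Sequential([
    layers.Embedding(vocab_size, 64),       # learn token embeddings
    layers.LSTM(32),                        # recurrent sequence encoder
    layers.Dense(1, activation="sigmoid"),  # binary classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```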

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About the Role: (Please read all the requirements properly before applying for this job) We are looking for a highly skilled Senior Platform Engineer (Azure) who brings hands-on experience with distributed systems and a deep understanding of Azure Data Engineering and DevOps practices. The ideal candidate will be responsible for architecting and managing scalable, secure, and highly available cloud solutions on Microsoft Azure. Notice Period: Accepting Immediate to 15 days only. Experience: 5-9 years; up to 10 years is acceptable for candidates who can join immediately. Candidates who are ready to relocate to the Noida location can also apply. Communication should be excellent, as you would be dealing with Middle East stakeholders. Candidates require a valid passport with more than 6 months of validity, as project needs may require onsite travel to Abu Dhabi. Key Responsibilities: Design and manage distributed system architectures using Azure services such as Event Hub, Data Factory, ADLS Gen2, Cosmos DB, Synapse, Databricks, APIM, Function App, Logic App, and App Services. Implement infrastructure as code (IaC) using ARM templates and Terraform for consistent, automated environment provisioning. Deploy and manage containerized applications using Docker and orchestrate them with Azure Kubernetes Service (AKS). Monitor and troubleshoot infrastructure and applications using Azure Monitor, Log Analytics, and Application Insights. Design and implement disaster recovery strategies, backups, and failover mechanisms to ensure business continuity. Automate provisioning, scaling, and infrastructure management to maintain system reliability and performance. Manage Azure environments across development, test, pre-production, and production stages. Monitor and define job flows, set up proactive alerts, and ensure smooth ETL operations in Azure Data Factory and Databricks. Conduct root cause analysis and implement fixes for job failures. Work with Jenkins and Azure DevOps for automating CI/CD pipelines and deployment workflows. Write automation scripts using Python and Shell scripting for various operational tasks. Monitor VM performance metrics (CPU, memory, OS, network, storage) and recommend optimizations. Collaborate with development teams to improve application reliability and performance. Work in Agile environments with a proactive and results-driven mindset. Required Skills: Expertise in Azure services for data engineering and application deployment. Strong knowledge of Terraform, ARM templates, and CI/CD tools. Hands-on experience with Databricks, Data Factory, and Event Hub. Familiarity with Python, Shell scripting, Jenkins, and Azure DevOps. Deep understanding of container orchestration using AKS. Experience in monitoring, alerting, and log analysis for cloud-native applications. Ability to troubleshoot and support distributed systems in production environments. Excellent problem-solving and communication skills.
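For a flavor of the ADF monitoring and root-cause work described above, here is a hedged sketch that reads a pipeline run's status through the Azure Resource Manager REST API (api-version 2018-06-01). All identifiers are placeholders, and a production script would obtain the token via the azure-identity library and drive alerting from the result rather than printing it.

```python
import requests

# Placeholders throughout; never hard-code real tokens.
SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
RUN_ID = "<pipeline-run-id>"
TOKEN = "<aad-bearer-token>"

url = (
    "https://management.azure.com"
    f"/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
    f"/pipelineruns/{RUN_ID}?api-version=2018-06-01"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json()["status"])  # e.g. InProgress, Succeeded, Failed
```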

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bhopal, Madhya Pradesh, India

On-site


Role : Data Engineers (5-10 Years of Experience) Experience : 5-10 years Location : Gurgaon, Pune, Bangalore, Chennai, Jaipur and Bhopal Skills : Python/Scala, SQL, ETL, Big Data (Spark, Kafka, Hive), Cloud (AWS/Azure/GCP), Data Warehousing Responsibilities : Build and maintain robust, scalable data pipelines and systems. Design and implement ETL processes to support analytics and reporting. Optimize data workflows for performance and scalability. Collaborate with data scientists, analysts, and engineering teams. Ensure data quality, governance, and security compliance. Required Skills Strong experience with Python/Scala, SQL, and ETL tools. Hands-on with Big Data technologies (Hadoop, Spark, Kafka, Hive, etc.). Proficiency in Cloud Platforms (AWS/GCP/Azure). Experience with data warehousing (e.g., Redshift, Snowflake, BigQuery). Familiarity with CI/CD pipelines and version control systems. Nice To Have Experience with Airflow, Databricks, or dbt. Knowledge of real-time data processing.
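As a minimal illustration of the Spark-plus-Kafka skills in this listing, the sketch below reads a Kafka topic with Structured Streaming and lands parsed records as Parquet. The broker address, topic, and paths are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Requires the spark-sql-kafka connector on the cluster; all names below
# are placeholders.
spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

parsed = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/tmp/events")              # assumed sink location
    .option("checkpointLocation", "/tmp/ckpt")  # required for fault tolerance
    .start()
)
# query.awaitTermination()  # uncomment to block until the stream stops
```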

Posted 5 days ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  • Junior Developer
  • Senior Developer
  • Tech Lead
  • Architect

Related Skills

In addition to Databricks expertise, the following skills are often expected or helpful:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks. (medium; see the sketch after this list)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
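To make the lazy-evaluation question concrete, here is a minimal PySpark sketch: transformations only record a logical plan, and nothing executes until an action such as count() forces it.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lazy-demo").getOrCreate()
df = spark.range(1_000_000)

# Transformations are lazy: these lines only record a logical plan.
doubled = df.withColumn("double", F.col("id") * 2)
filtered = doubled.filter(F.col("double") % 4 == 0)

# An action triggers the whole plan as a single optimized job.
print(filtered.count())
```

This is also why a chain of transformations in a notebook appears to run instantly until the first display, write, or count forces execution.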

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!
