
2633 Airflow Jobs - Page 40

JobPe aggregates listings so they are easy to find, but you apply directly on the original job portal.

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


About Us
Innovation. Sustainability. Productivity. This is how we are Breaking New Ground in our mission to sustainably advance the noble work of farmers and builders everywhere. With a growing global population and increased demands on resources, our products are instrumental to feeding and sheltering the world. From developing products that run on alternative power to productivity-enhancing precision tech, we are delivering solutions that benefit people – and they are possible thanks to people like you. If the opportunity to build your skills as part of a collaborative, global team excites you, you're in the right place. Grow a Career. Build a Future! Be part of a company at the forefront of agriculture and construction that passionately innovates to drive customer efficiency and success. And we know innovation can't happen without collaboration. So, everything we do at CNH Industrial is about reaching new heights as one team, always delivering for the good of our customers.

Job Purpose
The CFD Analysis Engineer will provide fluid/thermal analysis for agricultural machines (tractors, combines, harvesters, sprayers) and construction machines (excavators, wheel loaders, loader backhoes). As a member of the CFD Team, they will support the design of components and subsystems such as:
- A/C & HVAC systems
- Engine cooling packages
- Hydraulics
- Transmissions
- Engine air intakes & exhausts

Key Responsibilities
- Develops virtual simulation models using CFD (Computational Fluid Dynamics) for the evaluation of engineering designs of agricultural and construction machinery.
- Makes recommendations to peers and the direct manager based on sound engineering principles, practices, and judgment pertaining to thermal/fluid problems, as a contribution to the overall engineering and manufacturing objectives.
- Utilizes Star-CCM+, EnSight, ANSYS Fluent, GT-Power, Actran, Creo, Teamcenter, and other relevant software to develop and simulate designs for cooling packages, exhaust systems, engine air intakes, HVAC systems, transmissions, and other components being developed and/or improved.
- Performs engineering calculations for emissions, chemical reactions, sprays, thermal, airflow, hydraulic, aero-acoustic, particle-flow, and refrigeration problems to determine the size and performance of assemblies and parts and to solve design problems.
- Incorporates engineering standards, methodologies, and global product development processes into daily work tasks.

Experience Required
- MS degree in Engineering or a comparable program, with 8 years of professional industry experience.
- Good knowledge of the Computational Fluid Dynamics field.
- Some knowledge in the areas of underhood engine cooling, two-phase flows, and climatization.
- Knowledge of exhaust aftertreatment analysis; familiarity with SCR (Selective Catalytic Reduction), DPF (Diesel Particulate Filters), or DOC (Diesel Oxidation Catalysts).
- Some basic knowledge and understanding of aero-acoustics and fan noise.

Preferred Qualifications
- Master's degree in Mechanical Engineering from a reputed institute.
- Doctoral degree (Ph.D.) is a plus.

What We Offer
We offer dynamic career opportunities across an international landscape. As an equal opportunity employer, we are committed to delivering value for all our employees and fostering a culture of respect.

Benefits
At CNH, we understand that the best solutions come from the diverse experiences and skills of our people. Here, you will be empowered to grow your career, to follow your passion, and to help build a better future. To support our employees, we offer regional comprehensive benefits, including:
- Flexible work arrangements
- Savings & retirement benefits
- Tuition reimbursement
- Parental leave
- Adoption assistance
- Fertility & family-building support
- Employee Assistance Programs
- Charitable contribution matching and Volunteer Time Off

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are seeking a highly skilled and motivated MLOps Site Reliability Engineer (SRE) to join our team. In this role, you will be responsible for ensuring the reliability, scalability, and performance of our machine learning infrastructure. You will work closely with data scientists, machine learning engineers, and software developers to build and maintain robust and efficient systems that support our machine learning workflows. This position offers an exciting opportunity to work on cutting-edge technologies and make a significant impact on our organization's success.

Responsibilities
- Design, implement, and maintain scalable and reliable machine learning infrastructure.
- Collaborate with data scientists and machine learning engineers to deploy and manage machine learning models in production.
- Develop and maintain CI/CD pipelines for machine learning workflows.
- Monitor and optimize the performance of machine learning systems and infrastructure.
- Implement and manage automated testing and validation processes for machine learning models.
- Ensure the security and compliance of machine learning systems and data.
- Troubleshoot and resolve issues related to machine learning infrastructure and workflows.
- Document processes, procedures, and best practices for machine learning operations.
- Stay up to date with the latest developments in MLOps and related technologies.

Qualifications

Required:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Site Reliability Engineer (SRE) or in a similar role.
- Strong knowledge of machine learning concepts and workflows.
- Proficiency in programming languages such as Python, Java, or Go.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with containerization technologies like Docker and Kubernetes.
- Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and collaboration skills.

Preferred:
- Master's degree in Computer Science, Engineering, or a related field.
- Experience with machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
- Knowledge of data engineering and data pipeline tools such as Apache Spark, Apache Kafka, or Airflow.
- Experience with monitoring and logging tools such as Prometheus, Grafana, or the ELK stack.
- Familiarity with infrastructure-as-code (IaC) tools like Terraform or Ansible.
- Experience with automated testing frameworks for machine learning models.
- Knowledge of security best practices for machine learning systems and data.
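The "automated testing and validation processes for machine learning models" this role describes often reduce to a promotion gate in CI: a candidate model must clear a metric threshold before deployment. Below is a minimal, hypothetical sketch in Python with scikit-learn; the dataset, the 0.90 threshold, and the pass/fail behaviour are illustrative assumptions, not this employer's actual process.

```python
# Hypothetical pre-deployment validation gate: train a candidate model and
# only "promote" it if it beats a minimum accuracy threshold on held-out data.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

MIN_ACCURACY = 0.90  # assumed promotion threshold

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(random_state=42).fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))

if accuracy >= MIN_ACCURACY:
    print(f"PASS: accuracy {accuracy:.3f} >= {MIN_ACCURACY}, promoting model")
else:
    # In a CI pipeline, a non-zero exit code blocks the deployment step.
    raise SystemExit(f"FAIL: accuracy {accuracy:.3f} < {MIN_ACCURACY}, blocking deployment")
```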

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


By clicking the “Apply” button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description:

The Future Begins Here
At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India’s epicenter of innovation, has been selected to be home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement.

At Takeda’s ICC We Unite in Diversity
Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for their backgrounds and the abilities they bring to our company. We are continuously improving our collaborators’ journey in Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity
As a Data Engineer you will build and maintain data systems and construct datasets that are easy to analyse and that support Business Intelligence requirements as well as downstream systems.

Responsibilities
- Develops and maintains scalable data pipelines and builds out new integrations using AWS native technologies to support continuing increases in data source, volume, and complexity.
- Automates and manages job scheduling for ETL processes across various applications and platforms within an enterprise.
- Designs, implements, and maintains robust CI/CD pipelines to automate software deployments and data pipelines.
- Implements processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes that depend on it.
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
- Works closely with enterprise teams including Enterprise Architecture, Security, and Enterprise Data Backbone Engineering to design and develop data integration patterns/solutions along with proper data models supporting different data and analytics use cases.

Skills and Qualifications
- Bachelor's degree from an accredited institution in Engineering, Computer Science, or a related field.
- 5+ years of total experience with data integration tools, workflow automation, and DevOps.
- 3+ years of experience with Tidal Automation and Tidal Repository for scheduling jobs and managing batch processes across different environments.
- Proficiency in Amazon Managed Workflows for Apache Airflow to automate workflows and data pipelines.
- Strong experience with CI/CD tools like GitHub, GitLab, or JFrog.
- Strong programming and scripting skills in Python.
- Focus on AWS native services and optimizing the data landscape through the adoption of these services.
- Experience with Agile development methodologies.
- Curiosity and adaptability to learn new technologies and improve existing processes.
- Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams.

BENEFITS:
It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career. Amongst our benefits are:
- Competitive salary + performance annual bonus
- Flexible work environment, including hybrid working
- Comprehensive healthcare insurance plans for self, spouse, and children
- Group Term Life Insurance and Group Accident Insurance programs
- Health & wellness programs, including annual health screening and weekly health sessions for employees
- Employee Assistance Program
- 3 days of leave every year for voluntary service, in addition to humanitarian leave
- Broad variety of learning platforms
- Diversity, equity, and inclusion programs
- Reimbursements – home internet & mobile phone
- Employee referral program
- Leaves – paternity leave (4 weeks), maternity leave (up to 26 weeks), bereavement leave (5 calendar days)

ABOUT ICC IN TAKEDA:
Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.

Locations: IND - Bengaluru
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time
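Since the role asks for proficiency in Amazon Managed Workflows for Apache Airflow, a minimal DAG of the kind such pipelines are built from is sketched below. The DAG id, schedule, and task callables are illustrative assumptions; the syntax targets Airflow 2.4+ (older versions use schedule_interval instead of schedule).

```python
# Minimal two-task ETL DAG sketch; task names and callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")  # stand-in for a real extraction step

def load():
    print("load data into warehouse")  # stand-in for a real load step

with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```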

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description
- Fair knowledge of programming languages like Python/SQL and advanced SQL.
- Good business knowledge of areas like the sales cycle, inventory, and supply chain applications.
- Fair knowledge of Snowflake, GitHub, DBT, Sigma/Domo, Airflow, HVR, and SAP.
- Ability to identify, analyze, and resolve problems logically.
- Ability to troubleshoot and identify root cause.

Responsibilities
- Maintain production systems reliability through correct utilization of IT standards and governance processes.
- Collaborate with business/functional teams; develop detailed plans and accurate estimates for completion of build, system testing, and implementation of projects.
- Review program code and suggest corrections in case of any errors.
- Conduct performance tuning to improve system performance over multiple business processes.
- Monitor and set up jobs in test/sandbox systems for testing.
- Make sure the correct test data is arranged for checking reports.
- Work with the business on the testing and sign-off of reports.

Qualifications
- Snowflake: SQL, DBT, Airflow, GitHub
- SAP and HVR: SAP integration with HVR
- Sigma & Domo for reporting
- Proficiency in Python
- Understanding of Airflow architecture and concepts
- Experience with SQL and database design

About Us
Bristlecone is the leading provider of AI-powered application transformation services for the connected supply chain. We empower our customers with speed, visibility, automation, and resiliency – to thrive on change. Our transformative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement and Digitalization are positioned around key industry pillars and delivered through a comprehensive portfolio of services spanning digital strategy, design and build, and implementation across a range of technology platforms. Bristlecone is ranked among the top ten leaders in supply chain services by Gartner. We are headquartered in San Jose, California, with locations across North America, Europe and Asia, and over 2,500 consultants. Bristlecone is part of the $19.4 billion Mahindra Group.

Equal Opportunity Employer
Bristlecone is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Information Security Responsibilities
- Understand and adhere to Information Security policies, guidelines and procedures; practice them for protection of organizational data and information systems.
- Take part in information security training and act accordingly while handling information.
- Report all suspected security and policy breaches to the InfoSec team or appropriate authority (CISO).
- Understand and adhere to the additional information security responsibilities that are part of the assigned job role.
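This stack pairs Python with Snowflake; querying Snowflake from Python with the snowflake-connector-python package looks roughly like the sketch below. The account, credentials, warehouse, and table names are all placeholders.

```python
# Illustrative Snowflake query from Python; connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="COMPUTE_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Hypothetical reporting query against an assumed 'orders' table
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```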

Posted 1 week ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging the knowledge of market drivers and competition to effectively anticipate trends and opportunities. Besides, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, and take the lead towards raising the performance bar, building capability, and bringing out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager – Roles And Responsibilities
- Understand the client's requirements and provide effective and efficient solutions in Snowflake.
- Understand data transformation and translation requirements and which tools to leverage to get the job done.
- Do Proofs of Concept (POCs) in areas that need R&D on cloud technologies.
- Understand data pipelines and modern ways of automating them using cloud-based tools; test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical And Functional Skills
- Master's/Bachelor's degree in Engineering, Analytics, or a related field.
- Total 7+ years of experience, with ~4+ years of relevant hands-on experience with Snowflake utilities – SnowSQL, Snowpipe, Time Travel, Replication, zero-copy cloning.
- Strong working knowledge of Python.
- In-depth understanding of data warehouses and ETL tools.
- Experience with Snowflake APIs is mandatory.
- Strong knowledge of scheduling and monitoring using Airflow DAGs.
- Strong experience in writing SQL queries, joins, stored procedures, and user-defined functions.
- Sound knowledge of data architecture and design.
- Hands-on experience developing Python scripts for data manipulation.
- Experience developing scripts using Unix, Python, etc.
- Snowflake SnowPro Core certification.

About Us
At eClerx, we serve some of the largest global companies – 50 of the Fortune 500 clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About The Team
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
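The "scheduling and monitoring using Airflow DAGs" requirement can be made concrete with a small sketch against Airflow's stable REST API (Airflow 2.x). The host, credentials, and dag_id below are placeholders, and the call assumes the basic-auth API backend is enabled on the Airflow deployment.

```python
# Hedged sketch: list the five most recent runs of a DAG and their states
# via Airflow's stable REST API; endpoint per Airflow 2.x documentation.
import requests

AIRFLOW = "http://localhost:8080/api/v1"  # placeholder host

resp = requests.get(
    f"{AIRFLOW}/dags/example_etl/dagRuns",
    params={"limit": 5, "order_by": "-execution_date"},
    auth=("admin", "admin"),  # placeholder credentials
    timeout=10,
)
resp.raise_for_status()
for run in resp.json()["dag_runs"]:
    print(run["dag_run_id"], run["state"])  # e.g. 'success', 'failed', 'running'
```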

Posted 1 week ago

Apply

5.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Dear Associate,

Greetings from Tata Consultancy Services!

Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for AWS Data Science with DevOps and Databricks at Tata Consultancy Services on 14th June 2025.

Hiring For: AWS Data Science with DevOps and Databricks
Mandatory Skills: SQL, PySpark/Python, AWS, Databricks, Snowflake, S3, EMR, EC2, Airflow, Lambda
Experience: 5-15 years
Mode of interview: in-person walk-in drive
Date of interview: 14 June 2025
Venue: Zone 3 Auditorium, Tata Consultancy Services, Sahyadri Park, Rajiv Gandhi Infotech Park, Hinjewadi Phase 3, Pune – 411057

If you are interested in this exciting opportunity, please share your updated resume at jeena.james1@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
Current Organization:
Total Experience:
Relevant Experience in DevOps with Databricks:
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Attended interview with TCS in past (details):
Please share your iBegin portal EP id if already registered:
Willing to attend walk-in on 14th June: (Yes/No)

Note: Only eligible candidates with relevant experience will be contacted further.

Thanks & Regards,
Jeena James
Human Resource - Talent Acquisition Group, Tata Consultancy Services
Website: http://www.tcs.com
Email: jeena.james1@tcs.com

Posted 1 week ago

Apply

7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Company Description
Make an impact at a global and dynamic investment organization. When you join CPP Investments, you are joining one of the world’s most admired and respected institutional investors. With more than $600 billion in assets under management, CPP Investments is a professional investment management organization that globally invests the funds of the Canada Pension Plan (CPP) to help ensure it is financially sustainable for generations of working and retired Canadians. CPP Investments invests across regions and asset classes to build a globally diversified portfolio. It holds assets in public equity, private equity, real estate, infrastructure, and fixed income, and the CPP Fund is projected to reach $3 trillion in assets by 2050. The organization is headquartered in Toronto with offices in Hong Kong, London, Mumbai, New York City, San Francisco, São Paulo, and Sydney.

CPP Investments successfully attracts, selects, and retains talented individuals from top-tier institutions worldwide. Join our team for access to:
- Stimulating work in a fast-paced and intellectually challenging environment
- Accelerated exposure and responsibility
- Global career development opportunities
- Diverse and inspiring colleagues and approachable leaders
- A hybrid-flexible work environment with an emphasis on in-person collaboration
- A culture rooted in principles of integrity, partnership, and high performance
- An organization with an important social purpose that positively impacts lives

If you have a passion for performance, value a collegial and collaborative culture, and approach work with the highest integrity, invest your career here.

Job Description
As a Senior Engineer you will design, build, test and support value-add technology solutions and products. You will also work in close partnership with business and technology teams and will acquire sufficient understanding of business and technology to apply critical thought to business and technology requests.

Role Specific Accountabilities
- Design, build, test and support medium-complexity technology solutions or enhancements with independence, to enable business capabilities, in an Agile environment that includes business and T&D partner teams.
- Apply advanced understanding of engineering best practices and drive continuous improvement across the team through coaching and influencing.
- Demonstrate a sufficient understanding of the technical landscape and the business capabilities it supports to apply critical thought to business requests.
- Foster and enable agility and innovation through experimentation and early feedback, ensuring responsiveness to evolving business needs.
- Succinctly frame problems, engage appropriately with colleagues to think deeply about broad problems, and gain buy-in on well-reasoned recommendations.
- Facilitate root-cause analysis of operational incidents impacting the products you support.
- Demonstrate ability to independently research and master new complex technologies.
- Adhere to the Agile SDLC and execute related duties as required.
- Foster collaboration and mentorship, promoting a culture of feedback, learning and professional growth.
- Maintain strong relationships with business partners, peer IT teams and vendor partners.

Qualifications
- Undergraduate degree or college diploma in a related field (e.g. Engineering, Computer Science).
- 7+ years of relevant experience.
- Strong experience working with various programming languages (Python, C++, Java, etc.).
- Expertise in AWS services (EMR, Glue, Airflow, Lake Formation, Iceberg, Lambda, API Gateway, SNS, SQS, Step Functions, S3, DataZone, IAM & Security, VPC & Networking).
- Experience in API development and integration.
- Experience with front-end technologies (e.g., React, Angular) is a plus.
- Familiarity with AI/ML concepts and tools is a plus.
- Excellent problem-solving skills and attention to detail.
- Experience with software development concepts, including version control, testing methodologies, and agile development practices.
- Ability to write clean, readable, and well-documented code, while paying attention to details and adhering to coding standards.

Additional Information
Visit our LinkedIn Career Page or follow us on LinkedIn. At CPP Investments, we are committed to diversity and equitable access to employment opportunities based on ability. We thank all applicants for their interest but will only contact candidates selected to advance in the hiring process.

Our Commitment To Inclusion And Diversity
In addition to being dedicated to building a workforce that reflects diverse talent, we are committed to fostering an inclusive and accessible experience. If you require an accommodation for any part of the recruitment process (including alternate formats of materials, accessible meeting rooms, etc.), please let us know and we will work with you to meet your needs.

Disclaimer
CPP Investments does not accept resumes from employment placement agencies, head-hunters or recruitment suppliers that are not in a formal contractual arrangement with us. Our recruitment supplier arrangements are restricted to specific hiring needs and do not include this or other web-site job postings. Any resume or other information received from a supplier not approved by CPP Investments to provide resumes to this posting or web-site will be considered unsolicited and will not be considered. CPP Investments will not pay any referral, placement or other fee for the supply of such unsolicited resumes or information.
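Several of the AWS services listed above (here S3 and SQS) are typically driven from Python with boto3. A minimal, illustrative sketch follows; the bucket and queue names are placeholders and credentials are assumed to come from the environment.

```python
# Illustrative boto3 snippet: inspect a data-lake bucket, then notify a
# downstream consumer. Bucket/queue identifiers are invented placeholders.
import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

# List a few objects under an assumed data-lake bucket
for obj in s3.list_objects_v2(Bucket="example-data-lake", MaxKeys=5).get("Contents", []):
    print(obj["Key"], obj["Size"])

# Signal that a new partition of data has landed
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/example-queue",
    MessageBody='{"event": "new_data", "prefix": "raw/2025-06-01/"}',
)
```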

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site


The candidate should have a strong understanding of HVAC systems and design principles and be able to apply their knowledge to create innovative and efficient solutions for different upcoming projects and proposals.
- Comprehensive knowledge of different types of HVAC systems and their applications.
- Hands-on experience with HVAC design software tools like HAP/Elite, duct sizer, pipe sizer, etc.
- Ability to perform load calculations to determine system sizing and equipment selection.
- Perform reviews of vendor deliverables, raise TQs and TBEs, and participate in review meetings with vendors and other stakeholders.
- Ability to interpret P&IDs, D&IDs, and airflow diagrams.
- Familiarity with codes and standards pertaining to offshore HVAC systems, ducting, etc., such as ISO 15138, ISO 7547, ASHRAE, NFPA, and SMACNA.

Key Responsibilities
- Design and Analysis: Develop HVAC system designs for offshore HVAC (High Voltage Alternating Current) substations and HVDC (High Voltage Direct Current) offshore converter platforms in offshore wind farms across Europe, the USA, Korea, Taiwan, etc. Conduct feasibility studies, load calculations, and energy efficiency assessments of platform HVA/C systems and of the temporary HVA/C systems required during fabrication at the yard, transportation, and offshore hook-up and commissioning (HUC).
- Equipment Selection: Select appropriate HVAC equipment, components, and controls based on project requirements, budget constraints, and energy efficiency goals.
- Preparation of BOQ/MR/TBE: Prepare the BOQ for HVA/C systems, ducting and piping quantities, etc.; prepare the MR for the Supply Chain Management team to initiate procurement; and evaluate vendor offers against the project specifications.
- Drawing Preparation/Review: Create/review detailed HVAC system drawings, including schematics, layouts, and specifications from engineering consultants, and ensure compliance with project specifications and applicable codes and standards.
- Code Compliance: Ensure that all designs comply with relevant codes, industry standards, and local regulations of the respective countries.
- Project Coordination: Collaborate with other department engineers, contractors, and the fabrication yard during construction to ensure smooth project execution.
- Energy Efficiency: Implement strategies to optimize energy efficiency and reduce operating costs for HVAC systems.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Years of Experience: Candidates with 12+ years of hands-on experience
Position: Senior Manager

Required Skills
Successful candidates will have demonstrated the following skills and characteristics:

Must Have
- Deep expertise in AI/ML solution design, including supervised and unsupervised learning, deep learning, NLP, and optimization.
- Strong hands-on experience with ML/DL frameworks like TensorFlow, PyTorch, scikit-learn, H2O, and XGBoost.
- Solid programming skills in Python, PySpark, and SQL, with a strong foundation in software engineering principles.
- Proven track record of building end-to-end AI pipelines, including data ingestion, model training, testing, and production deployment.
- Experience with MLOps tools such as MLflow, Airflow, DVC, and Kubeflow for model tracking, versioning, and monitoring.
- Understanding of big data technologies like Apache Spark, Hive, and Delta Lake for scalable model development; skilled in using Apache Spark, including PySpark and Databricks, for big data processing.
- Expertise in AI solution deployment across cloud platforms like GCP, AWS, and Azure using services like Vertex AI, SageMaker, and Azure ML.
- Experience in REST API development, NoSQL database design, and RDBMS design and optimization.
- Familiarity with API-based AI integration and containerization technologies like Docker and Kubernetes.
- Proficiency in data storytelling and visualization tools such as Tableau, Power BI, Looker, and Streamlit.
- Programming skills in Python and either Scala or R, with experience using Flask and FastAPI.
- Experience with software engineering practices, including use of GitHub, CI/CD, code testing, and analysis.
- Strong understanding of foundational data science concepts, including statistics, linear algebra, and machine learning principles.
- Knowledgeable in integrating DevOps, MLOps, and DataOps practices to enhance operational efficiency and model deployment.
- Experience with cloud infrastructure services like Azure and GCP.
- Familiarity with observability and monitoring tools like Prometheus and the ELK stack, adhering to SRE principles and techniques.
- Cloud or Data Engineering certifications or specialization certifications (e.g. Google Professional Machine Learning Engineer; Microsoft Certified: Azure AI Engineer Associate – Exam AI-102; AWS Certified Machine Learning – Specialty (MLS-C01); Databricks Certified Machine Learning).

Nice To Have
- Experience implementing generative AI, LLMs, or advanced NLP use cases
- Exposure to real-time AI systems, edge deployment, or federated learning
- Strong executive presence and experience communicating with senior leadership or CXO-level clients

Roles And Responsibilities
- Lead and oversee complex AI/ML programs, ensuring alignment with business strategy and delivering measurable outcomes.
- Serve as a strategic advisor to clients on AI adoption, architecture decisions, and responsible AI practices.
- Design and review scalable AI architectures, ensuring performance, security, and compliance.
- Supervise the development of machine learning pipelines, enabling model training, retraining, monitoring, and automation.
- Present technical solutions and business value to executive stakeholders through impactful storytelling and data visualization.
- Build, mentor, and lead high-performing teams of data scientists, ML engineers, and analysts.
- Drive innovation and capability development in areas such as generative AI, optimization, and real-time analytics.
- Contribute to business development efforts, including proposal creation, thought leadership, and client engagements.
- Partner effectively with cross-functional teams to develop, operationalize, integrate, and scale new algorithmic products.
- Develop code, CI/CD, and MLOps pipelines, including automated tests, and deploy models to cloud compute endpoints.
- Manage cloud resources and build accelerators to enable other engineers, with experience working across two hyperscale clouds.
- Demonstrate effective communication skills, coaching and leading junior engineers, with a successful track record of building production-grade AI products for large organizations.

Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA from a reputed institute
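Of the MLOps tools the posting names, MLflow's tracking API is compact enough to sketch. The toy run below logs a parameter, a metric, and a model to the local file-based tracking store; the dataset and model choice are illustrative only, not part of the posting.

```python
# Minimal MLflow experiment-tracking sketch (local tracking store).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="ridge_baseline"):
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    mlflow.log_param("alpha", 1.0)                                  # hyperparameter
    mlflow.log_metric("r2", r2_score(y_te, model.predict(X_te)))    # eval metric
    mlflow.sklearn.log_model(model, "model")                        # versioned artifact
```

Run it once and `mlflow ui` will show the tracked run; pointing the tracking URI at a shared server is what turns this into the model versioning and monitoring workflow the role describes.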

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Years of Experience: Candidates with 4+ years of hands-on experience
Position: Senior Associate
Industry: Supply Chain/Forecasting/Financial Analytics

Required Skills
Successful candidates will have demonstrated the following skills and characteristics:

Must Have
- Strong supply chain domain knowledge (inventory planning, demand forecasting, logistics).
- Well-versed, hands-on experience with optimization methods like linear programming, mixed-integer programming, and scheduling optimization; an understanding of third-party optimization solvers like Gurobi will be an added advantage.
- Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised).
- Experience using at least one major cloud platform (AWS, Azure, GCP), such as:
  - AWS: SageMaker, Redshift, Glue, Lambda, QuickSight
  - Azure: ML Studio, Synapse Analytics, Data Factory, Power BI
  - GCP: BigQuery, Vertex AI, Dataflow, Cloud Composer, Looker
- Experience developing, deploying, and monitoring ML models on cloud infrastructure.
- Expertise in Python, SQL, data orchestration, and cloud-native data tools.
- Hands-on experience with cloud-native data lakes and lakehouses (e.g., Delta Lake, BigLake).
- Familiarity with infrastructure-as-code (Terraform/CDK) for cloud provisioning.
- Knowledge of visualization tools (Power BI, Tableau, Looker) integrated with cloud backends.
- Strong command of statistical modeling, testing, and inference.
- Advanced capabilities in data wrangling, transformation, and feature engineering.
- Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Airflow).
- Strong communication and stakeholder engagement skills at the executive level.

Roles And Responsibilities
- Assist analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions.
- Develop and execute project and analysis plans under the guidance of the Project Manager.
- Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved.
- Drive and conduct analysis using advanced analytics tools and coach junior team members.
- Implement the necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments.
- Validate analysis outcomes and recommendations with all stakeholders, including the client team.
- Build storylines and make presentations to the client team and/or the PwC project leadership team.
- Contribute to knowledge- and firm-building activities.

Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA from a reputed institute
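Among the forecasting techniques listed, Holt-Winters is the easiest to demonstrate with statsmodels. Below is a toy sketch on a synthetic series with a 12-period season; every number in it is invented for illustration.

```python
# Toy Holt-Winters (triple exponential smoothing) forecast with statsmodels.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
t = np.arange(48)  # four years of synthetic monthly data
series = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 48)

# Additive trend and additive 12-period seasonality
model = ExponentialSmoothing(
    series, trend="add", seasonal="add", seasonal_periods=12
).fit()
print(model.forecast(6))  # point forecasts for the next six periods
```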

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Teamwork makes the stream work.

Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team
Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10+ million queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate and access the data in the lake for both streaming data and batch. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in Open Source, and we are planning to increase our engagement over time.

About the Role
Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert with Big Data technologies? Have you looked under the hood of these systems? Are you interested in Open Source? If you answered "Yes" to these questions, this role is for you!

What you will be doing
- You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones. Making sure the systems run efficiently and with minimal cost is a top priority.
- You will be making changes to the underlying systems and, if an opportunity arises, you can contribute your work back into open source.
- You will also be responsible for supporting internal customers and on-call services for the systems we host. Making sure we provide a stable environment and a great user experience is another top priority for the team.

We are excited if you have
- 7+ years of production experience building big data platforms based upon Spark, Trino or equivalent.
- Strong programming expertise in Java, Scala, Kotlin or another JVM language.
- A robust grasp of distributed systems concepts, algorithms, and data structures.
- Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
- Experience working with at least 3 of the technologies/tools mentioned here: Big Data/Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
- Extensive hands-on experience with public cloud, AWS or GCP.
- BS/MS degree in CS or equivalent.

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
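Spark appears throughout this posting; for readers unfamiliar with it, a minimal PySpark aggregation looks like the sketch below. The data and column names are invented, and a platform at this scale would of course read from the lake rather than an inline list.

```python
# Minimal PySpark aggregation sketch; the tiny inline dataset is illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

df = spark.createDataFrame(
    [("device_a", 120), ("device_b", 45), ("device_a", 30)],
    ["device", "watch_minutes"],
)

# Total minutes per device - the kind of query a big-data team tunes at scale
df.groupBy("device").agg(F.sum("watch_minutes").alias("total_minutes")).show()

spark.stop()
```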

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary
We are seeking a skilled and proactive Data Engineer with a strong background in ETL development and a focus on integrating data quality frameworks. In this role, you will be responsible for designing, developing, and maintaining ETL pipelines while ensuring data quality is embedded throughout the process. You will play a crucial role in building robust and reliable data pipelines that deliver high-quality data to our data warehouse and other systems.

Responsibilities
- Design, develop, and implement ETL processes to extract data from various source systems, transform it according to business requirements, and load it into target systems (e.g., data warehouse, data lake).
- Implement data validation and error handling within ETL pipelines.
- Build and maintain scalable, reliable, and efficient data pipelines.
- Design and implement data quality checks, validations, and transformations within ETL processes.
- Automate data quality monitoring, alerting, and reporting within ETL pipelines.
- Develop and implement data quality rules and standards within ETL processes.
- Integrate data from diverse sources, including databases, APIs, flat files, and cloud-based systems.
- Utilize ETL tools and technologies (e.g., SnapLogic, Informatica PowerCenter, Talend, AWS Glue, Apache Airflow, Azure Data Factory, etc.).
- Write SQL queries to extract, transform, load, and validate data.
- Use scripting languages (e.g., Python) to automate ETL processes, data quality checks, and data transformations.
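The data-quality checks this posting describes are often just explicit assertions inside an ETL step. A minimal pandas sketch follows; the rules and column names are made up for illustration.

```python
# Sketch of row-level data-quality checks embedded in an ETL step.
import pandas as pd

# Stand-in for a freshly extracted batch
df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [25.0, -5.0, 30.0, None],
})

issues = []
if df["order_id"].duplicated().any():
    issues.append("duplicate order_id values")
if (df["amount"] < 0).any():
    issues.append("negative amounts")
if df["amount"].isna().any():
    issues.append("missing amounts")

if issues:
    # A real pipeline would quarantine the failing rows and fire an alert
    # instead of aborting the whole load.
    raise ValueError("data quality failed: " + "; ".join(issues))
```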

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Locations: Mumbai | Gurgaon

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

We Are BCG X
We’re a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world’s most complex problems. Leveraging BCG’s global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.

What You'll Do
- Build AI/ML technology stacks from concept to production, including data pipelines, model training, and deployment.
- Develop and optimize Generative AI workflows, including prompt engineering, fine-tuning (LoRA, QLoRA), retrieval-augmented generation (RAG), and LLM-based applications.
- Work with Large Language Models (LLMs) such as Llama, Mistral, and GPT, ensuring efficient adaptation for various use cases.
- Design and implement AI-driven automation using agentic AI systems and orchestration frameworks like AutoGen, LangGraph, and CrewAI.
- Leverage cloud AI infrastructure (AWS, Azure, GCP) for scalable deployment and performance tuning.
- Collaborate with cross-functional teams to deliver AI-driven solutions.

What You'll Bring
- Bachelor’s, Master’s, or PhD in Computer Science, Data Science, Mathematics, Statistics, Engineering or a related field.
- 2+ years of experience in AI/ML, with expertise in Generative AI and LLMs.
- Strong proficiency in Python and experience with AI/ML frameworks like PyTorch and TensorFlow.
- Knowledge of advanced prompt engineering techniques (Chain of Thought, Few-Shot, Self-Consistency).
- Experience in AI workflow automation and model orchestration.
- Hands-on experience with API development using Flask or Django.
- Familiarity with data processing frameworks like Databricks and Airflow.
- Strong analytical skills and the ability to work in a collaborative environment.

#BCGXjob

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
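Of the Generative AI workflows named above, the retrieval step of RAG is simple to sketch without any external model. The example below uses TF-IDF as a stand-in for learned embeddings; in a real pipeline the retrieved passage would be placed into an LLM prompt, and the documents and query here are invented.

```python
# Toy retrieval step for a RAG pipeline: score documents against a query and
# pick the closest passage to hand to an LLM. TF-IDF stands in for embeddings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Airflow schedules and monitors data pipelines.",
    "LoRA fine-tunes large language models cheaply.",
    "Kubernetes orchestrates containerized workloads.",
]
query = "How do I fine-tune an LLM?"

vec = TfidfVectorizer().fit(docs + [query])
scores = cosine_similarity(vec.transform([query]), vec.transform(docs))[0]
best = scores.argmax()
print(f"retrieved context: {docs[best]!r} (score={scores[best]:.2f})")
```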

Posted 1 week ago

Apply

5.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Dear Associate,

Greetings from Tata Consultancy Services!

Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for Data Scientist at Tata Consultancy Services on 14th June 2025.

Hiring For: Data Scientist
Mandatory Skills: Data Science & CI/CD DevOps & Databricks, AWS, SQL, PySpark/Python, Databricks, Snowflake, S3, EMR, EC2, Airflow, Lambda
Walk-in Location: Pune
Experience: 5-15 years
Mode of interview: in-person walk-in drive
Date of interview: 14 June 2025
Venue: Zone 3 Auditorium, Tata Consultancy Services, Sahyadri Park, Rajiv Gandhi Infotech Park, Hinjewadi Phase 3, Pune – 411057

If you are interested in this exciting opportunity, please share your updated resume at archana.parmeshwar@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
Current Organization:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Attended interview with TCS in past (details):
Please share your iBegin portal EP id if already registered:
Willing to attend walk-in on 14th June: (Yes/No)

Note: Only eligible candidates with relevant experience will be contacted further.

Thanks & Regards,
Archana

Posted 1 week ago

Apply

4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Fynd is India’s largest omnichannel platform and a multi-platform tech company specializing in retail technology and products in AI, ML, big data, image editing, and the learning space. It provides a unified platform for businesses to seamlessly manage online and offline sales, store operations, inventory, and customer engagement. Serving over 2,300 brands, Fynd is at the forefront of retail technology, transforming customer experiences and business processes across various industries.

About You
As a TPM you will act as a bridge between business and engineering teams. You will work on complex business constraints that translate into product requirements and features. You bring technical knowledge to the team, taking the project from prototype to launch on tight timelines. A people person who can provide strong leadership and inspire teams to build world-class products.

What will you do at Fynd?
- Gather requirements from diverse teams and stakeholders
- Work with Platform Architects and Engineers to convert these requirements to an implementation
- Work closely with engineers to prioritize product features and requirements
- Own the execution of the sprint by collaborating with multiple engineering and product teams within the org
- Be responsible for the delivery of these sprints, both from a timeline and a quality point of view
- Manage technical and product risks and unblock engineering by helping mitigate them
- Provide accurate visibility in terms of feature readiness and issues with other engineering teams

Who are we looking for?
- You, if you can walk the talk and convince others to walk with you
- Someone who can distinguish between the important and the urgent, and make sure both are addressed
- A leader who has the vision to see more than 14 million outcomes and pick the one where Thanos is defeated and the Avengers thrive
- Juggling time, resources and priorities feels as natural as data charts and spreadsheets
- Someone who listens to everybody, distils information and makes stuff happen!

Some Specific Requirements
- Basic knowledge of technology, data orchestration tools and frameworks such as Apache Airflow, API integrations, micro-services architecture, CI/CD, etc.
- Strong communication skills
- Knowledge of data modeling and ETL (Extract, Transform, Load) processes
- Familiarity with data streaming and real-time data processing technologies
- Proficiency in data visualization tools (e.g., Tableau, Power BI) to create reports and dashboards
- Ability to automate repetitive tasks and workflows using scripting or automation tools
- A commitment to staying current with evolving data technologies and industry trends
- Ability to explain technical concepts/flows to a non-technical audience
- Clear written communication skills; you must be able to clearly articulate flows for engineers and SDETs to understand deeply
- Build strong relationships and collaborate with a diverse team containing engineering, product and business stakeholders
- Effective delegation: must know how to build ownership and execution within the team without micro-managing
- Proficiency with data platform technologies, including database management systems (e.g., MySQL, PostgreSQL, or MongoDB)
- Knowledge of server and storage hardware, virtualization, and cloud computing
- Strong attention to detail
- Growth mindset to learn skills while performing the role
- 4-7 years of experience in a Business Analyst/Project Manager role
- Some experience with bug tracking tools like JIRA, Confluence, Asana, Redmine or Azure DevOps
- Strong analytical and problem-solving skills, with the ability to make data-driven decisions
- Excellent communication and collaboration skills, including the ability to work effectively with cross-functional teams
- Strong knowledge of data technologies, databases, and data analytics tools
- Familiarity with cloud-based data solutions (e.g., AWS, Azure, GCP)
- Strong knowledge of ETL tools and techniques, including data extraction, transformation, and loading from various sources
- Experience with Change Data Capture (CDC) methodologies to capture real-time data changes for synchronization
- Deep understanding of machine learning concepts and their application to data-driven decision-making
- Proficiency in data integration tools, including Big DataOps platforms, to streamline data collection and management
- Familiarity with workflow management systems for process automation and orchestration
- Knowledge of artificial intelligence (AI) technologies and their integration into data platforms to enhance automation, prediction, and decision support
- Strong problem-solving skills to address complex technical and business challenges
- Ability to communicate and present complex technical concepts to non-technical stakeholders
- Leadership skills to guide cross-functional teams in product development

What do we offer?

Growth
Growth knows no bounds, as we foster an environment that encourages creativity, embraces challenges, and cultivates a culture of continuous expansion. We are looking at new product lines, international markets and brilliant people to grow even further. We teach, groom and nurture our people to become leaders. You get to grow with a company that is growing exponentially.
- Flex University: We help you upskill by organising in-house courses on important subjects
- Learning Wallet: You can also do an external course to upskill and grow; we reimburse it for you

Culture
- Community and team-building activities
- Weekly, quarterly and annual events/parties

Wellness
- Mediclaim policy for you + parents + spouse + kids
- Experienced therapist for better mental health, improved productivity & work-life balance

We work from the office 5 days a week to promote collaboration and teamwork. Join us to make an impact in an engaging, in-person environment!

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:
- Total experience 5+ years.
- Hands-on working experience in data engineering.
- Strong working experience in SQL, Python or Scala.
- Deep understanding of cloud design patterns and their implementation.
- Experience working with Snowflake as a data warehouse solution.
- Experience with Power BI data integration.
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms).
- Strong understanding of data modelling, warehousing (e.g., star/snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.).
- Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar.
- Strong problem-solving skills and a passion for continuous improvement.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES:
- Writing and reviewing great-quality code.
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating them for developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements.

Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
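The star/snowflake-schema modelling these requirements mention can be illustrated in a few lines with Python's standard-library sqlite3 module. The tables and figures below are invented; the point is only the fact-to-dimension join shape.

```python
# Tiny star-schema illustration: one fact table joined to one dimension.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER,
    qty INTEGER,
    FOREIGN KEY (product_id) REFERENCES dim_product (product_id)
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (1, 1, 10), (2, 1, 5), (3, 2, 7);
""")

# Aggregate facts by dimension attribute - the canonical warehouse query shape
for name, total in cur.execute("""
    SELECT p.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name
"""):
    print(name, total)
```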

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site


JOB DESCRIPTION You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains. Job responsibilities Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices Evaluates new and current technologies using existing data architecture standards and frameworks Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others Drives data architecture decisions that impact data product & platform design, application functionality, and technical operations and processes Serves as a function-wide subject matter expert in one or more areas of focus Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle Influences peers and project decision-makers to consider the use and application of leading-edge technologies Advises junior architects and technologists Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years of applied experience Advanced knowledge of architecture, applications, and technical processes with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design, etc.) Practical cloud-based data architecture and deployment experience, preferably AWS Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing Ability to tackle design and functionality problems independently with little to no oversight Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture Preferred qualifications, capabilities, and skills Financial services experience, card and banking a big plus Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc. Practical experience in data mesh and/or data lake Practical experience in machine learning/AI with Python development a big plus Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis
ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. ABOUT THE TEAM Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Hyderābād

On-site


Job Title: Sr. Software / Sr. Data Engineer – Databricks & PySpark Experience: 3–5 years Work Location: Hyderabad Job Summary: We are seeking a skilled Data Engineer with 3–5 years of hands-on experience in Databricks and PySpark. Key Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks and PySpark. Collaborate with data scientists, analysts, and other stakeholders to gather requirements and deliver data solutions. Implement ETL processes to ingest and transform structured and unstructured data from various sources. Perform data wrangling, cleansing, and validation to ensure quality and accuracy. Monitor, troubleshoot, and optimize performance of data pipelines and workflows. Ensure adherence to data governance and security policies. Required Skills: 3–4 years of hands-on experience in PySpark for large-scale data processing. Strong experience with the Databricks platform (including notebooks, clusters, and Delta Lake). Proficiency in SQL and Python. Experience with cloud platforms (preferably Azure or AWS). Good understanding of data modelling, partitioning, and performance tuning. Familiarity with version control tools (e.g., Git) and CI/CD practices. Good communication skills. Preferred Skills (Good to Have): Knowledge of Delta Lake, Apache Airflow, or Data Factory. Exposure to Agile/Scrum methodologies. Experience working with streaming data is a plus. Interested candidates can share their updated profiles to radhika.laxmi@komhar.com Job Type: Full-time Pay: ₹503,603.23 - ₹800,000.00 per year Benefits: Health insurance, Provident Fund Schedule: Day shift, UK shift Work Location: In person Expected Start Date: 06/06/2025

Posted 1 week ago

Apply

5.0 years

50 Lacs

Dehradun, Uttarakhand, India

Remote


Experience: 5.00+ years Salary: INR 5000000.00 / year (based on experience) Expected Notice Period: 15 Days Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full Time Permanent position (Payroll and Compliance to be managed by: Precanto) (*Note: This is a requirement for one of Uplers' clients - a fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams.) What do you need for this opportunity? Must have skills required: async workflows, MLOps, Ray Tune, Data Engineering, MLFlow, Supervised Learning, Time-Series Forecasting, Docker, machine_learning, NLP, Python, SQL A fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams is looking for: We are a fast-moving startup building AI-driven solutions for the financial planning workflow. We’re looking for a versatile Machine Learning Engineer to join our team and take ownership of building, deploying, and scaling intelligent systems that power our core product. Job Description: Full-time Team: Data & ML Engineering We’re looking for someone with 5+ years of experience as a Machine Learning or Data Engineer (startup experience is a plus). What You Will Do: Build and optimize machine learning models — from regression to time-series forecasting Work with data pipelines and orchestrate training/inference jobs using Ray, Airflow, and Docker Train, tune, and evaluate models using tools like Ray Tune, MLflow, and scikit-learn Design and deploy LLM-powered features and workflows Collaborate closely with product managers to turn ideas into experiments and production-ready solutions Partner with Software and DevOps engineers to build robust ML pipelines and integrate them with the broader platform Basic Skills Proven ability to work creatively and analytically in a problem-solving environment Excellent communication (written and oral) and interpersonal skills Strong understanding of supervised learning and time-series modeling Experience deploying ML models and building automated training/inference pipelines Ability to work cross-functionally in a collaborative and fast-paced environment Comfortable wearing many hats and owning projects end-to-end Write clean, tested, and scalable Python and SQL code Leverage async workflows and cloud-native infrastructure (S3, Docker, etc.) for high-throughput data processing. Advanced Skills Familiarity with MLOps best practices Prior experience with LLM-based features or production-level NLP Experience with LLMs, vector stores, or prompt engineering Contributions to open-source ML or data tools TECH STACK Languages: Python, SQL Frameworks & Tools: scikit-learn, Prophet, pyts, MLflow, Ray, Ray Tune, Jupyter Infra: Docker, Airflow, S3, asyncio, Pydantic How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well).
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

5.0 years

4 - 8 Lacs

Hyderābād

On-site


Senior Backend Software Engineer-Python Hyderabad, India Business Management 312713 Job Description About The Role: Grade Level (for internal use): 03 Who We Are Kensho is a 120-person AI and machine learning company within S&P Global. With expertise in Machine Learning and data discovery, we develop and deploy novel solutions for S&P Global and its customers worldwide. Our solutions help businesses harness the power of data and Artificial Intelligence to innovate and drive progress. Kensho's solutions and research focus on speech recognition, entity linking, document extraction, automated database linking, text classification, natural language processing, and more. Are you looking to solve hard problems and enjoy working with teammates with diverse perspectives? If so, we would love to help you excel here at Kensho. About The Team Kensho’s Applications group develops the web apps and APIs that deliver Kensho’s AI capabilities to our customers. Our teams are small, product-focused, and intent on shipping high-quality code that best leverages our efforts. We’re collegial, humble, and inquisitive, and we delight in learning from teammates with backgrounds, skills, and interests different from our own. Kensho Link team, within the Applications Department, is a machine learning service that allows users to map entities in their datasets with unique entities drawn from S&P Global’s world-class company database with precision and speed. Link started as an internal Kensho project to help S&P Global Market Intelligence Team to integrate datasets more quickly into their platform. It uses ML based algorithms trained to return high quality links, even when the data inputs are incomplete or contain errors. In simple words, Kensho’s Link product helps in connecting the disconnected information about a company at one place – and it does so with scale. Link leverages a variety of NLP and ML techniques to process and link millions of company entities in hours. About The Role As a Senior Backend Engineer you will develop reliable, secure, and performant APIs that apply Kensho’s AI capabilities to specific customer workflows. You will collaborate with colleagues from Product, Machine Learning, Infrastructure, and Design, as well as with other engineers within Applications. You have a demonstrated capacity for depth, and are comfortable working with a broad range of technologies. Your verbal and written communication is proactive, efficient, and inclusive of your geographically-distributed colleagues. You are a thoughtful, deliberate technologist and share your knowledge generously. Equivalent to Grade 11 Role (Internal) You will: Design, develop, test, document, deploy, maintain, and improve software Manage individual project priorities, deadlines, and deliverables Work with key stakeholders to develop system architectures, API specifications, implementation requirements, and complexity estimates Test assumptions through instrumentation and prototyping Promote ongoing technical development through code reviews, knowledge sharing, and mentorship Optimize Application Scaling: Efficiently scale ML applications to maximize compute resource utilization and meet high customer demand. Address Technical Debt: Proactively identify and propose solutions to reduce technical debt within the tech stack. Enhance User Experiences: Collaborate with Product and Design teams to develop ML-based solutions that enhance user experiences and align with business goals. 
Ensure API security and data privacy by implementing best practices and compliance measures. Monitor and analyze API performance and reliability, making data-driven decisions to improve system health. Contribute to architectural discussions and decisions, ensuring scalability, maintainability, and performance of the backend systems. Qualifications At least 5+ years of direct experience developing customer-facing APIs within a team Thoughtful and efficient communication skills (both verbal and written) Experience developing RESTful APIs using a variety of tools Experience turning abstract business requirements into concrete technical plans Experience working across many stages of the software development lifecycle Sound reasoning about the behavior and performance of loosely-coupled systems Proficiency with algorithms (including time and space complexity analysis), data structures, and software architecture At least one domain of demonstrable technical depth Familiarity with CI/CD practices and tools to streamline deployment processes. Experience with containerization technologies (e.g., Docker, Kubernetes) for application deployment and orchestration. Technologies We Love Python, Django, FastAPI mypy, OpenAPI RabbitMQ, Celery, Kafka OpenSearch, PostgreSQL, Redis Git, Jsonnet, Jenkins, Docker, Kubernetes Airflow, AWS, Terraform Grafana, Prometheus ML Libraries: PyTorch, Scikit-learn, Pandas What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. 
Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Inclusive Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group) Job ID: 312713 Posted On: 2025-04-15 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

10.0 years

6 - 9 Lacs

Hyderābād

On-site


Lead, Application Development Hyderabad, India; Ahmedabad, India; Gurgaon, India Information Technology 316185 Job Description About The Role: Grade Level (for internal use): 11 S&P Global EDO The Role: Lead - Software Engineering, IT Application Development. Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You’ll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork. The Impact: As a Lead Software Developer at S&P Global, you’ll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you’ll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world. What’s in it for You: Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement. Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions. Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development. Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, Big Data, and revolutionary GenAI technologies. Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global. Responsibilities: Architect and develop scalable Big Data and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions. Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve. Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients. Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes. Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence. Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions. Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership. What We’re Looking For: We’re seeking a passionate, experienced professional with: 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures. Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability. A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products.
Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest. Advanced programming skills in Python, Java, .NET, or Scala, backed by a portfolio of impressive projects. Strong knowledge of Gen AI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity. Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem. 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility. Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing. A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines. Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike. Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today! Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316185 Posted On: 2025-06-06 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We are hiring for our client - a GCC based in Hyderabad that is yet to establish its presence in India. Job Summary: We are looking for a Senior Data Engineer to join our growing team of analytics experts. As a data engineer, you are responsible for designing and implementing our data pipeline architecture and optimizing data flow and collection for cross-functional groups, with scalability in mind. Data engineering is about building the underlying infrastructure, so being able to pass the limelight to someone else is imperative. Required Skills: Hands-on experience in Data Integration and Data Warehousing Strong proficiency in: Google BigQuery Python SQL Airflow/Cloud Composer Ascend or any modern ETL tool Experience with data quality frameworks or custom-built validations Preferred Skills: Knowledge of DBT for data transformation and modeling Familiarity with Collibra for data cataloging and governance Qualifications: Advanced working SQL knowledge and experience working with relational databases, and working familiarity with a variety of databases. Strong analytic skills related to working with unstructured datasets. Experience building a serverless data warehouse in GCP or AWS. 5+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional / non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Design, build, and optimize data pipelines using Google BigQuery, ensuring use of best practices such as query optimization, partitioning, clustering, and scalable data modeling. Develop robust ETL/ELT processes using Python and SQL, with an emphasis on reliability, performance, and maintainability. Create and manage data flows in Ascend or an equivalent tool, including: Setting up read/write connectors for various data sources. Implementing custom connectors using Python. Managing scheduling, failure notifications, and data services within Ascend. Implement data quality checks (technical and business level) and participate in defining data testing strategies to ensure data reliability. Perform incremental loads and merge operations in BigQuery. Build and manage Airflow (Cloud Composer) DAGs, configure variables, and handle scheduling as part of orchestration. Work within a CI/CD (DevSecOps) setup to promote code efficiently across environments. Participate in technical solutioning: Translate business integration needs into technical user stories. Contribute to technical design documents and provide accurate estimations. Conduct and participate in code reviews, enforce standards, and mentor junior engineers. Collaborate with QA and business teams during UAT; troubleshoot and resolve issues in development, staging, and production environments.

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 9 Lacs

Hyderābād

On-site


Engineer, Software Engineering Hyderabad, India Information Technology 311642 Job Description About The Role: Grade Level (for internal use): 09 The Team: We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in software engineering, irrespective of hierarchy, promoting innovation. One should feel empowered to iterate over ideas and experiment without fear of failure. Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with “curated” alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients’ channel of choice to help them make better investment and business decisions, with confidence. What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle to convert an idea into a revenue-generating stream. Responsibilities: We are looking for a self-motivated, enthusiastic and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of web applications, data pipelines, big data, machine learning and multi-cloud. The development is already underway so the candidate would be expected to get up to speed very quickly & start contributing. Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests. Have past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible and Prometheus or related cloud technologies. Have a good understanding of single, hybrid and multi-cloud architecture, preferably with hands-on experience. Active participation in all scrum ceremonies, following Agile best practices effectively. Play a key role in the development team to build high-quality, high-performance, scalable code. Produce technical design documents and conduct technical walkthroughs. Document and demonstrate solutions using technical design docs, diagrams and stubbed code. Collaborate effectively with technical and non-technical stakeholders. Respond to and resolve production issues. What we are looking for: Minimum of 5-8 years of significant experience in application development. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. Experience working with high-volume data and computationally intensive systems. Garbage-collection-friendly programming experience - tuning Java garbage collection & performance is a must. Proficiency in the development environment, including IDE, web & application server, GIT, Continuous Integration, unit-testing tools and defect management tools. Domain knowledge in the financial industry and capital markets is a plus. Excellent communication skills are essential, with strong verbal and writing proficiencies. Mentor teams, innovate and experiment, give face to business ideas and present to key stakeholders. Required technical skills: Build data pipelines. Utilize platforms like Snowflake, Talend, Databricks, etc.
Utilize cloud managed services like AWS Step Functions, AWS Lambda, AWS DynamoDB. Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow. Develop federated data services to provide scalable and performant data APIs: REST, GraphQL, OData. Write infrastructure as code to develop sandbox environments. Provide analytical capabilities using BI tools like Tableau, Power BI, etc. Feed data at scale to clients that are geographically distributed. Experience building sophisticated and highly automated infrastructure. Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc. Demonstrates ability to adapt to new technologies and learn quickly. Desirable technical skills: Java, Spring Boot, React, HTML/CSS, API development, micro-services pattern, cloud technologies and managed services (preferably AWS), Big Data and Analytics, relational databases (preferably PostgreSQL), NoSQL databases. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 311642 Posted On: 2025-06-02 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Title: Lead Data Engineer Job Summary The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, training and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices. Responsible for repeatable, lean and maintainable enterprise BI design across organizations. Effectively partners with client teams. Leadership matters not only in the conventional sense: within a team, we expect people to be leaders. Candidates should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability. Responsibilities Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation – e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stays current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration. Ensures proper execution/creation of methodology, training, templates, resource plans and engagement review processes. Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data-related at the project or business unit levels. Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards. Toolsets include, but are not limited to: SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security and meets usability and scalability best practices. Required Qualifications 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc. 5-8 years of management experience required 5-8 years consulting experience preferred Minimum of 5 years of data architecture, data modelling or similar experience Bachelor’s degree or equivalent experience, Master’s degree preferred Strong data warehousing, OLTP systems, data integration and SDLC Strong experience in orchestration & working experience in cloud-native / 3rd-party ETL data load orchestration Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.) Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP) Strong experience in Agile Process (Scrum cadences, roles, deliverables) & working experience in Azure DevOps, JIRA or similar, with experience in CI/CD using one or more code management platforms Strong Databricks experience required to create notebooks in PySpark Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.) Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.) Strong experience in orchestration & working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, Big Data 3-5 years’ development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc. Preferred Skills & Experience Knowledge and working experience with Data Integration processes, such as Data Warehousing, EAI, etc. Experience in providing estimates for Data Integration projects including testing, documentation, and implementation Ability to analyse business requirements as they relate to the data movement and transformation processes; research, evaluation and recommendation of alternative solutions. Ability to provide technical direction to other team members including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two together. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM. Can create documentation and presentations such that they “stand on their own” Can advise sales on evaluation of Data Integration efforts for new or existing client work. Can contribute to internal/external Data Integration proofs of concept.
Demonstrates ability to create new and innovative solutions to problems that have previously not been encountered. Ability to work independently on projects as well as collaborate effectively across teams Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success Strong team building, interpersonal, analytical, problem identification and resolution skills Experience working with multi-level business communities Can effectively utilise SQL and/or an available BI tool to validate/elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues. Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships/rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic standpoint.

Posted 1 week ago

Apply

Exploring Airflow Jobs in India

The Airflow job market in India is rapidly growing as more companies adopt data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with expertise in Airflow can find lucrative opportunities in various industries such as technology, e-commerce, finance, and more.
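
To ground the terminology used throughout this page: an Airflow workflow is declared as a DAG (directed acyclic graph) of tasks in Python. The following is a minimal sketch, assuming Airflow 2.4+ (where the schedule parameter replaced schedule_interval); the DAG ID, task names, and commands are purely illustrative:

```python
# Minimal illustrative DAG; assumes Airflow 2.4+. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would pull from an API or database.
    print("extracting...")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,      # skip runs for dates before the DAG was deployed
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = BashOperator(task_id="load", bash_command="echo loading")

    extract_task >> load_task  # load runs only after extract succeeds
```

Dropped into the scheduler's dags/ folder, a file like this defines a daily pipeline in which the load step waits on the extract step; this is the basic pattern most of the roles listed on this page build upon.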

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Gurgaon

Average Salary Range

The average salary range for Airflow professionals in India varies based on experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

In the field of Airflow, a typical career path may progress as follows:

  1. Junior Airflow Developer
  2. Airflow Developer
  3. Senior Airflow Developer
  4. Airflow Tech Lead

Related Skills

In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:

  • Python programming
  • ETL concepts
  • Database management (SQL)
  • Cloud platforms (AWS, GCP)
  • Data warehousing

Interview Questions

  • What is Apache Airflow? (basic)
  • Explain the key components of Airflow. (basic)
  • How do you schedule a DAG in Airflow? (basic)
  • What are the different operators in Airflow? (medium)
  • How do you monitor and troubleshoot DAGs in Airflow? (medium)
  • What is the difference between Airflow and other workflow management tools? (medium)
  • Explain the concept of XCom in Airflow. (medium)
  • How do you handle dependencies between tasks in Airflow? (medium)
  • What are the different types of sensors in Airflow? (medium)
  • What is a Celery Executor in Airflow? (advanced)
  • How do you scale Airflow for a high volume of tasks? (advanced)
  • Explain the concept of SubDAGs in Airflow. (advanced)
  • How do you handle task failures in Airflow? (advanced)
  • What is the purpose of a TriggerDagRun operator in Airflow? (advanced)
  • How do you secure Airflow connections and variables? (advanced)
  • Explain how to create a custom Airflow operator. (advanced; see the sketch after this list)
  • How do you optimize the performance of Airflow DAGs? (advanced)
  • What are the best practices for version controlling Airflow DAGs? (advanced)
  • Describe a complex data pipeline you have built using Airflow. (advanced)
  • How do you handle backfilling in Airflow? (advanced)
  • Explain the concept of DAG serialization in Airflow. (advanced)
  • What are some common pitfalls to avoid when working with Airflow? (advanced)
  • How do you integrate Airflow with external systems or tools? (advanced)
  • Describe a challenging problem you faced while working with Airflow and how you resolved it. (advanced)
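
Several of the questions above (scheduling a DAG, XCom, custom operators) can be illustrated in one short sketch. This is a hedged, minimal example assuming Airflow 2.4+; the operator, table, and task names are hypothetical rather than drawn from any particular codebase:

```python
# Illustrative sketch: a custom operator that pushes a value to XCom,
# plus a downstream task that pulls it. Assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator
from airflow.operators.python import PythonOperator


class RowCountOperator(BaseOperator):
    """Custom operator: pretends to count rows in a table."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        row_count = 42  # placeholder; a real operator would query self.table
        # The return value of execute() is pushed to XCom as 'return_value'.
        return row_count


def report(ti):
    # Pull the value the upstream task pushed to XCom.
    count = ti.xcom_pull(task_ids="count_rows")
    print(f"{count} rows counted")


with DAG(
    dag_id="example_custom_operator",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # scheduling is set with a cron preset or expression
    catchup=False,
) as dag:
    count_rows = RowCountOperator(task_id="count_rows", table="sales")
    report_task = PythonOperator(task_id="report", python_callable=report)

    count_rows >> report_task  # task dependencies are set with >>
```

A common interview follow-up worth knowing: XCom is intended for small pieces of metadata (row counts, file paths), not bulk data; large payloads should move through external storage such as S3 or a database.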

Closing Remark

As you explore job opportunities in the Airflow domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest trends in Airflow, and demonstrate your problem-solving abilities to stand out in the competitive job market. Good luck!
