7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Company Description
Make an impact at a global and dynamic investment organization. When you join CPP Investments, you are joining one of the world’s most admired and respected institutional investors. With more than $600 billion in assets under management, CPP Investments is a professional investment management organization that globally invests the funds of the Canada Pension Plan (CPP) to help ensure it is financially sustainable for generations of working and retired Canadians. CPP Investments invests across regions and asset classes to build a globally diversified portfolio. It holds assets in public equity, private equity, real estate, infrastructure, and fixed income, and the CPP Fund is projected to reach $3 trillion in assets by 2050. The organization is headquartered in Toronto with offices in Hong Kong, London, Mumbai, New York City, San Francisco, São Paulo, and Sydney. CPP Investments successfully attracts, selects, and retains talented individuals from top-tier institutions worldwide.
Join our team for access to:
- Stimulating work in a fast-paced and intellectually challenging environment
- Accelerated exposure and responsibility
- Global career development opportunities
- Diverse and inspiring colleagues and approachable leaders
- A hybrid-flexible work environment with an emphasis on in-person collaboration
- A culture rooted in principles of integrity, partnership, and high performance
- An organization with an important social purpose that positively impacts lives
If you have a passion for performance, value a collegial and collaborative culture, and approach work with the highest integrity, invest your career here.
Job Description
As a Senior Engineer you will design, build, test, and support value-add technology solutions and products. You will also work in close partnership with business and technology teams and will acquire sufficient understanding of business and technology to apply critical thought to business and technology requests.
Role Specific Accountabilities
- Design, build, test, and support medium-complexity technology solutions or enhancements with independence, to enable business capabilities, in an Agile environment that includes business and T&D partner teams.
- Apply advanced understanding of engineering best practices and drive continuous improvement across the team through coaching and influencing.
- Demonstrate a sufficient understanding of the technical landscape and the business capabilities it supports to apply critical thought to business requests.
- Foster and enable agility and innovation through experimentation and early feedback, ensuring responsiveness to evolving business needs.
- Succinctly frame problems, engage appropriately with colleagues to think deeply about broad problems, and gain buy-in on well-reasoned recommendations.
- Facilitate root-cause analysis of operational incidents impacting the products you support.
- Demonstrate ability to independently research and master new, complex technologies.
- Adhere to the Agile SDLC and execute related duties as required.
- Foster collaboration and mentorship, promoting a culture of feedback, learning, and professional growth.
- Maintain strong relationships with business partners, peer IT teams, and vendor partners.
Qualifications
- Undergraduate degree or college diploma in a related field (e.g., Engineering, Computer Science).
- 7+ years of relevant experience.
- Strong experience working with various programming languages (Python, C++, Java, etc.).
- Expertise in AWS services (EMR, Glue, Airflow, Lake Formation, Iceberg, Lambda, API Gateway, SNS, SQS, Step Functions, S3, DataZone, IAM & security, VPC & networking); an illustrative serverless sketch follows this posting.
- Experience in API development and integration.
- Experience with front-end technologies (e.g., React, Angular) is a plus.
- Familiarity with AI/ML concepts and tools is a plus.
- Excellent problem-solving skills and attention to detail.
- Experience with software development concepts, including version control, testing methodologies, and agile development practices.
- Ability to write clean, readable, and well-documented code, while paying attention to details and adhering to coding standards.
Additional Information
Visit our LinkedIn Career Page or follow us on LinkedIn. At CPP Investments, we are committed to diversity and equitable access to employment opportunities based on ability. We thank all applicants for their interest but will only contact candidates selected to advance in the hiring process.
Our Commitment To Inclusion And Diversity
In addition to being dedicated to building a workforce that reflects diverse talent, we are committed to fostering an inclusive and accessible experience. If you require an accommodation for any part of the recruitment process (including alternate formats of materials, accessible meeting rooms, etc.), please let us know and we will work with you to meet your needs.
Disclaimer
CPP Investments does not accept resumes from employment placement agencies, head-hunters or recruitment suppliers that are not in a formal contractual arrangement with us. Our recruitment supplier arrangements are restricted to specific hiring needs and do not include this or other web-site job postings. Any resume or other information received from a supplier not approved by CPP Investments to provide resumes to this posting or web-site will be considered unsolicited and will not be considered. CPP Investments will not pay any referral, placement or other fee for the supply of such unsolicited resumes or information.
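As a rough, hedged illustration of the serverless pieces named in the stack above (Lambda, S3, SNS), the sketch below shows a Lambda handler reacting to an S3 event and fanning out a notification. The bucket, topic ARN, and validation rule are invented for the example and are not part of the role description.

```python
import json
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

# Hypothetical resource name for illustration only.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:file-ingested"

def handler(event, context):
    """Triggered by an S3 put event; validates the object and notifies SNS."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read a small JSON object and perform a minimal validation check.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        payload = json.loads(body)
        if "id" not in payload:
            raise ValueError(f"Object {key} is missing required field 'id'")

        # Fan the event out to downstream consumers (e.g., an SQS-backed worker).
        sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps({"bucket": bucket, "key": key}))
    return {"statusCode": 200}
```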
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
The candidate should have a strong understanding of HVAC systems and design principles and be able to apply that knowledge to create innovative and efficient solutions for upcoming projects and proposals.
- Comprehensive knowledge of different types of HVAC systems and their applications.
- Hands-on experience with HVAC design software tools such as HAP/Elite, duct sizers, pipe sizers, etc.
- Ability to perform load calculations to determine system sizing and equipment selection.
- Perform reviews of vendor deliverables, raise TQs and TBEs, and participate in review meetings with vendors and other stakeholders.
- Ability to interpret P&IDs, D&IDs, and airflow diagrams.
- Familiarity with codes and standards pertaining to offshore HVAC systems and ducting, such as ISO 15138, ISO 7547, ASHRAE, NFPA, and SMACNA.
Key Responsibilities
- Design and Analysis: Develop HVAC system designs for offshore HVAC (High Voltage Alternating Current) substations and HVDC (High Voltage Direct Current) offshore converter platforms in offshore wind farms across Europe, the USA, Korea, Taiwan, etc. Conduct feasibility studies, load calculations, and energy efficiency assessments of platform HVA/C systems and of the temporary HVA/C systems required during fabrication at the yard, transportation, and offshore HUC (see the load-calculation sketch after this list).
- Equipment Selection: Select appropriate HVAC equipment, components, and controls based on project requirements, budget constraints, and energy efficiency goals.
- Preparation of BOQ / MR / TBE: Prepare BOQs of HVA/C systems, ducting and piping quantities, etc.; prepare MRs for the Supply Chain Management team to initiate procurement; and evaluate vendor offers against the project specifications.
- Drawing Preparation/Review: Create/review detailed HVAC system drawings, including schematics, layouts, and specifications from engineering consultants, and ensure compliance with project specifications and applicable codes and standards.
- Code Compliance: Ensure that all designs comply with relevant codes, industry standards, and the local regulations of the respective countries.
- Project Coordination: Collaborate with other department engineers, contractors, and the fabrication yard during construction to ensure smooth project execution.
- Energy Efficiency: Implement strategies to optimize energy efficiency and reduce operating costs for HVAC systems.
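As a rough illustration of the kind of load calculation referenced above, the sketch below applies the common air-side sensible and latent heat approximations in imperial units (sensible ≈ 1.08 × CFM × ΔT, latent ≈ 0.68 × CFM × Δgrains). The coefficients assume standard air density at sea level; actual offshore designs follow ISO 15138/ASHRAE procedures and project-specific conditions, and the numbers below are invented for the example.

```python
def sensible_load_btuh(cfm: float, delta_t_f: float) -> float:
    """Approximate sensible load for air at standard conditions.
    1.08 = 60 min/h * 0.075 lb/ft^3 * 0.24 Btu/(lb*F)."""
    return 1.08 * cfm * delta_t_f

def latent_load_btuh(cfm: float, delta_grains: float) -> float:
    """Approximate latent load from the moisture difference (grains of water per lb of dry air)."""
    return 0.68 * cfm * delta_grains

# Example: 5,000 CFM of supply air, 20 F temperature difference, 30 grains moisture difference.
q_sensible = sensible_load_btuh(5000, 20)   # ~108,000 Btu/h
q_latent = latent_load_btuh(5000, 30)       # ~102,000 Btu/h
print(f"Total load ~ {(q_sensible + q_latent) / 12000:.1f} tons of refrigeration")
```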
Posted 1 week ago
12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.
Years of Experience: Candidates with 12+ years of hands-on experience
Position: Senior Manager
Required Skills: Successful candidates will have demonstrated the following skills and characteristics:
Must Have
- Deep expertise in AI/ML solution design, including supervised and unsupervised learning, deep learning, NLP, and optimization.
- Strong hands-on experience with ML/DL frameworks like TensorFlow, PyTorch, scikit-learn, H2O, and XGBoost.
- Solid programming skills in Python, PySpark, and SQL, with a strong foundation in software engineering principles.
- Proven track record of building end-to-end AI pipelines, including data ingestion, model training, testing, and production deployment.
- Experience with MLOps tools such as MLflow, Airflow, DVC, and Kubeflow for model tracking, versioning, and monitoring (a minimal MLflow tracking sketch follows this posting).
- Understanding of big data technologies like Apache Spark, Hive, and Delta Lake for scalable model development.
- Expertise in AI solution deployment across cloud platforms like GCP, AWS, and Azure using services like Vertex AI, SageMaker, and Azure ML.
- Experience in REST API development, NoSQL database design, and RDBMS design and optimization.
- Familiarity with API-based AI integration and containerization technologies like Docker and Kubernetes.
- Proficiency in data storytelling and visualization tools such as Tableau, Power BI, Looker, and Streamlit.
- Programming skills in Python and either Scala or R, with experience using Flask and FastAPI.
- Experience with software engineering practices, including use of GitHub, CI/CD, code testing, and analysis.
- Proficient in using AI/ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Skilled in using Apache Spark, including PySpark and Databricks, for big data processing.
- Strong understanding of foundational data science concepts, including statistics, linear algebra, and machine learning principles.
- Knowledgeable in integrating DevOps, MLOps, and DataOps practices to enhance operational efficiency and model deployment.
- Experience with cloud infrastructure services like Azure and GCP.
- Proficiency in containerization technologies such as Docker and Kubernetes.
- Familiarity with observability and monitoring tools like Prometheus and the ELK stack, adhering to SRE principles and techniques.
- Cloud or Data Engineering certifications or specialization certifications (e.g., Google Professional Machine Learning Engineer, Microsoft Certified: Azure AI Engineer Associate – Exam AI-102, AWS Certified Machine Learning – Specialty (MLS-C01), Databricks Certified Machine Learning).
Nice To Have
- Experience implementing generative AI, LLMs, or advanced NLP use cases
- Exposure to real-time AI systems, edge deployment, or federated learning
- Strong executive presence and experience communicating with senior leadership or CXO-level clients
Roles And Responsibilities
- Lead and oversee complex AI/ML programs, ensuring alignment with business strategy and delivering measurable outcomes.
- Serve as a strategic advisor to clients on AI adoption, architecture decisions, and responsible AI practices.
- Design and review scalable AI architectures, ensuring performance, security, and compliance.
- Supervise the development of machine learning pipelines, enabling model training, retraining, monitoring, and automation.
- Present technical solutions and business value to executive stakeholders through impactful storytelling and data visualization.
- Build, mentor, and lead high-performing teams of data scientists, ML engineers, and analysts.
- Drive innovation and capability development in areas such as generative AI, optimization, and real-time analytics.
- Contribute to business development efforts, including proposal creation, thought leadership, and client engagements.
- Partner effectively with cross-functional teams to develop, operationalize, integrate, and scale new algorithmic products.
- Develop code, CI/CD, and MLOps pipelines, including automated tests, and deploy models to cloud compute endpoints.
- Manage cloud resources and build accelerators to enable other engineers, with experience working across two hyperscale clouds.
- Demonstrate effective communication skills, coaching and leading junior engineers, with a successful track record of building production-grade AI products for large organizations.
Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA from a reputed institute
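For readers unfamiliar with the MLOps tooling named in the requirements, here is a minimal sketch of experiment tracking with MLflow around a scikit-learn model. The dataset, hyperparameters, and experiment name are illustrative assumptions only, not part of the role.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative data and hyperparameters (assumptions, not from the posting).
X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
params = {"n_estimators": 200, "max_depth": 8}

mlflow.set_experiment("demo-classifier")
with mlflow.start_run():
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Track parameters, metrics, and the fitted model for later comparison or serving.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```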
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.
Years of Experience: Candidates with 4+ years of hands-on experience
Position: Senior Associate
Industry: Supply Chain/Forecasting/Financial Analytics
Required Skills: Successful candidates will have demonstrated the following skills and characteristics:
Must Have
- Strong supply chain domain knowledge (inventory planning, demand forecasting, logistics)
- Well versed in, and hands-on experience with, optimization methods such as linear programming, mixed integer programming, and scheduling optimization; an understanding of third-party optimization solvers such as Gurobi is an added advantage
- Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised); a short forecasting sketch follows this posting
- Experience using at least one major cloud platform (AWS, Azure, GCP), such as: AWS: SageMaker, Redshift, Glue, Lambda, QuickSight; Azure: Azure ML Studio, Synapse Analytics, Data Factory, Power BI; GCP: BigQuery, Vertex AI, Dataflow, Cloud Composer, Looker
- Experience developing, deploying, and monitoring ML models on cloud infrastructure
- Expertise in Python, SQL, data orchestration, and cloud-native data tools
- Hands-on experience with cloud-native data lakes and lakehouses (e.g., Delta Lake, BigLake)
- Familiarity with infrastructure-as-code (Terraform/CDK) for cloud provisioning
- Knowledge of visualization tools (Power BI, Tableau, Looker) integrated with cloud backends
- Strong command of statistical modeling, testing, and inference
- Advanced capabilities in data wrangling, transformation, and feature engineering
- Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Airflow)
- Strong communication and stakeholder engagement skills at the executive level
Roles And Responsibilities
- Assist in analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions
- Develop and execute project and analysis plans under the guidance of the Project Manager
- Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved
- Drive and conduct analysis using advanced analytics tools and coach junior team members
- Implement the necessary quality control measures to ensure deliverable integrity, including data quality, model robustness, and explainability for deployments
- Validate analysis outcomes and recommendations with all stakeholders, including the client team
- Build storylines and make presentations to the client team and/or the PwC project leadership team
- Contribute to knowledge- and firm-building activities
Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA from a reputed institute
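As a small illustration of the forecasting techniques listed above, the sketch below fits a Holt-Winters (triple exponential smoothing) model with statsmodels on a synthetic monthly demand series; the data and seasonal settings are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly demand with trend and yearly seasonality (illustrative only).
idx = pd.date_range("2020-01-01", periods=48, freq="MS")
rng = np.random.default_rng(0)
demand = pd.Series(
    100 + 2 * np.arange(48) + 15 * np.sin(2 * np.pi * np.arange(48) / 12) + rng.normal(0, 5, 48),
    index=idx,
)

# Additive trend and seasonality with a 12-month cycle.
model = ExponentialSmoothing(demand, trend="add", seasonal="add", seasonal_periods=12).fit()
forecast = model.forecast(6)  # demand for the next six months
print(forecast.round(1))
```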
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Teamwork makes the stream work.
Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.
About the team
Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10+M queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate, and access the data in the lake, for both streaming and batch data. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in Open Source, and we are planning to increase our engagement over time.
About the Role
Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost, and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert in Big Data technologies? Have you looked under the hood of these systems? Are you interested in Open Source? If you answered “Yes” to these questions, this role is for you!
What you will be doing
- You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones. Making sure the systems run efficiently and at minimal cost is a top priority. (A small illustrative batch-pipeline sketch follows this posting.)
- You will be making changes to the underlying systems and, if an opportunity arises, you can contribute your work back to open source.
- You will also be responsible for supporting internal customers and on-call services for the systems we host. Making sure we provide a stable environment and a great user experience is another top priority for the team.
We are excited if you have
- 7+ years of production experience building big data platforms based upon Spark, Trino, or equivalent
- Strong programming expertise in Java, Scala, Kotlin, or another JVM language
- A robust grasp of distributed systems concepts, algorithms, and data structures
- Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
- Experience working with at least 3 of the technologies/tools mentioned here: Big Data / Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
- Extensive hands-on experience with a public cloud, AWS or GCP
- BS/MS degree in CS or equivalent
Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.
The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a few very talented folks can do more, at lower cost, than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.
We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.
By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
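Purely as a hedged illustration of the kind of batch pipeline work described in this role (not Roku's actual code), the sketch below aggregates viewing events with PySpark; the file paths, column names, and event schema are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("viewing-hours-rollup").getOrCreate()

# Hypothetical event data: one row per playback heartbeat (path is illustrative).
events = spark.read.parquet("s3a://example-bucket/events/date=2025-06-01/")

daily_hours = (
    events
    .filter(F.col("event_type") == "playback")
    .withColumn("hours", F.col("duration_seconds") / 3600.0)
    .groupBy("channel_id")
    .agg(F.sum("hours").alias("total_hours"), F.countDistinct("device_id").alias("devices"))
)

# Write a partitioned summary that downstream Hive/Trino/Presto queries can read.
daily_hours.write.mode("overwrite").parquet("s3a://example-bucket/rollups/daily_channel_hours/")
```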
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
We are seeking a skilled and proactive Data Engineer with a strong background in ETL development and a focus on integrating data quality frameworks. In this role, you will be responsible for designing, developing, and maintaining ETL pipelines while ensuring data quality is embedded throughout the process. You will play a crucial role in building robust and reliable data pipelines that deliver high-quality data to our data warehouse and other systems.
Responsibilities
- Design, develop, and implement ETL processes to extract data from various source systems, transform it according to business requirements, and load it into target systems (e.g., data warehouse, data lake).
- Implement data validation and error handling within ETL pipelines.
- Build and maintain scalable, reliable, and efficient data pipelines.
- Design and implement data quality checks, validations, and transformations within ETL processes.
- Automate data quality monitoring, alerting, and reporting within ETL pipelines.
- Develop and implement data quality rules and standards within ETL processes.
- Integrate data from diverse sources, including databases, APIs, flat files, and cloud-based systems.
- Utilize ETL tools and technologies (e.g., SnapLogic, Informatica PowerCenter, Talend, AWS Glue, Apache Airflow, Azure Data Factory, etc.).
- Write SQL queries to extract, transform, load, and validate data.
- Use scripting languages (e.g., Python) to automate ETL processes, data quality checks, and data transformations (a minimal sketch of this pattern follows this list).
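The responsibilities above center on embedding data-quality checks inside ETL pipelines. Below is a minimal, hedged sketch of that pattern using Airflow's TaskFlow API (assumes Airflow 2.4+); the table, file paths, and the quality rule itself are placeholders, not a prescribed implementation.

```python
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> str:
        # Placeholder extract: in practice this would pull from a source system or API.
        df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 14.2, 25.5]})
        path = "/tmp/orders_raw.parquet"
        df.to_parquet(path)
        return path

    @task
    def quality_check(path: str) -> str:
        df = pd.read_parquet(path)
        # Fail fast if required fields are null (a simple embedded data-quality rule).
        if df["order_id"].isnull().any() or df["amount"].isnull().any():
            raise ValueError("Data quality check failed: null keys or amounts found")
        return path

    @task
    def load(path: str) -> None:
        # Placeholder load: write the validated data to the warehouse/lake target.
        pd.read_parquet(path).to_parquet("/tmp/orders_validated.parquet")

    load(quality_check(extract()))


orders_etl()
```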
Posted 1 week ago
2.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Locations: Mumbai | Gurgaon
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation - inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures - and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.
We Are BCG X
We’re a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world’s most complex problems. Leveraging BCG’s global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.
What You'll Do
- Build AI/ML technology stacks from concept to production, including data pipelines, model training, and deployment.
- Develop and optimize Generative AI workflows, including prompt engineering, fine-tuning (LoRA, QLoRA), retrieval-augmented generation (RAG), and LLM-based applications (a simplified RAG sketch follows this posting).
- Work with Large Language Models (LLMs) such as Llama, Mistral, and GPT, ensuring efficient adaptation for various use cases.
- Design and implement AI-driven automation using agentic AI systems and orchestration frameworks like Autogen, LangGraph, and CrewAI.
- Leverage cloud AI infrastructure (AWS, Azure, GCP) for scalable deployment and performance tuning.
- Collaborate with cross-functional teams to deliver AI-driven solutions.
What You'll Bring
- Bachelor’s, Master’s, or PhD in Computer Science, Data Science, Mathematics, Statistics, Engineering, or a related field.
- 2+ years of experience in AI/ML, with expertise in Generative AI and LLMs.
- Strong proficiency in Python and experience with AI/ML frameworks like PyTorch and TensorFlow.
- Knowledge of advanced prompt engineering techniques (Chain of Thought, Few-Shot, Self-Consistency).
- Experience in AI workflow automation and model orchestration.
- Hands-on experience with API development using Flask or Django.
- Familiarity with data processing frameworks like Databricks and Airflow.
- Strong analytical skills and the ability to work in a collaborative environment.
#BCGXjob
Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws.
BCG is an E-Verify Employer. Click here for more information on E-Verify.
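To make the RAG workflow mentioned under "What You'll Do" concrete, here is a deliberately framework-free sketch of retrieval-augmented generation: embed documents, retrieve the most similar ones for a query, and assemble a prompt. The embedding and the LLM call are stubbed out as placeholders; a real system would use an embedding model, an LLM client, and an orchestration framework such as LangGraph, and the documents shown are invented.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: a normalized character-frequency vector.
    A real pipeline would call an embedding model instead."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

documents = [
    "Store operating costs rose 4% quarter over quarter.",
    "The loyalty program added 1.2M members in Q1.",
    "Logistics delays were driven by port congestion in March.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(query)          # cosine similarity (vectors are unit-norm)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return prompt  # placeholder: a real system would send this prompt to an LLM

print(answer("Why were shipments late?"))
```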
Posted 1 week ago
5.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Dear Associate,
Greetings from Tata Consultancy Services!
Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for Data Scientist at Tata Consultancy Services on 14th June 2025.
Hiring For: Data Scientist
Mandatory Skills: Data Science, CI/CD, DevOps, Databricks, AWS, SQL, PySpark/Python, Snowflake, S3, EMR, EC2, Airflow, Lambda
Walk-in Location: Pune
Experience: 5-15 years
Mode of interview: in-person walk-in drive
Date of interview: 14 June 2025
Venue: Zone 3 Auditorium, Tata Consultancy Services, Sahyadri Park, Rajiv Gandhi Infotech Park, Hinjewadi Phase 3, Pune – 411057
If you are interested in this exciting opportunity, please share your updated resume to archana.parmeshwar@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
Current Organization:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Attended interview with TCS in the past (details):
Please share your iBegin portal EP id if already registered:
Willing to attend the walk-in on 14th June: (Yes/No)
Note: Only eligible candidates with relevant experience will be contacted further.
Thanks & Regards,
Archana
Posted 1 week ago
4.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Fynd is India’s largest omnichannel platform and a multi-platform tech company specializing in retail technology and products in AI, ML, big data, image editing, and the learning space. It provides a unified platform for businesses to seamlessly manage online and offline sales, store operations, inventory, and customer engagement. Serving over 2,300 brands, Fynd is at the forefront of retail technology, transforming customer experiences and business processes across various industries.
About You
As a TPM you will act as a bridge between business and engineering teams. You will work on complex business constraints and translate them into product requirements and features. You bring technical knowledge to the team, taking projects from prototype to launch on tight timelines. A people person who can provide strong leadership and inspire teams to build world-class products.
What will you do at Fynd?
- Gather requirements from diverse teams and stakeholders
- Work with Platform Architects and Engineers to convert these requirements into an implementation
- Work closely with engineers to prioritize product features and requirements
- Own the execution of the sprint by collaborating with multiple engineering and product teams within the org
- Be responsible for the delivery of these sprints, both from a timeline and a quality point of view
- Manage technical and product risks and unblock engineering by helping mitigate them
- Provide accurate visibility into feature readiness and issues with other engineering teams
Who are we looking for?
- You, if you can walk the talk and convince others to walk with you
- Someone who can distinguish between the important and the urgent, and make sure both are addressed
- A leader who has the vision to see more than 14 million outcomes and pick the one where Thanos is defeated and the Avengers thrive
- Juggling time, resources, and priorities feels as natural as data charts and spreadsheets
- Someone who listens to everybody, distils information, and makes stuff happen!
Some Specific Requirements
- Basic knowledge of technology, data orchestration tools and frameworks such as Apache Airflow, API integrations, micro-services architecture, CI/CD, etc.
- Strong communication skills
- Knowledge of data modeling and ETL (Extract, Transform, Load) processes
- Familiarity with data streaming and real-time data processing technologies
- Proficiency in data visualization tools (e.g., Tableau, Power BI) to create reports and dashboards
- Ability to automate repetitive tasks and workflows using scripting or automation tools
- A commitment to staying current with evolving data technologies and industry trends
- Ability to explain technical concepts/flows to a non-technical audience
- Clear written communication skills; you must be able to clearly articulate flows for engineers and SDETs to understand deeply
- Build strong relationships and collaborate with a diverse team containing engineering, product, and business stakeholders
- Effective delegation; must know how to build ownership and execution within the team without micro-managing
- Proficiency with data platform technologies, including database management systems (e.g., MySQL, PostgreSQL, or MongoDB)
- Knowledge of server and storage hardware, virtualization, and cloud computing
- Strong attention to detail
- Growth mindset to learn skills while performing the role
- 4-7 years of experience in a Business Analyst/Project Manager role
- Some experience with bug-tracking tools like JIRA, Confluence, Asana, Redmine, or Azure DevOps
- Strong analytical and problem-solving skills, with the ability to make data-driven decisions
- Excellent communication and collaboration skills, including the ability to work effectively with cross-functional teams
- Strong knowledge of data technologies, databases, and data analytics tools
- Familiarity with cloud-based data solutions (e.g., AWS, Azure, GCP)
- Strong knowledge of ETL tools and techniques, including data extraction, transformation, and loading from various sources
- Experience with Change Data Capture (CDC) methodologies to capture real-time data changes for synchronization
- Deep understanding of machine learning concepts and their application to data-driven decision-making
- Proficiency in data integration tools, including Big DataOps platforms, to streamline data collection and management
- Familiarity with workflow management systems for process automation and orchestration
- Knowledge of artificial intelligence (AI) technologies and their integration into data platforms to enhance automation, prediction, and decision support
- Strong problem-solving skills to address complex technical and business challenges
- Ability to communicate and present complex technical concepts to non-technical stakeholders
- Leadership skills to guide cross-functional teams in product development
What do we offer?
Growth
Growth knows no bounds, as we foster an environment that encourages creativity, embraces challenges, and cultivates a culture of continuous expansion. We are looking at new product lines, international markets, and brilliant people to grow even further. We teach, groom, and nurture our people to become leaders. You get to grow with a company that is growing exponentially.
- Flex University: We help you upskill by organising in-house courses on important subjects
- Learning Wallet: You can also do an external course to upskill and grow; we reimburse it for you
Culture
- Community and team-building activities
- Weekly, quarterly, and annual events/parties
Wellness
- Mediclaim policy for you + parents + spouse + kids
- Experienced therapist for better mental health, improved productivity, and work-life balance
We work from the office 5 days a week to promote collaboration and teamwork. Join us to make an impact in an engaging, in-person environment!
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Total experience 5+ years.
- Hands-on working experience in data engineering.
- Strong working experience in SQL, Python, or Scala.
- Deep understanding of cloud design patterns and their implementation.
- Experience working with Snowflake as a data warehouse solution.
- Experience with Power BI data integration.
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms).
- Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.); a small star-schema sketch follows this posting.
- Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar.
- Strong problem-solving skills and a passion for continuous improvement.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.
RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating the same to developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
- Carrying out POCs to make sure that suggested designs/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in computer science, information technology, or a related field.
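As a compact illustration of the star-schema modelling mentioned in the requirements, the sketch below splits a denormalized extract into a dimension and a fact table with pandas. The column names and surrogate-key approach are assumptions for the example, not a client model.

```python
import pandas as pd

# Denormalized source extract (illustrative).
sales = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_name": ["Acme", "Globex", "Acme"],
    "customer_country": ["IN", "US", "IN"],
    "amount": [250.0, 125.5, 310.0],
})

# Dimension: one row per unique customer, with a surrogate key.
dim_customer = (
    sales[["customer_name", "customer_country"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact: measures plus a foreign key into the dimension.
fact_sales = sales.merge(dim_customer, on=["customer_name", "customer_country"])[
    ["order_id", "customer_key", "amount"]
]

print(dim_customer)
print(fact_sales)
```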
Posted 1 week ago
5.0 years
0 Lacs
Hyderābād
On-site
JOB DESCRIPTION
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.
As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains.
Job responsibilities
- Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices
- Evaluates new and current technologies using existing data architecture standards and frameworks
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
- Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
- Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes
- Serves as a function-wide subject matter expert in one or more areas of focus
- Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies
- Advises junior architects and technologists
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Advanced knowledge of architecture, applications, and technical processes with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.)
- Practical cloud-based data architecture and deployment experience, preferably AWS
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores (a small illustrative sketch follows this posting)
- Advanced in one or more data engineering disciplines, e.g., streaming, ELT, event processing
- Ability to tackle design and functionality problems independently with little to no oversight
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture
Preferred qualifications, capabilities, and skills
- Financial services experience; card and banking a big plus
- Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc.
- Practical experience in data mesh and/or data lake
- Practical experience in machine learning/AI, with Python development a big plus
- Practical experience in graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
ABOUT THE TEAM
Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions - all while ranking first in customer satisfaction.
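To ground the "conceptual vs. logical vs. physical data model" language above, here is a small hedged sketch of a physical model expressed with SQLAlchemy against an in-memory SQLite database. The entities (accounts and card transactions) are generic illustrations, not JPMorgan's schema.

```python
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Account(Base):
    __tablename__ = "account"
    account_id = Column(Integer, primary_key=True)
    product_type = Column(String(20), nullable=False)   # e.g. "card", "deposit"
    opened_on = Column(Date, nullable=False)

class CardTransaction(Base):
    __tablename__ = "card_transaction"
    txn_id = Column(Integer, primary_key=True)
    account_id = Column(Integer, ForeignKey("account.account_id"), nullable=False)
    amount = Column(Numeric(12, 2), nullable=False)
    merchant_category = Column(String(4))

# The same logical model could target Snowflake, Athena, or Postgres by swapping the engine URL.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```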
Posted 1 week ago
5.0 years
5 - 7 Lacs
Hyderābād
On-site
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.
As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains.
Job responsibilities
- Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices
- Evaluates new and current technologies using existing data architecture standards and frameworks
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
- Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
- Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes
- Serves as a function-wide subject matter expert in one or more areas of focus
- Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies
- Advises junior architects and technologists
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Advanced knowledge of architecture, applications, and technical processes with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.)
- Practical cloud-based data architecture and deployment experience, preferably AWS
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores
- Advanced in one or more data engineering disciplines, e.g., streaming, ELT, event processing
- Ability to tackle design and functionality problems independently with little to no oversight
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture
Preferred qualifications, capabilities, and skills
- Financial services experience; card and banking a big plus
- Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc.
- Practical experience in data mesh and/or data lake
- Practical experience in machine learning/AI, with Python development a big plus
- Practical experience in graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis
Posted 1 week ago
3.0 - 5.0 years
5 - 8 Lacs
Hyderābād
On-site
Job Title: Sr. Software Engineer / Sr. Data Engineer – Databricks & PySpark
Experience: 3-5 years
Work Location: Hyderabad
Job Summary: We are seeking a skilled Data Engineer with 3-5 years of hands-on experience in Databricks and PySpark.
Key Responsibilities:
- Design, develop, and optimize scalable data pipelines using Databricks and PySpark (a brief sketch follows this posting).
- Collaborate with data scientists, analysts, and other stakeholders to gather requirements and deliver data solutions.
- Implement ETL processes to ingest and transform structured and unstructured data from various sources.
- Perform data wrangling, cleansing, and validation to ensure quality and accuracy.
- Monitor, troubleshoot, and optimize performance of data pipelines and workflows.
- Ensure adherence to data governance and security policies.
Required Skills:
- 3-4 years of hands-on experience in PySpark for large-scale data processing.
- Strong experience with the Databricks platform (including notebooks, clusters, and Delta Lake).
- Proficiency in SQL and Python.
- Experience with cloud platforms (preferably Azure or AWS).
- Good understanding of data modelling, partitioning, and performance tuning.
- Familiarity with version control tools (e.g., Git) and CI/CD practices.
- Good communication skills.
Preferred Skills (Good to Have):
- Knowledge of Delta Lake, Apache Airflow, or Data Factory.
- Exposure to Agile/Scrum methodologies.
- Experience working with streaming data is a plus.
Interested candidates can share their updated profiles to radhika.laxmi@komhar.com
Job Type: Full-time
Pay: ₹503,603.23 - ₹800,000.00 per year
Benefits:
- Health insurance
- Provident Fund
Schedule:
- Day shift
- UK shift
Work Location: In person
Expected Start Date: 06/06/2025
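As a hedged illustration of the cleansing-and-validation pipeline work described above (not a prescribed implementation), the sketch below standardizes a small customer extract with PySpark; the input path, columns, and validation rule are invented, and writing in Delta format assumes a Databricks/Delta Lake runtime.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-cleansing").getOrCreate()

# Illustrative raw input; on Databricks this would typically come from cloud storage.
raw = spark.read.option("header", True).csv("/tmp/raw/customers.csv")

clean = (
    raw
    .dropDuplicates(["customer_id"])
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
    .filter(F.col("customer_id").isNotNull())
)

# Simple validation gate before publishing downstream.
bad_rows = clean.filter(~F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed email validation")

# Writing as Delta assumes a Delta Lake runtime; use parquet otherwise.
clean.write.format("delta").mode("overwrite").save("/tmp/silver/customers")
```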
Posted 1 week ago
5.0 years
50 Lacs
Dehradun, Uttarakhand, India
Remote
Experience: 5.00+ years
Salary: INR 5,000,000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-Time Permanent position (Payroll and Compliance to be managed by: Precanto)
(*Note: This is a requirement for one of Uplers' clients - a fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams.)
What do you need for this opportunity?
Must-have skills: async workflows, MLOps, Ray Tune, Data Engineering, MLflow, Supervised Learning, Time-Series Forecasting, Docker, Machine Learning, NLP, Python, SQL
A fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams is looking for:
We are a fast-moving startup building AI-driven solutions for the financial planning workflow. We're looking for a versatile Machine Learning Engineer to join our team and take ownership of building, deploying, and scaling intelligent systems that power our core product.
Job Description
Full-time | Team: Data & ML Engineering
We're looking for 5+ years of experience as a Machine Learning or Data Engineer (startup experience is a plus).
What You Will Do
- Build and optimize machine learning models - from regression to time-series forecasting
- Work with data pipelines and orchestrate training/inference jobs using Ray, Airflow, and Docker
- Train, tune, and evaluate models using tools like Ray Tune, MLflow, and scikit-learn (a minimal tuning sketch follows this posting)
- Design and deploy LLM-powered features and workflows
- Collaborate closely with product managers to turn ideas into experiments and production-ready solutions
- Partner with Software and DevOps engineers to build robust ML pipelines and integrate them with the broader platform
Basic Skills
- Proven ability to work creatively and analytically in a problem-solving environment
- Excellent communication (written and oral) and interpersonal skills
- Strong understanding of supervised learning and time-series modeling
- Experience deploying ML models and building automated training/inference pipelines
- Ability to work cross-functionally in a collaborative and fast-paced environment
- Comfortable wearing many hats and owning projects end-to-end
- Write clean, tested, and scalable Python and SQL code
- Leverage async workflows and cloud-native infrastructure (S3, Docker, etc.) for high-throughput data processing
Advanced Skills
- Familiarity with MLOps best practices
- Prior experience with LLM-based features or production-level NLP
- Experience with LLMs, vector stores, or prompt engineering
- Contributions to open-source ML or data tools
Tech Stack
- Languages: Python, SQL
- Frameworks & Tools: scikit-learn, Prophet, pyts, MLflow, Ray, Ray Tune, Jupyter
- Infra: Docker, Airflow, S3, asyncio, Pydantic
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
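To illustrate the Ray Tune workflow named in this posting, here is a minimal hedged sketch using the classic tune.run function API (newer Ray releases favor tune.Tuner, so exact calls vary by version). The objective is a toy stand-in for a forecasting model's validation error, not real training code.

```python
from ray import tune

def train_forecaster(config):
    # Toy stand-in for fitting a time-series model and measuring validation error.
    val_error = (config["smoothing"] - 0.3) ** 2 + 0.01 * config["season_length"]
    return {"loss": val_error}  # returning a dict reports the final metric to Tune

analysis = tune.run(
    train_forecaster,
    config={
        "smoothing": tune.uniform(0.0, 1.0),
        "season_length": tune.choice([7, 14, 28]),
    },
    num_samples=20,
    metric="loss",
    mode="min",
)
print(analysis.best_config)
```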
Posted 1 week ago
5.0 years
4 - 8 Lacs
Hyderābād
On-site
Senior Backend Software Engineer-Python Hyderabad, India Business Management 312713 Job Description About The Role: Grade Level (for internal use): 03 Who We Are Kensho is a 120-person AI and machine learning company within S&P Global. With expertise in Machine Learning and data discovery, we develop and deploy novel solutions for S&P Global and its customers worldwide. Our solutions help businesses harness the power of data and Artificial Intelligence to innovate and drive progress. Kensho's solutions and research focus on speech recognition, entity linking, document extraction, automated database linking, text classification, natural language processing, and more. Are you looking to solve hard problems and enjoy working with teammates with diverse perspectives? If so, we would love to help you excel here at Kensho. About The Team Kensho’s Applications group develops the web apps and APIs that deliver Kensho’s AI capabilities to our customers. Our teams are small, product-focused, and intent on shipping high-quality code that best leverages our efforts. We’re collegial, humble, and inquisitive, and we delight in learning from teammates with backgrounds, skills, and interests different from our own. Kensho Link team, within the Applications Department, is a machine learning service that allows users to map entities in their datasets with unique entities drawn from S&P Global’s world-class company database with precision and speed. Link started as an internal Kensho project to help S&P Global Market Intelligence Team to integrate datasets more quickly into their platform. It uses ML based algorithms trained to return high quality links, even when the data inputs are incomplete or contain errors. In simple words, Kensho’s Link product helps in connecting the disconnected information about a company at one place – and it does so with scale. Link leverages a variety of NLP and ML techniques to process and link millions of company entities in hours. About The Role As a Senior Backend Engineer you will develop reliable, secure, and performant APIs that apply Kensho’s AI capabilities to specific customer workflows. You will collaborate with colleagues from Product, Machine Learning, Infrastructure, and Design, as well as with other engineers within Applications. You have a demonstrated capacity for depth, and are comfortable working with a broad range of technologies. Your verbal and written communication is proactive, efficient, and inclusive of your geographically-distributed colleagues. You are a thoughtful, deliberate technologist and share your knowledge generously. Equivalent to Grade 11 Role (Internal) You will: Design, develop, test, document, deploy, maintain, and improve software Manage individual project priorities, deadlines, and deliverables Work with key stakeholders to develop system architectures, API specifications, implementation requirements, and complexity estimates Test assumptions through instrumentation and prototyping Promote ongoing technical development through code reviews, knowledge sharing, and mentorship Optimize Application Scaling: Efficiently scale ML applications to maximize compute resource utilization and meet high customer demand. Address Technical Debt: Proactively identify and propose solutions to reduce technical debt within the tech stack. Enhance User Experiences: Collaborate with Product and Design teams to develop ML-based solutions that enhance user experiences and align with business goals. 
Ensure API security and data privacy by implementing best practices and compliance measures. Monitor and analyze API performance and reliability, making data-driven decisions to improve system health. Contribute to architectural discussions and decisions, ensuring scalability, maintainability, and performance of the backend systems. Qualifications At least 5+ years of direct experience developing customer-facing APIs within a team Thoughtful and efficient communication skills (both verbal and written) Experience developing RESTful APIs using a variety of tools Experience turning abstract business requirements into concrete technical plans Experience working across many stages of the software development lifecycle Sound reasoning about the behavior and performance of loosely-coupled systems Proficiency with algorithms (including time and space complexity analysis), data structures, and software architecture At least one domain of demonstrable technical depth Familiarity with CI/CD practices and tools to streamline deployment processes. Experience with containerization technologies (e.g., Docker, Kubernetes) for application deployment and orchestration. Technologies We Love Python, Django, FastAPI mypy, OpenAPI RabbitMQ, Celery, Kafka OpenSearch, PostgreSQL, Redis Git, Jsonnet, Jenkins, Docker, Kubernetes Airflow, AWS, Terraform Grafana, Prometheus ML Libraries: PyTorch, Scikit-learn, Pandas What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. 
Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Inclusive Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group) Job ID: 312713 Posted On: 2025-04-15 Location: Hyderabad, Telangana, India
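For illustration, here is a minimal sketch of the kind of typed, OpenAPI-documented endpoint described in the posting above, using FastAPI from its listed stack. The route, request/response models, and scoring logic are all hypothetical, not Kensho's actual API.

```python
# Hypothetical sketch only: endpoint path, models, and scoring are invented.
from typing import List, Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="entity-link-api")

class LinkRequest(BaseModel):
    company_name: str
    country: Optional[str] = None  # optional hint; inputs may be incomplete

class LinkCandidate(BaseModel):
    entity_id: str
    score: float

@app.post("/v1/link", response_model=List[LinkCandidate])
def link_entity(req: LinkRequest) -> List[LinkCandidate]:
    # Placeholder ranking: a real service would call an ML matcher here.
    return [LinkCandidate(entity_id="C000001", score=0.93)]
```

Run locally with `uvicorn your_module:app --reload`; FastAPI then serves an OpenAPI schema and interactive docs at /docs, which pairs naturally with the mypy and OpenAPI tooling listed above.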
Posted 1 week ago
10.0 years
6 - 9 Lacs
Hyderābād
On-site
Lead, Application Development Hyderabad, India; Ahmedabad, India; Gurgaon, India Information Technology 316185 Job Description About The Role: Grade Level (for internal use): 11 S&P Global EDO The Role: Lead- Software Engineering IT- Application Development. Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You’ll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork. The Impact: As a Lead Software Developer at S&P Global, you’ll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you’ll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world. What’s in it for You: Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement. Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions. Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development. Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, Bigdata, and revolutionary GenAI technologies. Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global. Responsibilities: Architect and develop scalable Bigdata and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions. Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve. Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients. Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes. Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence. Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions. Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership. What We’re Looking For: We’re seeking a passionate, experienced professional with: 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures. Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability. A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products. 
Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest. Advanced programming skills in Python, Java, .NET or Scala, backed by a portfolio of impressive projects. Strong knowledge of Gen AI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity. Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem. 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility. Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing. A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines. Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike. Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today! Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative (link to career site page when available), we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316185 Posted On: 2025-06-06 Location: Hyderabad, Telangana, India
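As an illustration of the orchestrated batch pipelines described in the posting above, below is a minimal, hypothetical Airflow 2.x DAG. The schedule, task names, and data are invented; a production pipeline would replace the stand-in tasks with real extract, transform, and load logic.

```python
# Minimal, hypothetical Airflow DAG; task logic is a stand-in for real work.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["example"])
def daily_ingest():
    @task
    def extract() -> list:
        # In practice this would pull from an upstream source (API, S3, database).
        return [{"id": 1, "value": 42}]

    @task
    def transform(rows: list) -> list:
        return [{**r, "value_doubled": r["value"] * 2} for r in rows]

    @task
    def load(rows: list) -> None:
        print(f"loading {len(rows)} rows")  # stand-in for a warehouse write

    load(transform(extract()))

daily_ingest()
```

Note the `schedule` argument shown here requires Airflow 2.4 or later; older 2.x releases use `schedule_interval`.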
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are hiring for our client, a Hyderabad-based GCC that is yet to establish its presence in India. Job Summary: We are looking for a Senior Data Engineer to join our growing team of analytics experts. As a data engineer, you will be responsible for designing and implementing our data pipeline architecture and for optimizing data flow and collection for cross-functional groups, with scalability in mind. Data engineering is about building the underlying infrastructure, so being comfortable letting others take the limelight is imperative. Required Skills: Hands-on experience in Data Integration and Data Warehousing Strong proficiency in: Google BigQuery Python SQL Airflow/Cloud Composer Ascend or any modern ETL tool Experience with data quality frameworks or custom-built validations Preferred Skills: Knowledge of DBT for data transformation and modeling Familiarity with Collibra for data cataloging and governance Qualifications: Advanced working SQL knowledge and experience working with relational databases, plus working familiarity with a variety of databases. Strong analytic skills related to working with unstructured datasets. Experience building a serverless data warehouse in GCP or AWS. 5+ years of experience in a Data Engineer role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional / non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Design, build, and optimize data pipelines using Google BigQuery, ensuring use of best practices such as query optimization, partitioning, clustering, and scalable data modeling. Develop robust ETL/ELT processes using Python and SQL, with an emphasis on reliability, performance, and maintainability. Create and manage data flows in Ascend or an equivalent tool, including: Setting up read/write connectors for various data sources. Implementing custom connectors using Python. Managing scheduling, failure notifications, and data services within Ascend. Implement data quality checks (technical and business level) and participate in defining data testing strategies to ensure data reliability. Perform incremental loads and merge operations in BigQuery (an illustrative sketch follows this posting). Build and manage Airflow (Cloud Composer) DAGs, configure variables, and handle scheduling as part of orchestration. Work within a CI/CD (DevSecOps) setup to promote code efficiently across environments. Participate in technical solutioning: Translate business integration needs into technical user stories. Contribute to technical design documents and provide accurate estimations. Conduct and participate in code reviews, enforce standards, and mentor junior engineers. Collaborate with QA and business teams during UAT; troubleshoot and resolve issues in development, staging, and production environments.
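The incremental-load responsibility referenced above could look roughly like the following sketch, which runs a MERGE statement through the google-cloud-bigquery client. The project, dataset, table, and column names are invented for illustration.

```python
# Hypothetical incremental MERGE into a BigQuery table; names are invented.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

merge_sql = """
MERGE `my_project.analytics.orders` AS target
USING `my_project.staging.orders_delta` AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET target.status = source.status, target.updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, status, updated_at)
  VALUES (source.order_id, source.status, source.updated_at)
"""

job = client.query(merge_sql)  # submit the statement as a query job
job.result()                   # block until the MERGE completes
print(f"DML affected rows: {job.num_dml_affected_rows}")
```

In a Cloud Composer setup, a statement like this would typically be wrapped in a DAG task and scheduled alongside the rest of the pipeline.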
Posted 1 week ago
5.0 - 8.0 years
6 - 9 Lacs
Hyderābād
On-site
Engineer, Software Engineering Hyderabad, India Information Technology 311642 Job Description About The Role: Grade Level (for internal use): 09 The Team: We are looking for a highly motivated Engineer to join our team supporting Marketplace Platform. S&P Global Marketplace technology team consists of geographically diversified software engineers responsible to develop scalable solutions by working directly with product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy promoting innovation. One should feel empowered to iterate over ideas and experimentation without having fear of failure. Impact: You will enable S&P business to showcase our proprietary S&P Global data, combine it with “curated” alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients’ channel of choice to help them make better investment and business decisions, with confidence. What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies along with appreciation in product development life cycle to convert an idea into revenue generating stream. Responsibilities: We are looking for a self-motivated, enthusiastic and passionate software engineer to develop technology solutions for S&P global marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting edge technologies consisting of web applications, data pipelines, big data, machine learning and multi-cloud. The development is already underway so the candidate would be expected to get up to speed very quickly & start contributing. Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests. Have past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible and Prometheus or related cloud technologies. Have good understanding of single, hybrid and multicloud architecture with preferably hands-on experience. Active participation in all scrum ceremonies, follow AGILE best practices effectively. Play a key role in the development team to build high-quality, high-performance, scalable code. Produce technical design documents and conduct technical walkthrough. Document and demonstrate solutions using technical design docs, diagrams and stubbed code. Collaborate effectively with technical and non-technical stakeholders. Respond to and resolve production issues. What we are looking for: Minimum of 5-8 years of significant experience in application development. Proficient with software development lifecycle (SDLC) methodologies like Agile, Test-driven development. Experience working with high volume data and computationally intensive system. Garbage collection friendly programming experience - tuning java garbage collection & performance is a must. Proficiency in the development environment, including IDE, web & application server, GIT, Continuous Integration, unit-testing tool and defect management tools. Domain knowledge in Financial Industry and Capital Markets is a plus. Excellent communication skills are essential, with strong verbal and writing proficiencies. Mentor teams, innovate and experiment, give face to business ideas and present to key stakeholders. Required technical skills: Build data pipelines. Utilize platforms like snowflake, talend, databricks etc. 
Utilize cloud managed services like AWS Step functions, AWS Lambda, AWS DynamoDB Develop custom solutions using Apache nifi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow. Develop federated data services to provide scalable and performant data APIs, REST, GraphQL, OData. Write infrastructure as code to develop sandbox environments. Provide analytical capabilities using BI tools like tableau, power BI etc. Feed data at scale to clients that are geographically distributed. Experience building sophisticated and highly automated infrastructure. Experience with automation tools such as terraform, Cloud technologies, cloud formation, ansible etc., Demonstrates ability to adapt to new technologies and learn quickly. Desirable technical skills: Java, Springboot, React, HTML/CSS, API development, micro-services pattern, cloud technologies and managed services preferably AWS, Big Data and Analytics, Relational databases preferably Postgresql, NoSql databases. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. 
Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 311642 Posted On: 2025-06-02 Location: Hyderabad, Telangana, India
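To make the managed-services requirement in the posting above concrete, here is a hypothetical AWS Lambda handler that persists an item to DynamoDB with boto3. The table name, environment variable, and event shape are assumptions, not details from the posting.

```python
# Hypothetical Lambda handler; table name and event shape are invented.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "marketplace-items"))

def handler(event, context):
    # Assumes an API Gateway proxy event carrying a JSON body with an "id" field.
    item = json.loads(event.get("body") or "{}")
    table.put_item(Item={"pk": item["id"], "payload": item})
    return {"statusCode": 201, "body": json.dumps({"stored": item["id"]})}
```

In practice, handlers like this are often chained with Step Functions for orchestration and shipped through the CI/CD tooling the posting lists.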
Posted 1 week ago
10.0 years
3 - 7 Lacs
Gurgaon
On-site
About the Role: Grade Level (for internal use): 11 S&P Global EDO The Role: Lead- Software Engineering IT- Application Development. Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You’ll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork. The Impact: As a Lead Software Developer at S&P Global, you’ll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you’ll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world. What’s in it for You: Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement. Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions. Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development. Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, Bigdata, and revolutionary GenAI technologies. Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global. Responsibilities: Architect and develop scalable Bigdata and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions. Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve. Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients. Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes. Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence. Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions. Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership. What We’re Looking For: We’re seeking a passionate, experienced professional with: 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures. Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability. A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products. Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest. 
Advanced programming skills in Python, Java, .NET or Scala, backed by a portfolio of impressive projects. Strong knowledge of Gen AI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity. Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem. 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility. Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing. A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines. Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike. Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today! Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative (link to career site page when available), we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316190 Posted On: 2025-06-06 Location: Hyderabad, Telangana, India
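For the large-scale batch processing this role calls for, a PySpark job might look roughly like the sketch below. The S3 paths, columns, and aggregation are illustrative only.

```python
# Hypothetical PySpark batch rollup; input path, schema, and output are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_trade_rollup").getOrCreate()

trades = spark.read.parquet("s3://example-bucket/trades/ds=2025-06-01/")

daily_summary = (
    trades
    .groupBy("ticker")
    .agg(
        F.sum("notional").alias("total_notional"),
        F.count("*").alias("trade_count"),
    )
)

daily_summary.write.mode("overwrite").parquet("s3://example-bucket/rollups/ds=2025-06-01/")
spark.stop()
```

Keeping the output partitioned by date, as the paths suggest, keeps downstream reads incremental rather than full-table scans.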
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Gurgaon
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. Your primary objective is to ensure project goals are achieved and are aligned with business objectives. You will also work closely with your Scrum team and program team to test, develop, refine and implement quality software in production via standard Agile methodologies. You will mentor and help other team members to deliver the scrum team objectives. Responsibilities Build scalable, reliable, cost-effective solutions for both the Cloud and on-premises. Understanding of current technologies and experience with legacy technologies. Understands when the architecture needs to change to meet requirements. Understand system test principles and best practices. Create reusable code and components for audio software development, to be utilized across various projects. Provide cloud integration development support to various project teams. Rapidly identify and resolve technical incidents as they emerge Collaborate effectively with Data Science to understand, translate, and integrate methodologies into engineering build pipelines. Key Skills (Domain Expertise) 4-8 years of related experience with a Bachelor’s degree or equivalent work experience. Must have strong analytical and technical skills in troubleshooting and problem resolution Technical Skills 4-8 years of hands-on software development with a bachelor’s degree. Experience in software development using programming languages & tools/services: .Net Programming (c#, Basic, asp.net), Java, Python, JavaScript and strong in SQL. Experience with orchestration tools: Apache Airflow or similar tools Strong knowledge on Windows, Unix/Linux OS, commands, shell scripting, python Strong experience in Java/J2EE, Spring boot/cloud frameworks Agile scrum experience in application development experience is required. Strong knowledge in SQLServer and/or Oracle Deployment and automation: CI/CD Pipeline Knowledge in Gitlab /Bitbucket . AWS Programming Certification is a plus. Ability to quickly learn vendor owned Computer Aided Telephone Interviewing (CATI). Mindset and Attributes Strong verbal and written communication skills. Strong analytical and technical skills in troubleshooting and problem resolution Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. 
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
Posted 1 week ago
2.0 years
4 - 8 Lacs
Gurgaon
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. About the role:- This role is part of a team that develops software to process data captured every day from over a quarter of a million computer and mobile devices worldwide, measuring panelists’ activity as they surf the Internet via browsers or use mobile apps downloaded from Apple’s and Google’s stores. The Nielsen software meter used to capture this usage data has been optimized to be unobtrusive yet gather many biometric data points that the backend system can use to identify who is using the device and to detect fraudulent behavior. The Software Engineer is ultimately responsible for delivering technical solutions, from project onboarding through development, testing, and post-launch support, and is expected to coordinate, support, and work with multiple distributed project teams across regions. As a Software Engineer with our Digital Meter Processing team, you will further develop the backend system that processes massive amounts of data every day across 3 different AWS regions. Your role will involve implementing and maintaining robust, scalable solutions that leverage a Java-based system running in an AWS environment. You will play a key role in shaping the technical direction of our projects and mentoring other team members. Responsibilities:- System Deployment: Build new features in the existing backend processing pipelines. CI/CD Implementation: Leverage CI/CD pipelines for automated build, test, and deployment processes. Ensure continuous integration and delivery of features, improvements, and bug fixes. Code Quality and Best Practices: Adhere to coding standards, best practices, and design principles. Participate in code reviews and provide constructive feedback to maintain high code quality. Performance Optimization: Identify and address performance bottlenecks in reading, processing, and writing data to the backend data stores. Team Collaboration: Follow best practices. Collaborate with cross-functional teams to ensure a cohesive and unified approach to software development. Security and Compliance: Implement security best practices for all tiers of the system. Ensure compliance with industry standards and regulations related to AWS platform security. Key Skills:- Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related field. Proven experience (minimum 2 years) in Java development and scripting languages such as Python in an AWS cloud environment. Good experience with SQL and a database system such as Postgres (see the batch-insert sketch after this posting). Good understanding of CI/CD principles and tools; GitLab a plus. Good problem-solving and debugging skills. Good communication and collaboration skills, with the ability to communicate complex technical concepts and align the organization on decisions.
Utilizes team collaboration to contribute to innovative solutions efficiently Other desirable skills:- Knowledge of networking principles and security best practices. AWS certifications. Experience with Data Warehouses, ETL, and/or Data Lakes very helpful. Experience with RedShift, Airflow, Python, Lambda, Prometheus, Grafana, & OpsGeni a bonus Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
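As a small illustration of the Python-plus-Postgres experience referenced above, the sketch below performs an efficient batch insert with psycopg2. Connection details, the table, and the rows are invented.

```python
# Hypothetical batch insert into Postgres; connection details and table are invented.
import psycopg2
from psycopg2.extras import execute_values

rows = [
    ("device-001", "2025-06-01T00:05:00Z", 123),
    ("device-002", "2025-06-01T00:06:00Z", 456),
]

conn = psycopg2.connect(host="localhost", dbname="metering", user="app", password="secret")
try:
    with conn, conn.cursor() as cur:
        # execute_values expands the VALUES clause once per row, which is far
        # faster than issuing one INSERT statement per row.
        execute_values(
            cur,
            "INSERT INTO usage_events (device_id, observed_at, bytes) VALUES %s",
            rows,
        )
finally:
    conn.close()
```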
Posted 1 week ago
6.0 years
4 - 6 Lacs
Gurgaon
On-site
Senior ML Engineer Gurgaon, India; Ahmedabad, India; Hyderabad, India; Noida, India Information Technology 315679 Job Description About The Role: Grade Level (for internal use): 10 The Team : As a member of the Data Transformation - Cognitive Engineering team you will work on building and deploying ML powered products and capabilities to power natural language understanding, data extraction, information retrieval and data sourcing solutions for S&P Global Market Intelligence and our clients. You will spearhead deployment of AI products and pipelines while leading-by-example in a highly engaging work environment. You will work in a (truly) global team and encouraged for thoughtful risk-taking and self-initiative. What’s in it for you: Be a part of a global company and build solutions at enterprise scale Lead a highly skilled and technically strong team (including leadership) Contribute to solving high complexity, high impact problems Build production ready pipelines from ideation to deployment Responsibilities: Design, Develop and Deploy ML powered products and pipelines Mentor a team of Senior and Junior data scientists / ML Engineers in delivering large scale projects Play a central role in all stages of the AI product development life cycle, including: Designing Machine Learning systems and model scaling strategies Research & Implement ML and Deep learning algorithms for production Run necessary ML tests and benchmarks for model validation Fine-tune, retrain and scale existing model deployments Extend existing ML library’s and write packages for reproducing components Partner with business leaders, domain experts, and end-users to gain business understanding, data understanding, and collect requirements Interpret results and present them to business leaders Manage production pipelines for enterprise scale projects Perform code reviews & optimization for your projects and team Lead and mentor by example, including project scrums Technical Requirements: Proven track record as a senior / lead ML engineer Expert proficiency in Python (Numpy, Pandas, Spacy, Sklearn, Pytorch/TF2, HuggingFace etc.) Excellent exposure to large scale model deployment strategies and tools Excellent knowledge of ML & Deep Learning domain Solid exposure to Information Retrieval, Web scraping and Data Extraction at scale Exposure to the following technologies - R-Shiny/Dash/Streamlit, SQL, Docker, Airflow, Redis, Celery, Flask/Django/FastAPI, PySpark, Scrapy Experience with SOTA models related to NLP and expertise in text matching techniques, including sentence transformers, word embeddings, and similarity measures Open to learning new technologies and programming languages as required A Master’s / PhD from a recognized institute in a relevant specialization Good to have: 6-7+ years of relevant experience in ML Engineering Prior substantial experience from the Economics/Financial industry Prior work to show on Github, Kaggle, StackOverflow etc. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. 
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 315679 Posted On: 2025-05-20 Location: Gurgaon, Haryana, India
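The text-matching techniques named in the posting above (sentence transformers, embeddings, similarity measures) can be sketched as follows; the model choice and example strings are illustrative, not the team's actual pipeline.

```python
# Minimal text-matching sketch with sentence embeddings; examples are invented.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose model

queries = ["Intl. Business Machines Corp"]
candidates = ["International Business Machines Corporation", "IBM Credit LLC", "Intel Corporation"]

q_emb = model.encode(queries, convert_to_tensor=True)
c_emb = model.encode(candidates, convert_to_tensor=True)

scores = util.cos_sim(q_emb, c_emb)  # cosine similarity matrix, shape (1, 3)
best = scores.argmax(dim=1).item()
print(candidates[best], float(scores[0, best]))
```

Because util.cos_sim returns a query-by-candidate similarity matrix, the same pattern scales to batch matching of many names at once.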
Posted 1 week ago
8.0 years
4 - 8 Lacs
Gurgaon
On-site
Job ID: 1257 Location: Hybrid, Gurgaon, Haryana, IN Job Family: Research and Development Job Type: Permanent Employment Type: Full Time About Us Innovation. Sustainability. Productivity. This is how we are Breaking New Ground in our mission to sustainably advance the noble work of farmers and builders everywhere. With a growing global population and increased demands on resources, our products are instrumental to feeding and sheltering the world. From developing products that run on alternative power to productivity-enhancing precision tech, we are delivering solutions that benefit people – and they are possible thanks to people like you. If the opportunity to build your skills as part of a collaborative, global team excites you, you’re in the right place. Grow a Career. Build a Future! Be part of a company at the forefront of agriculture and construction that passionately innovates to drive customer efficiency and success. And we know innovation can’t happen without collaboration. So, everything we do at CNH Industrial is about reaching new heights as one team, always delivering for the good of our customers. Job Purpose The CFD Analysis Engineer will be responsible for providing fluid/thermal analysis for agricultural machines (tractors, combines, harvesters, sprayers) and construction machines (excavators, wheel loaders, loader backhoes). As a member of the CFD Team, you will support the design of components and subsystems such as: A/C & HVAC systems Engine cooling packages Hydraulics Transmissions Engine air intakes & exhausts Key Responsibilities Develops virtual simulation models using CFD (Computational Fluid Dynamics) for the evaluation of engineering designs of agricultural and construction machinery. Makes recommendations to peers and the direct manager based on sound engineering principles, practices, and judgment pertaining to thermal/fluid problems as a contribution to the overall engineering and manufacturing objectives. Utilizes Star CCM+, Ensight, ANSYS Fluent, GT-Power, Actran, Creo, TeamCenter, and other relevant software to develop and simulate designs for cooling packages, exhaust systems, engine air intakes, HVAC systems, transmissions, and other relevant components being developed and/or improved. Performs engineering calculations for emissions, chemical reactions, sprays, thermal, airflow, hydraulic, aero-acoustic, particle flows, and refrigeration problems to determine the size and performance of assemblies and parts and to solve design problems (a simple sizing example follows this posting). Incorporates engineering standards, methodologies, and global product development processes into daily work tasks. Experience Required MS Degree in Engineering or a comparable program, with 8 years of professional industry experience. Good knowledge of the Computational Fluid Dynamics field. Some knowledge in the areas of underhood engine cooling, two-phase flows, and climatization. Knowledge of exhaust aftertreatment analysis; familiarity with SCR (Selective Catalytic Reduction), DPF (Diesel Particulate Filters), or DOC (Diesel Oxidation Catalysts). Some basic knowledge and understanding of aero-acoustics and fan noise. Preferred Qualifications Master’s degree in mechanical engineering from a reputed institute Doctoral degree (Ph.D.) is a plus What We Offer We offer dynamic career opportunities across an international landscape. As an equal opportunity employer, we are committed to delivering value for all our employees and fostering a culture of respect. 
At CNH, we understand that the best solutions come from the diverse experiences and skills of our people. Here, you will be empowered to grow your career, to follow your passion, and help build a better future. To support our employees, we offer regional comprehensive benefits, including: Flexible work arrangements Savings & Retirement benefits Tuition reimbursement Parental leave Adoption assistance Fertility & Family building support Employee Assistance Programs Charitable contribution matching and Volunteer Time Off
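Before detailed CFD, sizing often starts from first principles. The sketch below estimates the cooling airflow needed to reject a given heat load using Q = m_dot * cp * dT; all numbers are illustrative assumptions, not figures from the posting.

```python
# Back-of-the-envelope airflow sizing of the kind that precedes a detailed CFD model.
# All values below are illustrative assumptions.
Q_watts = 45_000.0   # heat load to reject from the cooling package [W]
cp_air = 1005.0      # specific heat of air at roughly constant pressure [J/(kg*K)]
delta_T = 25.0       # allowable air temperature rise across the core [K]
rho_air = 1.1        # air density at underhood conditions [kg/m^3]

m_dot = Q_watts / (cp_air * delta_T)  # required air mass flow [kg/s]
vol_flow = m_dot / rho_air            # required volumetric flow [m^3/s]

print(f"required mass flow: {m_dot:.2f} kg/s")
print(f"required volumetric flow: {vol_flow:.2f} m^3/s ({vol_flow * 2118.88:.0f} CFM)")
```

A hand calculation like this gives a first estimate of fan and core requirements before a full underhood simulation refines the answer.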
Posted 1 week ago
40.0 years
0 Lacs
Greater Kolkata Area
Remote
Who We Are Escalent is an award-winning data analytics and advisory firm that helps clients understand human and market behaviors to navigate disruption. As catalysts of progress for more than 40 years, our strategies guide the world’s leading brands. We accelerate growth by creating a seamless flow between primary, secondary, syndicated, and internal business data, providing consulting and advisory services from insights through implementation. Based on a profound understanding of what drives human beings and markets, we identify actions that build brands, enhance customer experiences, inspire product innovation and boost business productivity. We listen, learn, question, discover, innovate, and deliver—for each other and our clients—to make the world work better for people. Why Escalent? Once you join our team you will have the opportunity to... Access experts across industries for maximum learning opportunities including Weekly Knowledge Sharing Sessions, LinkedIn Learning, and more. Gain exposure to a rich variety of research techniques from knowledgeable professionals. Enjoy a remote first/hybrid work environment with a flexible schedule. Obtain insights into the needs and challenges of your clients—to learn how the world’s leading brands use research. Experience peace of mind working for a company with a commitment to conducting research ethically. Build lasting relationships with fun colleagues in a culture that values each person. Role Overview: We are looking for a Data Engineer to design, build, and optimize scalable data pipelines and infrastructure that power analytics, machine learning, and business intelligence. You will work closely with data scientists, analysts, and software engineers to ensure efficient data ingestion, transformation, and management. 
Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to extract, transform, and load data from diverse sources (see the sketch after this posting).
Build and optimize data storage solutions using SQL and NoSQL databases, data lakes, and cloud warehouses (Snowflake, BigQuery, Redshift).
Ensure data quality, integrity, and security through automated validation, governance, and monitoring frameworks.
Collaborate with data scientists and analysts to provide clean, structured, and accessible data for reporting and AI/ML models.
Implement best practices for performance tuning, indexing, and query optimization to handle large-scale datasets.
Write clean, structured code as defined in the team's coding standards and create documentation for best practices.
Stay updated with emerging data engineering technologies, frameworks, architectures, and industry best practices.
Required Skills:
Strong proficiency in Python, SQL, and data processing frameworks (Pandas, Spark, Hadoop).
Experience with cloud-based data platforms (AWS, Azure, GCP) and services like S3, Glue, Athena, Data Factory, or BigQuery.
Solid understanding of database design, data modeling, and warehouse architectures.
Hands-on experience with ETL/ELT pipelines and workflow orchestration tools (Apache Airflow, Prefect, Luigi).
Knowledge of APIs, RESTful services, and integrating multiple data sources.
Strong problem-solving and debugging skills for handling large-scale data processing challenges.
Experience with version control systems (Git, GitHub, GitLab).
Ability to work in a team setting.
Organizational and time management skills.
Desirable Skills:
Experience working with Agile development methodologies.
Experience in building self-service data platforms for business users and analysts.
Effective written and verbal communication skills.
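The ETL/ELT responsibilities above come down to an extract-transform-load loop. As a purely illustrative sketch (the file name, column names, and SQLite target are hypothetical and not part of the posting), a minimal Python pipeline using Pandas might look like this:

import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from a CSV source (hypothetical file).
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: basic cleaning and a derived revenue column.
    df = df.dropna(subset=["order_id"]).copy()
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["revenue"] = df["quantity"] * df["unit_price"]
    return df


def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    # Load: write the cleaned records to a warehouse-like table.
    df.to_sql("orders_clean", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    # SQLite stands in for a cloud warehouse purely for demonstration.
    with sqlite3.connect("warehouse.db") as conn:
        load(transform(extract("orders.csv")), conn)

In practice the same shape scales up by swapping the CSV source for an API or S3 object and the SQLite target for Snowflake, BigQuery, or Redshift.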
Posted 1 week ago
5.0 years
5 - 7 Lacs
Bengaluru
On-site
Bengaluru, Karnataka
Job ID: 30181211
Job Category: Digital Technology
Country: India
Location: Ecospace Campus 3A, 4th Floor, Outer Ring Road, Bellandur, Bengaluru - 560103
Job Title: Lead Engineer
Preferred Location: Bangalore
Full Time/Part Time: Full Time
Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.
Role Description:
Established Enterprise Business Systems professional, responsible for coordinating and/or performing work associated with digital business analysis.
Responsibilities:
Design and implement robust, reusable, and scalable data pipelines and transformations using AWS Glue, Kinesis, and orchestration tools like Airflow and Nexla.
Contribute to the evolution of our transactional data lake architecture based on Apache Iceberg and Amazon S3, enabling schema evolution, ACID transactions, and time travel (an illustrative sketch follows this posting).
Partner closely with data product teams, architects, and platform engineers to deliver curated, governed datasets that accelerate insights and machine learning outcomes.
Champion DataOps and DevOps principles by implementing CI/CD workflows, observability, and automated testing for pipelines.
Act as a subject matter expert for data modeling, schema design, and data quality assurance across diverse business domains.
Mentor junior engineers and contribute to the development of internal standards, frameworks, and best practices for data engineering.
Basic Qualifications & Experience:
Requires theoretical to advanced knowledge obtained through a university degree, combined with experience.
Practical knowledge of the Carrier organization, programs, or systems, with the ability to make enhancements and leverage them in daily work.
University degree or equivalent.
A minimum of 5 years of prior relevant experience, or an advanced degree in a related field and a minimum of 3 years of experience.
Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance.
Make yourself a priority with flexible schedules and leave policy.
Drive forward your career through professional development opportunities.
Achieve your personal goals with our Employee Assistance Program.
Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.
Join us and make a difference. Now!
Carrier is an Equal Opportunity/Affirmative Action Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
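The Carrier posting above centers on a transactional data lake built on Apache Iceberg and Amazon S3. As a purely illustrative sketch (the catalog name, warehouse bucket, and table name are hypothetical, not taken from the posting, and the Iceberg Spark runtime and S3 credentials are assumed to be configured on the cluster), writing an Iceberg table from PySpark might look like this:

# Illustrative only: the "lake" catalog, the S3 bucket, and the table are hypothetical.
# Assumes the Iceberg Spark runtime JAR is on the classpath and AWS access is configured.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-sketch")
    # Register an Iceberg catalog backed by an S3 warehouse path.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

df = spark.createDataFrame(
    [(1, "thermostat", 129.99), (2, "chiller", 5400.00)],
    ["product_id", "name", "list_price"],
)

# Iceberg tables provide ACID writes, schema evolution, and snapshot-based time travel;
# createOrReplace() uses Spark's DataFrameWriterV2 API.
df.writeTo("lake.sales.products").using("iceberg").createOrReplace()

# Read the table back; a time-travel read would add an option such as "snapshot-id".
spark.table("lake.sales.products").show()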
Posted 1 week ago
The Airflow job market in India is growing rapidly as more companies adopt data pipelines and workflow automation. Apache Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with Airflow expertise can find lucrative opportunities in industries such as technology, e-commerce, finance, and more.
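To make the orchestration idea concrete, here is a minimal, illustrative Airflow DAG; the DAG id, schedule, and task logic are hypothetical, and it assumes Airflow 2.4+ (older versions use schedule_interval instead of schedule). It uses the TaskFlow API to chain an extract, transform, and load step:

# Minimal illustrative DAG; dag_id, schedule, and task bodies are placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="daily_sales_etl",
    schedule="@daily",            # Airflow 2.4+; use schedule_interval on older versions
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def daily_sales_etl():
    @task
    def extract():
        # In a real pipeline this would pull from an API, S3, or a database.
        return [{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 99.5}]

    @task
    def transform(rows):
        # Aggregate the extracted rows into a single daily total.
        return sum(r["amount"] for r in rows)

    @task
    def load(total):
        # A real load step would write to a warehouse; here we just log the value.
        print(f"Daily revenue total: {total}")

    load(transform(extract()))


daily_sales_etl()

Airflow renders this as a three-task dependency graph and handles scheduling, retries, and passing results between tasks.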
The average salary range for Airflow professionals in India varies by experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of Airflow, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
As you explore Airflow job opportunities in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest developments in Airflow, and demonstrate your problem-solving abilities to stand out in a competitive job market. Good luck!