4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Requirements
• 4+ years of experience as a Data Engineer.
• Strong proficiency in SQL.
• Hands-on experience with modern cloud data warehousing solutions (Snowflake, BigQuery, Redshift).
• Expertise in ETL/ELT processes and batch and streaming data processing.
• Proven ability to troubleshoot data issues and propose effective solutions.
• Knowledge of AWS services (S3, DMS, Glue, Athena).
• Familiarity with dbt for data transformation and modeling.
• Fluency in English communication.

Desired Experience
• Experience with additional AWS services (EC2, ECS, EKS, VPC, IAM).
• Knowledge of Infrastructure as Code (IaC) tools like Terraform and Terragrunt.
• Proficiency in Python for data engineering tasks.
• Experience with orchestration tools like Dagster, Airflow, or AWS Step Functions.
• Familiarity with pub/sub, queuing, and streaming frameworks (AWS Kinesis, Kafka, SQS, SNS).
• Experience with CI/CD pipelines and automation for data processes.

Skills: Snowflake, BigQuery, Redshift, SQL, Python, dbt, ETL/ELT, Airflow, Dagster, AWS Step Functions, AWS (S3, DMS, Glue, Athena, Kinesis, SQS, SNS), Kafka, Terraform, Terragrunt, CI/CD
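As an illustration of the orchestration skills this listing names (Airflow plus AWS services such as S3 and Athena), here is a minimal sketch of an Airflow DAG that submits an Athena query via boto3. The DAG id, schedule, database, bucket, and query are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of an Airflow DAG for a daily Athena batch step.
# DAG id, database, query, and S3 output location are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
import boto3

def run_athena_query():
    """Submit a (hypothetical) aggregation query to Athena via boto3."""
    client = boto3.client("athena")
    client.start_query_execution(
        QueryString="SELECT order_date, SUM(amount) FROM orders GROUP BY order_date",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="run_athena_query", python_callable=run_athena_query)
```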
Posted 2 days ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Hiring for a US-based multinational company (MNC).

We are seeking a skilled and detail-oriented Data Engineer to join our team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. You will work closely with data scientists, analysts, and software engineers to ensure that high-quality data is readily available and usable.

Responsibilities
• Design and implement scalable, reliable, and efficient data pipelines for processing and transforming large volumes of structured and unstructured data.
• Build and maintain data architectures, including databases, data warehouses, and data lakes.
• Collaborate with data analysts and scientists to support their data needs and ensure data integrity and consistency.
• Optimize data systems for performance, cost, and scalability.
• Implement data quality checks, validation, and monitoring processes.
• Develop ETL/ELT workflows using modern tools and platforms.
• Ensure data security and compliance with relevant data protection regulations.
• Monitor and troubleshoot production data systems and pipelines.

Requirements
• Proven experience as a Data Engineer or in a similar role
• Strong proficiency in SQL and at least one programming language such as Python, Scala, or Java
• Experience with data pipeline tools such as Apache Airflow, Luigi, or similar
• Familiarity with modern data platforms and tools: big data (Hadoop, Spark); data warehousing (Snowflake, Redshift, BigQuery, Azure Synapse); databases (PostgreSQL, MySQL, MongoDB)
• Experience with cloud platforms (AWS, Azure, or GCP)
• Knowledge of data modeling, schema design, and ETL best practices
• Strong analytical and problem-solving skills
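As a concrete taste of the pipeline work described above, here is a minimal PySpark sketch of a batch transformation; the input path, columns, and output location are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark sketch: read raw events, clean them, write partitioned Parquet.
# File paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

raw = spark.read.json("s3://example-raw/events/")  # semi-structured input

cleaned = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partition by date so downstream analytics scans stay cheap.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated/events/"
)
```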
Posted 2 days ago
4.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Requirements
• 4+ years of experience as a Data Engineer.
• Strong proficiency in SQL.
• Hands-on experience with modern cloud data warehousing solutions (Snowflake, BigQuery, Redshift).
• Expertise in ETL/ELT processes and batch and streaming data processing.
• Proven ability to troubleshoot data issues and propose effective solutions.
• Knowledge of AWS services (S3, DMS, Glue, Athena).
• Familiarity with dbt for data transformation and modeling.
• Fluency in English communication.

Desired Experience
• Experience with additional AWS services (EC2, ECS, EKS, VPC, IAM).
• Knowledge of Infrastructure as Code (IaC) tools like Terraform and Terragrunt.
• Proficiency in Python for data engineering tasks.
• Experience with orchestration tools like Dagster, Airflow, or AWS Step Functions.
• Familiarity with pub/sub, queuing, and streaming frameworks (AWS Kinesis, Kafka, SQS, SNS).
• Experience with CI/CD pipelines and automation for data processes.

Skills: Snowflake, BigQuery, Redshift, SQL, Python, dbt, ETL/ELT, Airflow, Dagster, AWS Step Functions, AWS (S3, DMS, Glue, Athena, Kinesis, SQS, SNS), Kafka, Terraform, Terragrunt, CI/CD
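Since this listing names AWS Glue alongside Athena and S3, here is a minimal sketch of what a Glue job script can look like (Glue jobs run PySpark under the hood). The catalog database, table name, and S3 path are hypothetical placeholders.

```python
# Minimal sketch of an AWS Glue job script (PySpark under the hood).
# Database/table names and the S3 target are illustrative placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (e.g., by a crawler).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Write curated output as Parquet for Athena/Redshift Spectrum to query.
glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/orders/"},
    format="parquet",
)
job.commit()
```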
Posted 2 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad
Budget: 3.5x
Notice: Immediate joiners

Requirements:
• BS degree in computer science, computer engineering, or equivalent
• 5-9 years of experience delivering enterprise software solutions
• Familiarity with Spark, Scala, Python, and AWS Cloud technologies
• 2+ years of experience across multiple Hadoop/Spark technologies such as Hadoop, MapReduce, HDFS, HBase, Hive, Flume, Sqoop, Kafka, and Scala
• Flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle
• Experience with Agile development methodologies
• Experience with data ingestion and transformation
• Understanding of secure application development methodologies
• Experience with Airflow and Python preferred
• Understanding of automated QA needs related to big data technology
• Strong object-oriented design and analysis skills
• Excellent written and verbal communication skills

Responsibilities:
• Utilize your software engineering skills, including Spark, Python, and Scala, to analyze disparate, complex systems and collaboratively design new products and services
• Integrate new data sources and tools
• Implement scalable and reliable distributed data replication strategies
• Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases
• Perform analysis of large data sets using components from the Hadoop ecosystem
• Own product features from development and testing through to production deployment
• Evaluate big data technologies and prototype solutions to improve our data processing architecture
• Automate different pipelines
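To illustrate the Spark-plus-Kafka streaming experience this role asks for, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic; the broker address, topic, and HDFS paths are hypothetical placeholders.

```python
# Minimal sketch: Spark Structured Streaming job consuming a Kafka topic.
# Broker address, topic, and checkpoint path are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "clickstream")
         .load()
)

# Kafka delivers bytes; cast the value and keep the event timestamp.
events = stream.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("ingested_at"),
)

query = (
    events.writeStream.format("parquet")
          .option("path", "hdfs:///data/clickstream/")
          .option("checkpointLocation", "hdfs:///checkpoints/clickstream/")
          .start()
)
query.awaitTermination()
```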
Posted 2 days ago
7.0 - 20.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS!!

TCS is hiring for Data Solution Architect
Interview mode: Virtual
Required experience: 7-20 years
Work location: PAN India

Must have:
• Design and manage ETL pipelines on cloud platforms (preference for AWS)
• Utilize tools like Airflow to orchestrate tasks and CloudWatch to manage notifications
• Collaborate with cross-functional teams to enhance data-driven decision making, ensuring alignment of machine learning projects with our strategic business goals
• Develop containerized applications to improve data accuracy, accessibility, and reliability through customized Python solutions
• Contribute to the data governance workflow, improving data quality, standardizing definitions, and training and onboarding data stewards to own their teams' KPIs
• Pilot next-generation technologies that solve problems traditional programming struggles with, using Gen AI or other machine learning techniques

Key skills/knowledge:
• Excellent hands-on knowledge of Python
• Proven experience in data engineering, with a strong focus on machine learning operations
• Proficiency in developing ETL pipelines and architecting big data solutions
• Expertise in one cloud platform and a proven record of working on at least one project end to end
• Strong collaboration skills, with the ability to work effectively across diverse teams
• A passion for innovation, driven by a desire to push the boundaries of what's possible with technology in an ambiguous environment

If interested, kindly send your updated CV and the details below via DM/e-mail: srishti.g2@tcs.com
• Name:
• E-mail ID:
• Contact number:
• Highest qualification (full-time):
• Preferred location:
• Highest qualification university:
• Current organization:
• Total years of experience:
• Relevant years of experience:
• Any gap (career/education): mention no. of months/years:
• If any, reason for gap:
• Is it rebegin:
• Previous organization name:
• Current CTC:
• Expected CTC:
• Notice period:
• Have you worked with TCS before (permanent/contract):
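As an illustration of pairing Airflow orchestration with CloudWatch notifications, here is a minimal sketch of an Airflow failure callback that publishes a custom CloudWatch metric via boto3 (an alarm can then notify on it); the namespace and metric name are invented placeholders.

```python
# Minimal sketch: emit a custom CloudWatch metric from an Airflow failure
# callback, so an alarm/notification can be wired to pipeline failures.
# Namespace, metric name, and dimension are illustrative placeholders.
import boto3

def report_failure(context):
    """Airflow on_failure_callback: publish a failure datapoint to CloudWatch."""
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",
        MetricData=[{
            "MetricName": "TaskFailures",
            "Dimensions": [{"Name": "dag_id", "Value": context["dag"].dag_id}],
            "Value": 1.0,
            "Unit": "Count",
        }],
    )

# Attach per-task, e.g.: PythonOperator(..., on_failure_callback=report_failure)
```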
Posted 2 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job title: Data Engineer
Key skills: Python, ETL, Snowflake, Apache Airflow
Job locations: PAN India
Experience: 6-7 years
Education qualification: Any graduation
Work mode: Hybrid
Employment type: Contract
Notice period: Immediate

Job description:
• 6 to 10 years of experience in data engineering roles with a focus on building scalable data solutions
• Proficiency in Python for ETL, data manipulation, and scripting
• Hands-on experience with Snowflake or equivalent cloud-based data warehouses
• Strong knowledge of orchestration tools such as Apache Airflow or similar
• Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar
• Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data
• Experience in data modeling, data warehousing, and database design
• Proficiency in working with cloud platforms like AWS, Azure, or GCP
• Strong understanding of CI/CD pipelines for data engineering workflows
• Experience working in an Agile development environment, collaborating with cross-functional teams

Preferred skills:
• Familiarity with other programming languages like Scala or Java for data engineering tasks
• Knowledge of containerization and orchestration technologies (Docker, Kubernetes)
• Experience with stream processing frameworks like Apache Flink
• Experience with Apache Iceberg for data lake optimization and management
• Exposure to machine learning workflows and integration with data pipelines

Soft skills:
• Strong problem-solving skills with a passion for solving complex data challenges
• Excellent communication and collaboration skills to work with cross-functional teams
• Ability to thrive in a fast-paced, innovative environment
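To make the Python-plus-Snowflake combination concrete, here is a minimal sketch of a load step using the snowflake-connector-python package; the account, credentials, stage, and table are placeholder values, not the client's environment.

```python
# Minimal sketch: load staged files into Snowflake from a Python ETL step.
# Account, credentials, stage, and table names are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # COPY INTO ingests files already uploaded to a named stage.
    cur.execute("""
        COPY INTO orders
        FROM @etl_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()
```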
Posted 2 days ago
4.0 years
0 Lacs
India
Remote
Job title: DevOps Engineer
Experience: 4+ years
Job type: Remote
Shift timings: 4 PM – 12 AM

Skills required:
• Experience in infrastructure, DevOps, or SRE roles with increasing responsibility
• Experience with Kubernetes, Terraform, containerization, and at least one major cloud provider (AWS preferred)
• Strong knowledge of system design, networking, and reliability principles
• Experience with observability tools (e.g., Prometheus, Grafana, Datadog) and incident response practices
• Web application building with databases, APIs, Kubernetes, and authentication

Nice to have:
• Experience supporting data pipelines, ML workloads, or complex orchestration systems
• Familiarity with the data/ML tooling ecosystem (e.g., Airflow, dbt, Spark, Dremio, etc.)
• Previous experience in a startup or high-growth environment
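As a small example of the observability work mentioned above, here is a minimal sketch that exposes custom metrics for Prometheus to scrape, using the prometheus_client library; the metric names and port are arbitrary placeholders.

```python
# Minimal sketch: expose service metrics for Prometheus to scrape.
# Metric names and the port are illustrative placeholders.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency")

def handle_request():
    with LATENCY.time():            # records duration into the histogram
        time.sleep(random.random() / 10)
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)         # metrics served at :8000/metrics
    while True:
        handle_request()
```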
Posted 2 days ago
0 years
0 Lacs
India
Remote
Years of experience: 8+
Mode of work: Remote

Design, develop, modify, and test software applications for the healthcare industry in an agile environment. Duties include:
• Develop, support/maintain, and deploy software to support a variety of business needs
• Provide technical leadership in the design, development, testing, deployment, and maintenance of software solutions
• Design and implement platform and application security for applications
• Perform advanced query analysis and performance troubleshooting
• Coordinate with senior-level stakeholders to ensure the development of innovative software solutions to complex technical and creative issues
• Re-design software applications to improve maintenance cost, testing functionality, platform independence, and performance
• Manage user stories and project commitments in an agile framework to rapidly deliver value to customers
• Deploy and operate software solutions using a DevOps model

Required skills: Azure Delta Lake, ADF, Databricks, PySpark, Oozie, Airflow, Big Data technologies (HBase, Hive), CI/CD (GitHub/Jenkins)
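To illustrate the Databricks/PySpark work this posting lists, here is a minimal sketch of a Delta Lake upsert; it assumes the delta-spark package (preconfigured on Databricks), an existing target table, and the storage paths and join key are hypothetical placeholders.

```python
# Minimal sketch: upsert (MERGE) into a Delta Lake table with PySpark.
# Requires delta-spark (preconfigured on Databricks); paths and columns
# are illustrative placeholders, and the target table is assumed to exist.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims_upsert").getOrCreate()

updates = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/claims/")

target = DeltaTable.forPath(
    spark, "abfss://curated@example.dfs.core.windows.net/claims/"
)

# MERGE keeps the curated table idempotent under pipeline re-runs.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.claim_id = u.claim_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```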
Posted 2 days ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Aramya, we’re redefining fashion for India’s underserved Gen X/Y women, offering size-inclusive, comfortable, and stylish ethnic wear at affordable prices. Launched in 2024, we’ve already achieved ₹40 Cr in revenue in our first year, driven by a unique blend of data-driven design, in-house manufacturing, and a proprietary supply chain. Today, with an ARR of ₹100 Cr, we’re scaling rapidly with ambitious growth plans for the future. Our vision is bold: to build the most loved fashion and lifestyle brands across the world while empowering individuals to express themselves effortlessly. Backed by marquee investors like Accel and Z47, we’re on a mission to make high-quality ethnic wear accessible to every woman. We’ve built a community of loyal customers who love our weekly design launches, impeccable quality, and value-for-money offerings. With a fast-moving team driven by creativity, technology, and customer obsession, Aramya is more than a fashion brand—it’s a movement to celebrate every woman’s unique journey.

We’re looking for a passionate Data Engineer with a strong foundation. The ideal candidate should have a solid understanding of D2C or e-commerce platforms and be able to work across the stack to build high-performing, user-centric digital experiences.

Key Responsibilities
• Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Databricks, and Spark.
• Own and manage data lakes/warehouses on AWS Redshift (or Snowflake/BigQuery).
• Optimize SQL queries and data models for analytics, performance, and reliability.
• Develop and maintain backend APIs using Python (FastAPI/Django/Flask) or Node.js.
• Integrate external data sources (APIs, SFTP, third-party connectors) and ensure data quality and validation.
• Implement monitoring, logging, and alerting for data pipeline health.
• Collaborate with stakeholders to gather requirements and define data contracts.
• Maintain infrastructure-as-code (Terraform/CDK) for data workflows and services.

Must-Have Skills
• Strong in SQL and data modeling (OLTP and OLAP).
• Solid programming experience in Python, preferably for both ETL and backend.
• Hands-on experience with Databricks, Redshift, or Spark.
• Experience building and managing ETL pipelines using tools like Airflow, dbt, or similar.
• Deep understanding of REST APIs, microservices architecture, and backend design patterns.
• Familiarity with Docker, Git, and CI/CD pipelines.
• Good grasp of cloud platforms (preferably AWS) and services like S3, Lambda, ECS/Fargate, and CloudWatch.

Nice-to-Have Skills
• Exposure to streaming platforms like Kafka, Kinesis, or Flink.
• Experience with Snowflake, BigQuery, or Delta Lake.
• Proficiency in data governance, security best practices, and PII handling.
• Familiarity with GraphQL, gRPC, or event-driven systems.
• Knowledge of data observability tools (Monte Carlo, Great Expectations, Datafold).
• Experience working in a D2C/eCommerce or analytics-heavy product environment.
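To make the backend API responsibility concrete, here is a minimal FastAPI sketch of the kind of service this posting describes; the endpoint, model fields, and in-memory store are hypothetical placeholders rather than Aramya's actual API.

```python
# Minimal sketch of a FastAPI service exposing pipeline-health data.
# The endpoint, model fields, and in-memory store are illustrative placeholders.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class PipelineStatus(BaseModel):
    name: str
    last_run_ok: bool
    rows_loaded: int

# Stand-in for a real metadata store (e.g., a table the pipelines update).
STATUS = {
    "orders_elt": PipelineStatus(name="orders_elt", last_run_ok=True, rows_loaded=120_000)
}

@app.get("/pipelines/{name}", response_model=PipelineStatus)
def get_pipeline(name: str) -> PipelineStatus:
    if name not in STATUS:
        raise HTTPException(status_code=404, detail="unknown pipeline")
    return STATUS[name]
```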
Posted 2 days ago
10.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position: Cloudera Data Engineer
Location: Chennai
Notice period: 0-30 days / immediate joiners
Experience: 10 to 15 years

The Cloudera Data Engineer will focus on designing, building, and maintaining scalable data pipelines and platforms within the Cloudera Hadoop ecosystem. Key skills include expertise in data warehousing and ETL processes, strong programming ability in languages like Python and SQL, and proficiency in Cloudera tools, including Spark, Hive, and potentially Airflow for orchestration.

Thank you
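As a small illustration of the Cloudera-stack skills named here (Spark with Hive), the sketch below reads a Hive-managed table from PySpark; the database and table names are hypothetical.

```python
# Minimal sketch: query a Hive-managed table from Spark on a Cloudera-style
# cluster. Database/table names are illustrative placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("warehouse_rollup")
    .enableHiveSupport()          # read tables registered in the Hive metastore
    .getOrCreate()
)

daily = spark.sql("""
    SELECT txn_date, COUNT(*) AS txns, SUM(amount) AS revenue
    FROM sales.transactions
    GROUP BY txn_date
""")
daily.write.mode("overwrite").saveAsTable("sales.daily_rollup")
```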
Posted 2 days ago
4.0 years
0 Lacs
India
On-site
Job title: Azure Databricks Engineer
Experience: 4+ years

Required skills:
• 4+ years of experience in data engineering.
• Strong hands-on experience with Azure Databricks and PySpark.
• Good understanding of Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and Azure Synapse.
• Strong SQL skills and experience with large-scale data processing.
• Experience with version control systems (Git), CI/CD pipelines, and Agile methodology.
• Knowledge of Delta Lake, Lakehouse architecture, and distributed computing concepts.

Preferred skills:
• Experience with Airflow, Power BI, or machine learning pipelines.
• Familiarity with DevOps tools for automation and deployment in Azure.
• Azure certifications (e.g., DP-203) are a plus.
Posted 2 days ago
0.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India | Business Intelligence

BOLD is seeking a QA professional who will work directly with the BI development team to validate Business Intelligence solutions. They will build test strategies, test plans, and test cases for ETL and Business Intelligence components, validate SQL queries related to test cases, and produce test summary reports.

Job Description

ABOUT THIS TEAM
The BOLD Business Intelligence (BI) team is a centralized team responsible for managing all aspects of the organization's BI strategy, projects, and systems. The BI team enables business leaders to make data-driven decisions by providing reports and analysis. It is responsible for developing and managing a latency-free, credible enterprise data warehouse, which serves as a data source for decision making and an input to various functions of the organization such as Product, Finance, Marketing, and Customer Support. The BI team has four sub-components: data analysis, ETL, data visualization, and QA. It manages deliveries through Snowflake, Sisense, and MicroStrategy as its main infrastructure solutions. Other technologies, including Python, R, and Airflow, are also used in ETL, QA, and data visualization.

WHAT YOU'LL DO
• Work with business analysts and BI developers to translate business requirements into test cases
• Validate the data sources, extraction of data, application of transformation logic, and loading of the data into the target tables
• Design, document, and execute test plans, test harnesses, test scenarios/scripts, and test cases for manual and automated testing, using bug tracking tools

WHAT YOU'LL NEED
• Experience in data warehousing/BI testing, using any ETL and reporting tool
• Extensive experience in writing and troubleshooting SQL queries using any of the databases: Snowflake, Redshift, SQL Server, or Oracle
• Exposure to data warehousing and dimensional modelling concepts
• Experience in understanding ETL source-to-target mapping documents
• Experience in testing the code on any of the ETL tools
• Experience validating dashboards/reports on any of the reporting tools: Sisense, Tableau, SAP BusinessObjects, or MicroStrategy
• Hands-on experience and strong understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC)
• Good experience with quality assurance methodologies like Waterfall, V-Model, Agile, and Scrum
• Well versed in writing detailed test cases for functional and non-functional requirements
• Experience with different types of testing, including black box testing, smoke testing, functional testing, system integration testing, end-to-end testing, regression testing, and user acceptance testing (UAT), plus involvement in load, performance, and stress testing
• Expertise in using TFS/JIRA/Excel for writing test cases and tracking them; exposure to scripting languages like Python to create automated test scripts, or automation tools like QuerySurge, will be an added advantage
• An effective communicator with strong analytical abilities combined with skills to plan, implement, and present projects

EXPERIENCE: Senior QA Engineer, BI: 4.5+ years

Benefits

Outstanding compensation
• Competitive salary
• Tax-friendly compensation structure
• Bi-annual bonus
• Annual appraisal
• Equity in company

100% full health benefits
• Group mediclaim, personal accident, and term life insurance
• Group mediclaim benefit (including parents' coverage)
• Practo Plus health membership for employees and family
• Personal accident and term life insurance coverage

Flexible time away
• 24 days paid leave
• Declared fixed holidays
• Paternity and maternity leave
• Compassionate and marriage leave
• Covid leave (up to 7 days)

Additional benefits
• Internet and home office reimbursement
• In-office catered lunch, meals, and snacks
• Certification policy
• Cab pick-up and drop-off facility

About BOLD
We transform work lives. As an established global organization, BOLD helps people find jobs. Our story is one of growth, success, and professional fulfillment. We create digital products that have empowered millions of people in 180 countries to build stronger resumes, cover letters, and CVs. The result of our work helps people interview confidently, finding the right job in less time. Our employees are experts, learners, contributors, and creatives.

We celebrate and promote diversity and inclusion. We value our position as an Equal Opportunity Employer. We hire based on qualifications, merit, and our business needs. We don't discriminate regarding race, color, religion, gender, pregnancy, national origin or citizenship, ancestry, age, physical or mental disability, veteran status, sexual orientation, gender identity or expression, marital status, genetic information, or any other applicable characteristic protected by law.
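As a flavor of the ETL source-to-target validation this role performs, here is a minimal pytest sketch; it uses sqlite3 so the example is self-contained, whereas real tests would connect to Snowflake, Redshift, SQL Server, or Oracle, and the table names are hypothetical.

```python
# Minimal pytest sketch of ETL validation: compare source and target row
# counts and check key integrity. Uses sqlite3 so the example is
# self-contained; real tests would connect to Snowflake/Redshift instead.
import sqlite3

import pytest

@pytest.fixture
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (id INTEGER, amount REAL, amount_usd REAL);
        INSERT INTO src_orders VALUES (1, 100.0), (2, 250.0);
        INSERT INTO tgt_orders VALUES (1, 100.0, 1.2), (2, 250.0, 3.0);
    """)
    yield conn
    conn.close()

def test_row_counts_match(db):
    src = db.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt, "ETL dropped or duplicated rows"

def test_no_null_keys(db):
    nulls = db.execute(
        "SELECT COUNT(*) FROM tgt_orders WHERE id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```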
Posted 2 days ago
5.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain, and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
• Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
• Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments.
• Develop regression suites, develop automation scenarios, and move automation to an agile continuous-testing model.
• Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
• Bachelor's degree in a STEM major or equivalent experience
• 5-7 years of software testing experience
• Able to create and review test automation according to specifications
• Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
• Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others, with respect to software validation
• Created test strategies and plans
• Led complex testing efforts or projects
• Participated in sprint planning as the test lead
• Collaborated with product owners, SREs, and technical architects to define testing strategies and plans
• Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
• Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
• Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
• Attention to Detail - Define test case candidates for automation that are outside of product specifications (i.e., negative testing); create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
• Automation - Automate defined test cases and test suites per project
• Collaboration - Collaborate with product owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans
• Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLOs/SLIs/SLAs; specify the test data types needed for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
• Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
• Performance/Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure the products meet SLAs/SLOs
• Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
• Risk Mitigation - Work with product owners, QE, and development team leads to track and prioritize defect fixes

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics, and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions, and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education, or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
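The posting calls for big data validation with Dataflow/Apache Beam; as an illustrative sketch (using Beam's Python SDK for consistency with the other examples on this page, though the role itself emphasizes Java), a transform can be unit-tested with Beam's testing utilities:

```python
# Minimal sketch: unit-test a Beam transform with the Python SDK's testing
# utilities. The transform and data are illustrative placeholders; the same
# pattern exists in the Java SDK the role emphasizes.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

def test_uppercase_transform():
    with TestPipeline() as p:
        output = (
            p
            | beam.Create(["alpha", "beta"])
            | beam.Map(str.upper)
        )
        assert_that(output, equal_to(["ALPHA", "BETA"]))
```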
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job title: Senior Machine Learning Engineer
Location: On-site (Gurgaon, India)
Experience: 5+ years
Type: Full-time/Contract

About the Role
We are looking for an experienced Machine Learning Engineer with a strong background in building, deploying, and scaling ML models in production environments. You will work closely with data scientists, engineers, and product teams to translate business challenges into data-driven solutions and build robust, scalable ML pipelines. This is a hands-on role requiring a blend of applied machine learning, data engineering, and software development skills.

Key Responsibilities
• Design, build, and deploy machine learning models to solve real-world business problems
• Work on the end-to-end ML lifecycle: data preprocessing, feature engineering, model selection, training, evaluation, deployment, and monitoring
• Collaborate with cross-functional teams to identify opportunities for machine learning across products and workflows
• Develop and optimize scalable data pipelines to support model development and inference
• Implement model retraining, versioning, and performance tracking in production
• Ensure models are interpretable, explainable, and aligned with fairness, ethics, and compliance standards
• Continuously evaluate new ML techniques and tools to improve accuracy and efficiency
• Document processes, experiments, and findings for reproducibility and team knowledge-sharing

Requirements
• 5+ years of hands-on experience in machine learning, applied data science, or related roles
• Strong foundation in ML algorithms (regression, classification, clustering, NLP, time series, etc.)
• Experience with production-level ML deployment using tools like MLflow, Kubeflow, Airflow, FastAPI, or similar
• Proficiency in Python and libraries like scikit-learn, TensorFlow, PyTorch, XGBoost, pandas, and NumPy
• Experience with cloud platforms (AWS, GCP, or Azure) and containerized environments (Docker, Kubernetes)
• Strong understanding of software engineering principles and experience with Git, CI/CD, and version control
• Experience with large datasets, distributed systems (Spark/Databricks), and SQL/NoSQL databases
• Excellent problem-solving, communication, and collaboration skills

Nice to Have
• Experience with LLMs, generative AI, or transformer-based models
• Familiarity with MLOps best practices and infrastructure as code (e.g., Terraform)
• Experience working in regulated industries (e.g., finance, healthcare)
• Contributions to open-source projects or ML research papers

Why Join Us
• Work on impactful problems with cutting-edge ML technologies
• Collaborate with a diverse, expert team across engineering, data, and product
• Flexible working hours and remote-first culture
• Opportunities for continuous learning, mentorship, and growth
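To illustrate the model tracking and versioning work described above, here is a minimal MLflow sketch; the toy dataset, model, and run name are placeholders, not the employer's actual stack.

```python
# Minimal sketch: track a training run with MLflow, as part of model
# versioning and performance tracking. Data and model are toy placeholders
# from scikit-learn.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf_baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")   # versioned model artifact
```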
Posted 3 days ago
5.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain, and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
• Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
• Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments.
• Develop regression suites, develop automation scenarios, and move automation to an agile continuous-testing model.
• Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
• Bachelor's degree in a STEM major or equivalent experience
• 5-7 years of software testing experience
• Able to create and review test automation according to specifications
• Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
• Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others, with respect to software validation
• Created test strategies and plans
• Led complex testing efforts or projects
• Participated in sprint planning as the test lead
• Collaborated with product owners, SREs, and technical architects to define testing strategies and plans
• Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
• Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
• Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
• Attention to Detail - Define test case candidates for automation that are outside of product specifications (i.e., negative testing); create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
• Automation - Automate defined test cases and test suites per project
• Collaboration - Collaborate with product owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans
• Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLOs/SLIs/SLAs; specify the test data types needed for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
• Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
• Performance/Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure the products meet SLAs/SLOs
• Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
• Risk Mitigation - Work with product owners, QE, and development team leads to track and prioritize defect fixes
Posted 3 days ago
6.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
• Designs, implements, and maintains reliable and scalable data infrastructure
• Writes, deploys, and maintains software to build, integrate, manage, maintain, and quality-assure data
• Develops and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud
• Mentors and shares knowledge with the team through design reviews, discussions, and prototypes
• Works with customers to deploy, manage, and audit standard processes for cloud products
• Adheres to, and advocates for, software and data engineering standard processes (e.g., data engineering pipelines, unit testing, monitoring, alerting, source control, code review, and documentation)
• Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves CI/CD pipelines
• Follows site-reliability engineering standard processes for service reliability: participates in on-call rotations for the services they maintain and is responsible for defining and maintaining SLAs; designs, builds, deploys, and maintains infrastructure as code; containerizes server deployments
• Works as part of a cross-disciplinary team, closely with other data engineers, architects, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup

Mandatory skill sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark SQL

Preferred skill sets ('good to have'): Cosmos DB, data modeling, Databricks, Power BI, and experience building an analytics solution with SAP as the data source for ingestion pipelines

Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL and strong logic-building capabilities. The candidate should also have sound knowledge of optimizing workloads.

Years of experience required: 6 to 9 years of relevant experience

Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)

Expected joining: 3 weeks

Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Bachelor of Technology, Master of Business Administration

Required Skills: Structured Query Language (SQL)

Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 3 days ago
8.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Company Description
PSI is India's largest venture studio in AI and Deep Tech, headquartered in New Delhi. We're committed to building science-backed, tech-enabled ventures that address high-impact, emerging market opportunities. We operate at the intersection of innovation, research, and business execution, launching bold, founder-led companies from the ground up. We're currently building one of our stealth ventures, a next-gen media-tech company that uses AI to supercharge storytelling, public engagement, and digital performance at scale. As part of the founding execution team, we are hiring a Program Manager – Technical who can lead the orchestration of our AI-powered campaign stack, from tools to data pipelines to automation.

Role Description
This role is for a hands-on technical program manager with strong experience in data workflows, AI tools, and marketing automation. You will own the delivery of complex digital campaign infrastructure, ensuring that AI-driven solutions are deployed reliably, tracking is accurate, platforms talk to each other, and every campaign runs with full technical clarity. You'll work with creative, strategy, content, and media teams, but your job is to make the stack work, scale, and evolve.

Key Responsibilities
• Campaign infrastructure setup: lead the tech setup of digital campaigns (landing pages, event tracking, automation flows, attribution logic)
• Tool integration: own the configuration and integration of marketing tools (CRM, email automation, ad tech, AI platforms, analytics)
• Data pipeline management: ensure data flow across systems is clean, validated, and usable, from ad platforms to CRMs to dashboards
• AI-driven marketing execution: coordinate the deployment of AI tools (e.g., audience segmentation, chatbot automation, content engines) within active campaigns
• Automation workflows: build and monitor automated campaigns using tools like Zapier, Make, HubSpot, or custom scripts
• Technical QA: own quality assurance for all campaign tech, including links, tags, load times, UTM structures, lead routing, and error handling
• Cross-team collaboration: work with analytics, strategy, content, and media buying teams to ensure campaigns have the technical foundation they need to succeed
• Troubleshooting and debugging: act as first-line tech responder when something breaks, and proactively prevent it from happening again

Skills & Qualifications
• 8+ years of experience in technical project management or campaign technology roles
• Proven experience in MarTech or AdTech environments, setting up, managing, and scaling digital campaign stacks
• Hands-on experience with tools like GA4, Tag Manager, HubSpot, Segment, Zapier, Airflow, Data Studio, and Looker
• Familiarity with scripting and query languages: JavaScript, Python (basic), SQL
• Deep understanding of campaign-level data analytics, attribution modeling, and CRM-integrated funnels
• Experience working with AI-powered tools for automation, content generation, audience prediction, or personalization
• Comfort working with APIs, webhooks, integrations, and marketing systems
• Bonus: experience with low-code/no-code platforms and building internal utilities

Why Join PSI
• Work on real, AI-first use cases in media-tech and public engagement
• Be the technical architect behind some of the most watched campaigns of this decade
• Access to India's most advanced campaign tooling, internal AI stack, and creative intelligence networks
• Flat hierarchy, founder-led culture, and a team that genuinely values competence
Posted 3 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About Nourma
We're building the AI-powered finance operating system that transforms how companies manage their financial operations. Our Decision Intelligence platform combines LLMs, multi-agent systems, and real-time data integration to create an intelligent finance team.

The Role
We're seeking an AI/ML Engineer with deep expertise in LangChain, LlamaIndex, PydanticAI, and modern Python frameworks to architect and build the core intelligence layer of Nourma.

Key Responsibilities

LLM orchestration and RAG development (LangChain/LlamaIndex/PydanticAI focus)
• Architect complex LangChain pipelines for multi-agent financial workflows
• Build production RAG systems using LlamaIndex for financial document retrieval
• Implement agents with strong type safety and structured outputs
• Design and implement: chain-of-thought reasoning for financial analysis; dynamic prompt routing based on query complexity; memory management for long-running financial conversations; tool integration for agents to access GL, bank feeds, and operational data
• Optimise token usage and response latency for real-time WhatsApp interactions

API development and integration (FastAPI focus)
• Build high-performance FastAPI services for agent-to-agent communication protocols, WhatsApp webhook processing with sub-second response, and real-time financial data APIs for frontend consumption
• Design GraphQL schemas for flexible financial data queries
• Implement WebSocket connections for live financial updates
• Create robust error handling and retry mechanisms for financial integrations

Vector database and semantic search (Chroma focus)
• Design and optimise Chroma collections for financial document embeddings (loan agreements, invoices), conversation history and context retrieval, and business logic and rule storage
• Implement hybrid search combining vector similarity and metadata filtering
• Build embedding pipelines for various document types (PDFs, emails, chat logs)

Infrastructure and scalability
• Deploy and manage LLM applications
• Implement Redis caching strategies for LLM responses and financial data
• Design a microservices architecture for agent deployment
• Set up monitoring and observability for AI pipelines

Technical Requirements

Must have (core technologies):
• Expert-level proficiency in LangChain (custom chains, agents, tools, memory systems), LlamaIndex (document stores, indices, query engines), PydanticAI (agent frameworks, type-safe LLM interactions, structured outputs), and FastAPI (async programming, dependency injection, middleware)
• Strong experience with Python async/await patterns
• Production experience with Chroma or similar vector databases
• Proficiency with Redis for caching and session management
• Experience with data pipeline and storage tools (Kafka, Spark, Airflow) for building scalable systems

Nice to have:
• Knowledge of PostgreSQL and BigQuery for analytical workloads
• Understanding of financial data structures (journal entries, chart of accounts)
• Experience with financial APIs (QuickBooks, Xero, Plaid, banking APIs)
• Knowledge of data consistency requirements for financial systems
• GraphQL schema design and optimisation
• Experience with the WhatsApp Business API
• Background in fintech or accounting software

Tech stack:
• LLMs: GPT-4, Claude, open-source models
• ML/AI: LangChain, LlamaIndex, PydanticAI, PyTorch, Transformers
• Vector DB: Chroma
• Data: PostgreSQL, BigQuery, Apache Kafka, Spark, Airflow
• APIs: FastAPI, GraphQL
• Infrastructure: AWS/GCP, Kubernetes, Docker, Redis
• Monitoring: Prometheus, Grafana, OpenTelemetry

What we offer:
• Work on cutting-edge problems combining LLMs with real-time financial data
• Build systems processing millions of financial transactions
• Direct impact on how thousands of companies manage finances
• Work directly with founders and shape technical direction

Ideal Candidate Profile
You're excited about:
• Building production LangChain and PydanticAI applications at scale
• Creating high-performance APIs that power AI agents
• Designing scalable architectures for financial data processing
• Working with cutting-edge LLM technologies

You've probably:
• Built production LangChain/LlamaIndex/PydanticAI applications serving 1,000+ users
• Created FastAPI services handling high-throughput LLM requests
• Worked with vector databases in production environments
• Designed data processing pipelines for financial or similar domains
• Contributed to open-source AI/ML projects
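As a sketch of the Chroma-focused work described above, here is a minimal example of storing and querying document embeddings with the chromadb client; the collection name, documents, and metadata are invented placeholders, not Nourma's schema.

```python
# Minimal sketch: store and query document embeddings with Chroma.
# Collection name, documents, and metadata are illustrative placeholders;
# Chroma embeds with its default embedding function here.
import chromadb

client = chromadb.Client()  # in-memory; use PersistentClient(path=...) in production

docs = client.create_collection(name="financial_docs")
docs.add(
    ids=["inv-001", "loan-017"],
    documents=[
        "Invoice INV-001: 30-day payment terms, total due 12,400 USD.",
        "Loan agreement 017: floating rate, quarterly repayment schedule.",
    ],
    metadatas=[{"type": "invoice"}, {"type": "loan_agreement"}],
)

# Hybrid-style retrieval: vector similarity plus a metadata filter.
hits = docs.query(
    query_texts=["what are the payment terms?"],
    n_results=1,
    where={"type": "invoice"},
)
print(hits["documents"][0])
```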
Posted 3 days ago
5.0 years
0 Lacs
India
On-site
About Oportun Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009. WORKING AT OPORTUN Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups. Engineering Business Unit Overview The charter for Engineering group at Oportun is to be the world-class engineering force behind our innovative products. The group plays a vital role in designing, developing, and maintaining cutting-edge software solutions that power our mission and advance) our business. We strike a balance between leveraging leading tools and developing in-house solutions to create member experiences that empower their financial independence. The talented engineers in this group are dedicated to delivering and maintaining performant, elegant, and intuitive systems to our business partners and retail members. Our platform combines service-oriented platform features with sophisticated user experience and is enabled through a best-in-class (and fun to use!) automated development infrastructure. We prove that FinTech is more fun, more challenging, and in our case, more rewarding as we build technology that changes our members’ lives. Engineering at Oportun is responsible for high quality and scalable technical execution to achieve business goals and product vision. They ensure business continuity to members by effectively managing systems and services - overseeing technical architectures and system health. In addition, they are responsible for identifying and executing on the technical roadmap that enables product vision as well as fosters member & business growth in a scalable and efficient manner. The Enterprise Data and Technology (EDT) pillar within the Engineering Business Unit focusses on enabling wide use of corporate data assets whilst ensuring quality, availability and security across the data landscape. Position Overview As a Senior Data Engineer at Oportun, you will be a key member of our EDT team, responsible for designing, developing, and maintaining sophisticated software / data platforms in achieving the charter of the engineering group. Your mastery of a technical domain enables you to take up business problems and solve them with a technical solution. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. 
This is a role where you will have the opportunity to take up responsibility in leading the technology effort – from technical requirements gathering to final successful delivery of the product - for large initiatives (cross functional and multi-month long projects). Responsibilities Data Architecture and Design: Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements. Collaborate with stakeholders to understand data requirements, build subject matter expertise and define optimal data models and structures. Data Pipeline Development and Optimization: Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data. Optimize data pipelines for performance, reliability, and scalability. Database Management and Optimization: Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security. Implement and manage ETL processes for efficient data loading and retrieval. Data Quality and Governance: Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations. Drive initiatives to improve data quality and documentation of data assets. Mentorship and Leadership: Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth. Lead and participate in code reviews, ensuring best practices and high-quality code. Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs. Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value. Performance Monitoring and Optimization: Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability. Common Software Engineering Requirements You actively contribute to the end-to-end delivery of complex software applications, ensuring adherence to best practices and high overall quality standards. You have a strong understanding of a business or system domain with sufficient knowledge & expertise around the appropriate metrics and trends. You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective software solutions. You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility. You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability. You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team. You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team. 
You take ownership of (customer) issues, including initial troubleshooting, identification of root cause, and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.

You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.

You independently drive and lead multiple features, contribute to large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed.

You keep your lead/EM informed about your work and that of the team so they can share updates with stakeholders, including escalation of issues.

Requirements

Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.

Proficiency in programming languages like Python/PySpark and Java/Scala.

Expertise in big data technologies such as Hadoop, Spark, Kafka, etc.

In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MySQL, NoSQL databases).

Experience and expertise in building complex end-to-end data pipelines.

Experience with orchestration and designing job schedules using CI/CD and workflow tools like Jenkins and Airflow.

Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).

Ability to mentor junior team members.

Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).

Strong leadership, problem-solving, and decision-making skills.

Excellent communication and collaboration abilities.

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/. We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3). Show more Show less
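For a role like this, data-quality enforcement is often the most concrete day-one responsibility. Here is a minimal sketch of a validation step in a PySpark pipeline; the paths, column names, and thresholds are hypothetical, and it assumes a PySpark environment is available.

```python
# Minimal data-quality gate for a batch pipeline (hypothetical schema).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("loan_data_quality_check").getOrCreate()

# Hypothetical input landed by an upstream ingestion job.
raw = spark.read.parquet("s3://example-bucket/raw/loans/")

# Validation rules: required keys present, amounts within a sane range.
rule = (
    F.col("loan_id").isNotNull()
    & F.col("member_id").isNotNull()
    & F.col("amount").between(0, 250_000)
)
valid = raw.filter(rule)

# Emit a simple quality metric so orchestration can alert on regressions.
total, good = raw.count(), valid.count()
print(f"rejected {total - good} of {total} rows")

valid.write.mode("overwrite").parquet("s3://example-bucket/clean/loans/")
```

In practice such a check would publish its metric to a monitoring system and fail the pipeline when the rejection rate crosses a threshold, rather than print to stdout.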
Posted 3 days ago
8.0 years
6 - 9 Lacs
Hyderābād
On-site
About the Role:

Grade Level (for internal use): 11

S&P Global Market Intelligence

The Role: Lead Software Engineer

The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering and distribution to power our Financial Institution Group business and customer needs. We focus on platform scale to support the business by following a common data lifecycle that accelerates business value. We provide essential intelligence for the Financial Services, Real Estate and Insurance industries.

The Impact: The FIG Data Engineering team will be responsible for implementing and maintaining services and/or tools to support existing feed systems, which allows users to consume FIG datasets, and for making FIG data available to a data fabric for wider consumption and processing within the company.

What's in it for you: The opportunity to work with global stakeholders and with the latest tools and technologies.

Responsibilities:

Build new data acquisition and transformation pipelines using big data and cloud technologies. (An illustrative ingestion sketch follows this listing.)

Work with the broader technology team, including information architecture and data fabric teams, to align pipelines with the lodestone initiative.

What We're Looking For:

Bachelor's in computer science or equivalent with at least 8+ years of professional software work experience.

Experience with Big Data platforms such as Apache Spark, Apache Airflow, Google Cloud Platform, Apache Hadoop.

Deep understanding of REST, good API design, and OOP principles.

Experience with object-oriented/object function scripting languages: Python, C#, Scala, etc.

Good working knowledge of relational SQL and NoSQL databases.

Experience in maintaining and developing software in production utilizing cloud-based tooling (AWS, Docker & Kubernetes, Okta).

Strong collaboration and teamwork skills with excellent written and verbal communication skills.

Self-starter, motivated, with the ability to work in a fast-paced software development environment.

Agile experience highly desirable.

Experience in Snowflake or Databricks will be a big plus.

Return to Work

Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace.

About S&P Global Market Intelligence

At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide - so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you - and your career - need to thrive at S&P Global. Our benefits include:

Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards - small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

-----------------------------------------------------------

Equal Opportunity Employer

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316183 Posted On: 2025-06-15 Location: Hyderabad, Telangana, India
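For roles like this one, centered on data acquisition and transformation pipelines over REST sources, a minimal ingestion sketch might look like the following. It assumes the `requests` library; the endpoint, pagination scheme, and response fields are hypothetical.

```python
# Paged REST ingestion that lands a raw payload for downstream transformation.
import json

import requests

BASE_URL = "https://api.example.com/v1/filings"  # hypothetical endpoint


def fetch_all(page_size: int = 500) -> list[dict]:
    """Page through the endpoint until it stops returning records."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1


if __name__ == "__main__":
    rows = fetch_all()
    # Land raw data as-is; transformation happens in a separate pipeline step.
    with open("filings_raw.json", "w") as f:
        json.dump(rows, f)
```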
Posted 3 days ago
5.0 years
0 Lacs
India
On-site
Hiring: Software Support Engineer (ETL & Healthcare Data)

Location: Hyderabad
Experience: 5+ Years | Full-Time

We are looking for a highly skilled Software Support Engineer with strong experience in supporting and troubleshooting ETL pipelines, production issues, and healthcare data systems. If you're passionate about optimizing systems, solving technical challenges, and collaborating with cross-functional teams - we'd love to connect!

Key Responsibilities:

Provide L2/L3 support for ETL jobs and data pipelines
Troubleshoot production issues and perform root cause analysis
Automate and monitor workflows using Python, Shell scripting, Talend, and Airflow (a minimal monitoring-oriented DAG sketch follows this listing)
Collaborate with developers, analysts, and QA for issue resolution
Document processes, job flows, and deployment steps
Support healthcare data transactions (EDI 834, 837, 999, etc.)

Tech Stack & Skills:

Python, C#, Shell scripting
SQL & NoSQL (MongoDB)
Airflow, Talend Studio, Redix
Azure Service Bus / Kafka
CI/CD (Azure Pipelines), Git/Bitbucket
JIRA, Agile methodology

Nice to Have:

Knowledge of Facets, HealthRules Payor
Experience with cloud environments
Exposure to monitoring tools

Job Type: Full-time
Shift: Evening shift
Work Days: Monday to Friday
Work Location: In person
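In a support role like this, much of the day-to-day work revolves around Airflow's retry and alerting behavior. Below is a minimal monitoring-oriented DAG sketch, assuming a recent Airflow 2.x install; the DAG id, schedule, job body, and callback are all hypothetical placeholders.

```python
# Nightly ETL task with retries and a failure hook -- the knobs an L2/L3
# support engineer typically tunes and monitors.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_claims_load(**context):
    # Placeholder for the real ETL step (e.g., triggering a Talend job).
    print("loading claims batch for", context["ds"])


def notify_on_failure(context):
    # Hook for alerting; wire this to email, Slack, or a pager as needed.
    print("task failed:", context["task_instance"].task_id)


with DAG(
    dag_id="claims_etl_support_demo",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # nightly at 02:00
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
        "on_failure_callback": notify_on_failure,
    },
) as dag:
    PythonOperator(task_id="load_claims", python_callable=run_claims_load)
```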
Posted 3 days ago
8.0 years
0 Lacs
Bengaluru
On-site
Overview:

Working at Atlassian

Atlassians can choose where they work - whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities:

Team: Core Engineering Reliability Team

Collaborate with engineering and TPM leaders, developers, and process engineers to create data solutions that extract actionable insights from incident and post-incident management data, supporting objectives of incident prevention and reducing detection, mitigation, and communication times.

Work with diverse stakeholders to understand their needs and design data models, acquisition processes, and applications that meet those requirements.

Add new sources, implement business rules, and generate metrics to empower product analysts and data scientists. (A sketch of such a metric job follows this listing.)

Serve as the data domain expert, mastering the details of our incident management infrastructure.

Take full ownership of problems from ambiguous requirements through rapid iterations.

Enhance data quality by leveraging and refining internal tools and frameworks to automatically detect issues.

Cultivate strong relationships between teams that produce data and those that build insights.

Qualifications:

Minimum Qualifications / Your background:

BS in Computer Science or equivalent experience with 8+ years as a Senior Data Engineer or in a similar role.

10+ years of progressive experience in building scalable datasets and reliable data engineering practices.

Proficiency in Python, SQL, and data platforms like Databricks.

Proficiency in relational databases and query authoring (SQL).

Demonstrable expertise designing data models for optimal storage and retrieval to meet product and business requirements.

Experience building and scaling experimentation practices, statistical methods, and tools in a large-scale organization.

Excellence in building scalable data pipelines using Spark (SparkSQL) with the Airflow scheduler/executor framework or similar scheduling tools.

Expert experience working with AWS data services or similar Apache projects (Spark, Flink, Hive, and Kafka).

Understanding of Data Engineering tools/frameworks and standards to improve the productivity and quality of output for Data Engineers across the team.

Well versed in modern software development practices (Agile, TDD, CI/CD).

Desirable Qualifications:

Demonstrated ability to design and operate data infrastructure that delivers high reliability for our customers.

Familiarity working with datasets like Monitoring, Observability, Performance, etc.

Benefits & Perks

Atlassian offers a wide range of perks and benefits designed to support you, your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits.

About Atlassian

At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success.
To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you with the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
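A role like this pairs SparkSQL with a scheduler to turn raw incident records into reliability metrics. A minimal sketch of such a metric job follows, assuming PySpark; the table layout and column names are hypothetical.

```python
# Weekly mean-time-to-resolve (MTTR) per service from raw incident data.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incident_metrics_demo").getOrCreate()

incidents = spark.read.parquet("s3://example-bucket/incidents/")
incidents.createOrReplaceTempView("incidents")

mttr = spark.sql("""
    SELECT service,
           date_trunc('week', opened_at)           AS week,
           AVG(unix_timestamp(resolved_at)
               - unix_timestamp(opened_at)) / 60.0 AS mttr_minutes
    FROM incidents
    WHERE resolved_at IS NOT NULL
    GROUP BY service, date_trunc('week', opened_at)
""")

mttr.write.mode("overwrite").parquet("s3://example-bucket/metrics/mttr/")
```

A job like this would typically run under Airflow on a daily or weekly schedule, with the output table feeding analyst dashboards.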
Posted 3 days ago
0 years
0 Lacs
Chennai
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description: GCP Cloud Architecture

Knowledge of the model deployment lifecycle, including creating training and serving pipelines
Familiarity with at least one workflow orchestrator: Kubeflow, Airflow, MLflow, Argo, etc.
Strong in Python
Adequate SQL skills

Must-have skills: Python, SQL, ML Engineer (Model Deployment/MLOps), ML pipelines (Kubeflow, Airflow, Argo, etc.)

Preferred skills: PyTorch, TensorFlow, experience with a hyperscaler/cloud service, deep learning frameworks. (A minimal training-pipeline sketch follows this listing.)

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
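For the model deployment lifecycle skills named above, here is a minimal sketch of the training half of a training-and-serving pipeline. It assumes MLflow and scikit-learn; the experiment name, data, and parameters are hypothetical stand-ins for a real training job.

```python
# Train a model, track the run, and log the artifact for later serving.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo_training_pipeline")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    # The logged model can be promoted to a registry and served from there.
    mlflow.sklearn.log_model(model, "model")
```

In an orchestrated setup, a Kubeflow or Airflow pipeline would run this step on a schedule, and a separate serving step would load the registered model.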
Posted 3 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Ethos

Ethos was built to make it faster and easier to get life insurance for the next million families. Our approach blends industry expertise, technology, and the human touch to find you the right policy to protect your loved ones. We leverage deep technology and data science to streamline the life insurance process, making it more accessible and convenient. Using predictive analytics, we are able to transform a traditionally multi-week process into a modern digital experience for our users that can take just minutes! We've issued billions in coverage each month and eliminated the traditional barriers, ushering the industry into the modern age. Our full-stack technology platform is the backbone of family financial health.

We make getting life insurance easier, faster and better for everyone.

Our investors include General Catalyst, Sequoia Capital, Accel Partners, Google Ventures, SoftBank, and the investment vehicles of Jay-Z, Kevin Durant, Robert Downey Jr and others. This year, we were named on CB Insights' Global Insurtech 50 list and BuiltIn's Top 100 Midsize Companies in San Francisco. We are scaling quickly and looking for passionate people to protect the next million families!

About The Role

Ethos is seeking a Senior Data Analyst Engineer to join our Data Platform team. In this role, you will play a critical part in transforming raw data into actionable insights that drive business growth. If you have a passion for data analysis, data modeling, and warehouse development, and are proficient in SQL and DBT, we encourage you to apply.

Duties And Responsibilities

Work closely with cross-functional stakeholders to identify and build a roadmap for new data model development.
Collaborate with application and product teams to understand product and application behavior to arrive at the appropriate data model.
Build and maintain data marts to support various business functions and carrier partners.
Optimize queries and refine data structures to ensure efficient data retrieval and reporting.
Ensure data accuracy, consistency, and integrity by implementing data validation and data cleansing processes.
Perform data quality checks and troubleshoot any issues.
Set standards for data model development; establish and evangelize best practices within the team and organization-wide.
Maintain detailed documentation of data models for reference and knowledge sharing.
As a senior member of the team, you will also be responsible for the overall technical excellence of the data marts, lead large projects working with multiple stakeholders, and mentor the team on technical aspects.

Qualifications And Skills

6+ years of proven experience in data analytics, data engineering, or a related role.
Strong proficiency in SQL and DBT for data model development is a must.
Experience with data integration tools and platforms, including Airflow.
Familiarity with Mode or similar data visualization tools.
Knowledge of tools like Segment, Amplitude and Iterable is a plus.

Don't meet every single requirement? If you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. At Ethos we are dedicated to building a diverse, inclusive and authentic workplace.
We are an equal opportunity employer that values diversity and inclusion, and we look for applicants who understand, embrace and thrive in a multicultural world. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the SF Fair Chance Ordinance, we will consider employment for qualified applicants with arrests and conviction records. To learn more about what information we collect and how it may be used, please refer to our California Candidate Privacy Notice. Show more Show less
Posted 3 days ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Morgan Stanley

Senior Platform Engineer - Vice President - Software Production Management & Reliability Engineering

Profile Description

We're seeking someone to join our team in a (Vice President) Systems Engineer role responsible for the stability, integrity, and efficient operation of the in-house and third-party systems that support core organizational functions. This is achieved by monitoring, maintaining, supporting, and optimizing all networked software and associated operating systems. The Systems Engineer will apply proven communication, analytical, and problem-solving skills to help identify, communicate, and resolve issues in order to maximize the benefit of IT systems investments. This individual will also mentor and provide guidance to the Systems Engineer staff.

Investment Management

In the Investment Management division, we deliver active investment strategies across public and private markets and custom solutions to institutional and individual investors. This is a Vice President position that oversees the production environment, ensuring the operational reliability of deployed software, and implements strategies to optimize performance and minimize downtime.

Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals.

At Morgan Stanley India, we support the Firm's global businesses, with critical presence across Institutional Securities, Wealth Management, and Investment Management, as well as in the Firm's infrastructure functions of Technology, Operations, Finance, Risk Management, Legal and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there's ample opportunity to move across the businesses.

Interested in joining a team that's eager to create, innovate and make an impact on the world? Read on…

What You'll Do In The Role

Designing: Responsible for designing and implementing the overall IM Technology platform architecture, ensuring seamless operation and alignment with the organization's goals, scalability requirements, and industry best practices.

Technology Evaluation: Drive innovation by evaluating and selecting appropriate technologies, frameworks, and tools which will maximize the value produced by the IM Technology platform.

Guidance and Leadership: Providing technical guidance and leadership to IM Technology team developers throughout the development lifecycle. Liaise between IM Technology groups and End User Technology to triage environmental issues and resolve them before they reach production.

Scalability and Performance: Ensuring that the platform architecture can scale efficiently and meet performance requirements.

Security: Implementing security best practices for infrastructure, such as network security, access control, and data encryption, and ensuring that the platform architecture is resilient to security threats. Working closely with the Security Architecture team to keep our infrastructure aligned with frequent changes to firm-level security policies.

Integration: Overseeing the integration of various components and systems within the platform.
Also act as a conduit for secure data transfer between critical platform applications and third-party data providers or receivers.

Collaborate: Collaborating with product owners, business stakeholders and ITSOs to ensure the system's architecture supports the organization's goals and objectives.

Testing: Writing and executing tests to ensure the reliability, scalability, and performance of the platform.

Deployment: Managing the deployment process and ensuring smooth deployments of platform updates and releases.

Monitoring and Maintenance: Monitoring the platform for performance issues, bugs, and security vulnerabilities, and addressing them promptly. This includes performing routine preventative maintenance such as system patching, updates and upgrades.

Automation: Implementing automation tools and processes to streamline development, deployment, and maintenance tasks.

Documentation: Creating and maintaining technical documentation for the platform components and processes.

Infrastructure as Code: Managing infrastructure using code-based tools like Terraform or CloudFormation in order to ensure simplicity of the platform, minimization of errors and adherence to Change Management principles.

Containerization and Orchestration: Implementing containerization using Docker and container orchestration using Kubernetes or similar tools.

Monitoring and Logging: Setting up monitoring and logging solutions to track the performance and health of the platform. This involves ensuring that any logs generated throughout the platform are tracked in firm-approved systems and are secured according to their level of confidentiality.

What You'll Bring To The Role

At least 8 years' relevant experience would generally be expected to find the skills required for this role.

Good working experience in at least some of the technologies below (a minimal messaging sketch follows this listing):

Middleware (e.g., Tomcat, WebSphere, WebLogic)
App containerization (e.g., Kubernetes, Red Hat OpenShift)
Automation (e.g., UiPath RPA, Microsoft Power, Airflow)
Message queues (e.g., Kafka, MQ)
ETL (e.g., Glide)
Analytics (e.g., Dataiku)
Data management (e.g., Snowflake, Databricks)

Sound knowledge of IT application architecture and design methodologies across multiple platforms.
Good understanding of application capacity management.
Good experience in application resilience planning.
Good working knowledge of SQL and scripting.
Sound knowledge of multiple operating systems.
Flexibility for off-hours and weekend availability.
Excellent understanding of the organization's goals and objectives.
Good project management skills.
Excellent written, oral, and interpersonal communication skills.
Proven analytical and creative problem-solving abilities.
Able to prioritize and execute tasks in a high-pressure environment.
Ability to work in a team-oriented, collaborative environment.

What You Can Expect From Morgan Stanley

We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you'll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered.
Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents. Show more Show less
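For the message-queue technologies listed in this role, here is a minimal sketch of publishing a platform event to Kafka. It assumes the kafka-python client; the broker address, topic, and payload are hypothetical.

```python
# Publish a JSON job-status event for downstream consumers to react to.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker.example.com:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"job": "nightly_positions_load", "status": "SUCCESS", "rows": 120_000}
producer.send("platform-job-events", value=event)
producer.flush()  # block until the broker acknowledges the send
```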
Posted 3 days ago
The Airflow job market in India is growing rapidly as more companies adopt data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with Airflow expertise can find lucrative opportunities in industries such as technology, e-commerce, finance, and more.
The average salary range for Airflow professionals in India varies by experience level:

- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of Airflow, a typical career path may progress as follows:

- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in (a minimal DAG sketch follows this list):

- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
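To make that skill list concrete, here is a minimal DAG written with Airflow's TaskFlow API, assuming a recent Airflow 2.x release; the data and targets are hypothetical placeholders for a real source and warehouse.

```python
# A three-step extract/transform/load pipeline as a TaskFlow DAG.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def simple_etl():
    @task
    def extract() -> list[dict]:
        # In practice: call an API or query a source database.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

    @task
    def transform(rows: list[dict]) -> float:
        return sum(r["amount"] for r in rows)

    @task
    def load(total: float) -> None:
        # In practice: write to a warehouse table.
        print(f"daily revenue: {total}")

    load(transform(extract()))


simple_etl()
```

Interviewers often ask candidates to walk through exactly this structure: how tasks depend on each other, how scheduling and catchup behave, and where retries fit in.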
As you explore job opportunities in the Airflow domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest trends in Airflow, and demonstrate your problem-solving abilities to stand out in the competitive job market. Good luck!