
259 Data Pipelines Jobs - Page 6

Set up a Job Alert
JobPe aggregates results for easy application access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

The ideal candidate for the Python Developer position at Exela should possess solid programming experience and a basic understanding of AI concepts. In this role, you will design, develop, test, and maintain Python applications and services, and work on REST APIs, data pipelines, automation scripts, LLM integration, LangGraph, LangChain, preprocessing, and reporting.

Key Responsibilities:
- Design, develop, test, and maintain Python applications and services
- Work with REST APIs, data pipelines, and automation scripts
- Work on LLM integration, LangGraph, LangChain, preprocessing, and reporting
- Write clean, maintainable, and efficient code
- Conduct unit testing and code reviews
- Optimize performance and troubleshoot production issues

Qualifications:
- Strong proficiency in Python (OOP, standard libraries, file I/O, error handling)
- Experience with frameworks like Flask, FastAPI, or Django
- Basic understanding of pandas, NumPy, and data manipulation
- Familiarity with Git and version control best practices
- Experience working with JSON, CSV, and APIs

If you are a skilled Python Developer looking to work on robust, scalable Python-based Agentic AI applications and support data teams with integration and automation tasks, we encourage you to apply for this exciting opportunity at Exela.
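The day-to-day work described above centers on moving data between the formats the posting names (JSON, CSV) inside Python scripts. A minimal stdlib sketch of that kind of pipeline step; the record shape and field names are invented for illustration and are not from the posting:

```python
import csv
import io
import json

def records_to_csv(json_text: str) -> str:
    """Flatten a JSON array of objects into CSV text (one illustrative pipeline step)."""
    records = json.loads(json_text)
    if not records:
        return ""
    # Union of all keys, sorted, so ragged records still share one header row.
    fieldnames = sorted({key for rec in records for key in rec})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, restval="")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

raw = '[{"id": 1, "status": "ok"}, {"id": 2, "error": "timeout"}]'
print(records_to_csv(raw))
```

`restval=""` fills columns missing from a given record, which is the usual choice when upstream JSON is ragged.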

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

FICO is a leading global analytics software company that assists businesses in over 100 countries in making informed decisions. By joining the world-class team at FICO, you will have the opportunity to realize your career potential. As part of the product development team, you will play a crucial role in providing thought leadership and driving innovation. As the VP, Software Engineering, you will collaborate closely with product management to architect, design, and develop a highly feature-rich product.

Your responsibilities will include designing, developing, testing, deploying, and supporting the capabilities of a large enterprise-level platform. You will create scalable microservices with a focus on high performance, availability, interoperability, and reliability. Additionally, you will contribute to technical designs, participate in defining technical acceptance criteria, and mentor junior engineers to uphold quality standards.

To be successful in this role, you should hold a Bachelor's or Master's degree in computer science or a related field and possess a minimum of 7 years of experience in software architecture, design, development, and testing. Expertise in Java, Spring, Spring Boot, Maven/Gradle, Docker, Git, and GitHub, as well as experience with data structures, algorithms, and system design, is essential. You should also have a strong understanding of microservices architecture, RESTful and gRPC APIs, cloud engineering technologies such as Kubernetes and AWS/Azure/GCP, and databases like MySQL, PostgreSQL, MongoDB, and Cassandra. Experience with Agile software development, data engineering services, and software design principles is highly desirable.

At FICO, you will work in an inclusive culture that values core principles like acting like an owner, delighting customers, and earning respect. You will benefit from competitive compensation, benefits, and rewards programs while enjoying a people-first work environment that promotes work/life balance and professional development. Join FICO and be part of a leading organization at the forefront of Big Data analytics, where you can help businesses leverage data to enhance decision-making. Your role will make a significant impact on global businesses, and you will be part of a diverse and inclusive environment that fosters collaboration and innovation.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your primary objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users.

You will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. You will also recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality, consult with users, clients, and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems.

As the Applications Development Senior Programmer Analyst, you will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and/or other team members.

In this role, you will appropriately assess risk when business decisions are made, demonstrate particular consideration for the firm's reputation, and safeguard Citigroup, its clients, and assets by driving compliance with applicable laws, rules, and regulations. You must have strong analytical and communication skills and be results-oriented, willing and able to take ownership of engagements. Experience in the banking domain is a must.

Qualifications:

Must Have:
- 8+ years of application/software development and maintenance
- 5+ years of experience with Big Data technologies such as Apache Spark, Hive, and Hadoop
- Knowledge of the Python, Java, or Scala programming languages
- Experience with Java, web services, XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Ability to work independently, multi-task, and take ownership of various analyses or reviews

Good to Have:
- Work experience in Citi or regulatory reporting applications
- Hands-on experience with cloud technologies, AI/ML integration, and creation of data pipelines
- Experience with vendor products like Tableau, Arcadia, Paxata, and KNIME
- Experience with API development and use of data formats

Education:
- Bachelor's degree/University degree or equivalent experience

This is a high-level overview of the job responsibilities and qualifications. Other job-related duties may be assigned as required.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

You are an experienced Solution Architect with a solid background in software architecture and a good understanding of AI-based products and platforms. Your main responsibility will be to design robust, scalable, and secure architectures that support AI-driven applications and enterprise systems. In this role, you will collaborate closely with cross-functional teams, including data scientists, product managers, and engineering leads, to ensure the alignment of business needs, technical feasibility, and AI capabilities.

Your key responsibilities will include architecting end-to-end solutions for enterprise and product-driven platforms, covering data pipelines, APIs, AI model integration, cloud infrastructure, and user interfaces. You will guide teams in selecting appropriate technologies, tools, and design patterns for building scalable systems, and work with AI/ML teams to understand model requirements and facilitate their smooth deployment and integration into production environments. You will define system architecture diagrams, data flow, service orchestration, and infrastructure provisioning using modern tools, and collaborate with stakeholders to translate business requirements into technical solutions, emphasizing scalability, performance, and security. Your leadership will be crucial in promoting best practices for software development, DevOps, and cloud-native architecture, and you will conduct architecture reviews to ensure compliance with security, performance, and regulatory standards.

To be successful in this role, you should have at least 10 years of experience in software architecture or solution design roles, demonstrated expertise in designing systems using microservices, RESTful APIs, event-driven architecture, and cloud-native technologies, and hands-on experience with major cloud providers like AWS, GCP, or Azure. Familiarity with AI/ML platforms and components, data architectures, containerization, and DevOps principles, and the ability to lead technical discussions, are also required. Preferred qualifications include exposure to AI model lifecycle management, infrastructure-as-code tools like Terraform or Pulumi, knowledge of GraphQL, gRPC, or serverless architectures, and previous experience in AI-driven product companies or digital transformation programs.

In return, you will have the opportunity to play a high-impact role in designing intelligent systems that drive the future of AI adoption. You will work alongside forward-thinking engineers, researchers, and innovators, with a strong focus on career growth, learning, and technical leadership. The compensation offered is competitive and reflective of the value you bring to the role.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

An R Shiny web developer role at Iris Software involves building interactive data visualization and analysis tools using the R programming language and the Shiny framework. The primary responsibilities include designing, developing, and deploying dashboards and applications that let users explore and manipulate data dynamically. The position also entails data manipulation, UI/UX design, and ensuring efficient application performance.

Responsibilities:

Building and Maintaining Shiny Applications:
- Designing, developing, and maintaining interactive dashboards and applications using R and Shiny
- Implementing user interfaces (UI) and server-side logic
- Ensuring visually appealing and user-friendly applications
- Updating existing applications based on changes in functionality or requirements
- Debugging and testing applications to ensure correct functionality

Data Handling and Manipulation:
- Analyzing datasets to identify relationships and prepare data for loading into databases
- Creating and managing data pipelines and ETL processes

Data Visualization and Reporting:
- Developing visualizations, reports, and dashboards using R Shiny
- Creating data pipelines to integrate various data sources for comprehensive visualizations

Collaboration and Communication:
- Working closely with developers, data scientists, and subject matter experts
- Communicating technical information clearly and proposing solutions to technical challenges

Mandatory Competencies:
- Data Science and Machine Learning - R Shiny
- User Interface - JavaScript
- Communication and Collaboration
- UX - Photoshop

Join Iris Software to experience a supportive work environment with world-class benefits designed to support your professional and personal growth. From health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, Iris Software is committed to your success and well-being. Be part of a team that values your talent and happiness.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation. The position is based in Hyderabad, with work hours from 9 AM to 6 PM EST (US time).

As a Power BI Architect at AmplifAI, you will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric. You will be responsible for defining scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. Additionally, you will manage and mentor a team of Power BI Analysts and build engaging dashboards for platform insights, contact center KPIs, auto QA, and sentiment analysis. Integrating structured and semi-structured data for unified analysis, driving governance and CI/CD using GitHub-based workflows, and evangelizing best practices across semantic modeling, performance tuning, and data governance are also key responsibilities.

The ideal candidate should have 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and at least 3 years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential.

At AmplifAI, you will have the opportunity to work on cutting-edge enterprise AI & BI solutions, be part of a diverse, inclusive, and globally distributed team, and help shape the future of analytics in CX and performance management. If you are ready to lead data-driven transformation at AmplifAI, apply now!

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As an Analyst/Consultant within the Marketing Analytics practice of Accenture Strategy & Consulting in Gurgaon, you will play a crucial role in helping clients grow their business by leveraging analytics to drive high performance and make informed decisions based on data insights. The Accenture Applied Intelligence practice focuses on developing analytic capabilities that enable organizations to outperform competitors, from accessing and reporting on data to predictive modeling. Joining a global network of over 20,000 colleagues, you will have the opportunity to work with leading statistical tools, methods, and applications. From data to analytics and insights to actions, you will collaborate with forward-thinking consultants to provide analytically informed insights at scale, helping clients improve outcomes and achieve high performance.

In this role, you will work through various project phases, defining data requirements for Data Driven Merchandizing capabilities. Your responsibilities will include cleaning, aggregating, analyzing, and interpreting data, as well as conducting data quality analysis. With at least 3 years of experience in Data Driven Merchandizing, you will focus on Pricing/Promotions/Assortment Optimization capabilities across retail clients, including knowledge of price/discount elasticity estimation and non-linear optimization techniques. Proficiency in statistical time-series models, store clustering algorithms, and descriptive analytics to support merch AI capability is essential. Hands-on experience in state space modeling, mixed effect regression, and developing AI/ML models in the Azure ML tech stack is required. You will also manage data pipelines and data within different layers of the Snowflake environment.

Additionally, you should be familiar with common design patterns for scalable machine learning architectures, tools for deploying and maintaining machine learning models in production, and cloud platforms for pipelining and deploying elasticity models. Your role will involve working alongside a team and a consultant/manager, creating insights presentations and client-ready decks. You should possess strong communication skills and be able to mentor and guide junior resources. Logical thinking is a key attribute for this role, as you will need to analyze data, problems, and situations systematically. Your task management skills should be at a basic level, enabling you to plan your tasks, discuss priorities, track progress, and report accordingly.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Key Skills: PostgreSQL, Cron Jobs, Databricks, Azure, SSIS, Prefect, Data Pipelines, Cloud Data Migration, MSSQL

Roles and Responsibilities:
- Design and implement data models in a PostgreSQL database on cloud environments
- Build and manage transformation pipelines using Databricks for data migration from MSSQL to PostgreSQL
- Schedule and manage automation using Cron jobs
- Mentor and guide junior team members
- Work in Azure or any cloud-based environment
- Ensure successful and optimized data migration from MSSQL to PostgreSQL

Experience Requirement:
- 5-10 years of experience in database engineering and data migration
- Hands-on experience in PostgreSQL, Cron jobs, Databricks, and Azure
- Experience with data pipelines using SSIS or Prefect is preferred

Education: B.E., B.Tech.
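One concrete piece of the MSSQL-to-PostgreSQL migration described above is translating column types when regenerating DDL on the target side. A minimal sketch of that idea; the mapping covers only a few common types and the function name is illustrative, since real migrations lean on tooling such as Databricks pipelines rather than hand-rolled scripts:

```python
# Illustrative subset of an MSSQL -> PostgreSQL column-type mapping.
TYPE_MAP = {
    "DATETIME": "TIMESTAMP",
    "BIT": "BOOLEAN",
    "NVARCHAR": "VARCHAR",
    "UNIQUEIDENTIFIER": "UUID",
    "TINYINT": "SMALLINT",
}

def translate_column(name: str, mssql_type: str) -> str:
    """Map one MSSQL column declaration to a PostgreSQL equivalent."""
    base, _, size = mssql_type.upper().partition("(")
    pg_type = TYPE_MAP.get(base, base)  # pass through types that already match
    if size and pg_type == "VARCHAR":
        # Preserve the length parameter, e.g. NVARCHAR(255) -> VARCHAR(255).
        pg_type = f"VARCHAR({size.rstrip(')')})"
    return f"{name} {pg_type}"

print(translate_column("created_at", "DATETIME"))  # created_at TIMESTAMP
print(translate_column("email", "NVARCHAR(255)"))  # email VARCHAR(255)
```

Types absent from the map fall through unchanged, which is the safe default for names shared by both engines (INT, DATE, VARCHAR).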

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 - 1 Lacs

Chennai

Hybrid

Duties and Responsibilities:
- Lead the design and implementation of scalable, secure, and high-performance solutions for data-intensive applications
- Collaborate with stakeholders, other product development groups, and software vendors to identify and define solutions for complex business and technical requirements
- Develop and maintain cloud infrastructure using platforms such as AWS, Azure, or Google Cloud
- Articulate technology solutions and explain the competitive advantages of various technology alternatives
- Evangelize best practices to analytics teams
- Ensure data security, privacy, and compliance with relevant regulations
- Optimize cloud resources for cost-efficiency and performance
- Lead the migration of on-premises data systems to the cloud
- Implement data storage, processing, and analytics solutions using cloud-native services
- Monitor and troubleshoot cloud infrastructure and data pipelines
- Stay updated with the latest trends and best practices in cloud computing and data management

Skills:
- 5+ years of hands-on design and development experience in implementing Data Analytics applications using AWS services such as S3, Glue, AWS Step Functions, Kinesis, Lambda, Lake Formation, Athena, Elastic Container Service/Elastic Kubernetes Service, Elasticsearch, and Amazon EMR or Snowflake
- Experience with AWS services such as AWS IoT Greengrass, AWS IoT SiteWise, AWS IoT Core, and AWS IoT Events
- Strong understanding of cloud architecture principles and best practices
- Proficiency in designing network topology, endpoints, application registration, and network pairing
- Well versed in access management in Azure or other clouds
- Experience with containerization technologies like Docker and Kubernetes
- Expertise in CI/CD pipelines and version control systems like Git
- Excellent problem-solving skills and attention to detail
- Strong communication and leadership skills
- Ability to work collaboratively with cross-functional teams and stakeholders
- Knowledge of security and compliance standards related to cloud data platforms

Technical / Functional Skills:
- At least 3+ years of experience in the implementation of all the Amazon Web Services listed above
- At least 3+ years of experience as a SAP BW Developer
- At least 3+ years of experience in Snowflake (or Redshift)
- At least 3+ years of experience as a Data Integration Developer in Fivetran/HVR/DBT, Boomi (or Talend/Informatica)
- At least 2+ years of experience with Azure OpenAI, Azure AI Services, Microsoft Copilot Studio, Power BI, and Power Automate
- Experience in Networking and Security

Domain Expertise:
- Experience with SDLC/Agile/Scrum/Kanban

Project Experience:
- Hands-on experience in the end-to-end implementation of Data Analytics applications on AWS
- Hands-on experience in the end-to-end implementation of SAP BW applications for FICO, Sales & Distribution, and Materials Management
- Hands-on experience with Fivetran/HVR/Boomi in the development of data integration services with data from SAP, Salesforce, Workday, and other SaaS applications
- Hands-on experience in the implementation of Gen AI use cases using Azure services
- Hands-on experience in the implementation of Advanced Analytics use cases using Python/R

Certifications: AWS Certified Solutions Architect - Professional

Posted 2 weeks ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and graph/vector databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

vadodara, gujarat

On-site

You will be working with Polymer, a smart data loss prevention (DLP) system that offers advanced cloud and AI data security and compliance solutions. By leveraging Polymer, you will play a crucial role in automating data protection processes, reducing data exposure risks, and enabling employees to enhance data security practices seamlessly within their existing workflows.

Your responsibilities will include designing, developing, and maintaining ETL processes within large-scale data environments utilizing tools such as Snowflake and BigQuery. You will construct and deploy data pipelines to manage data ingestion, transformation, and loading operations from diverse sources. Additionally, you will create and manage data models and schemas optimized for performance and scalability, leveraging BI tools like QuickSight, Tableau, or Sigma to generate interactive dashboards and reports. Collaboration with stakeholders to grasp business requirements and convert them into technical solutions will be a key aspect of your role. You will communicate complex data insights clearly to both technical and non-technical audiences, proactively identify and resolve data quality issues and performance bottlenecks, and contribute to enhancing the data infrastructure and best practices within the organization.

As a qualified candidate, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Computer Engineering, or a related field, along with 3-5 years of experience in a data science/engineering role. Proficiency in Python, including experience with Django or Flask, is essential, while expertise in Snowflake and BigQuery is advantageous. Experience with relational databases like MySQL or PostgreSQL, designing ETL processes in large-scale data environments, and working with cloud platforms such as AWS or GCP is highly valued. Strong problem-solving and analytical skills, a data-driven mindset, good communication and interpersonal skills, and the ability to work both independently and collaboratively within a team are essential. Familiarity with Agile development methodologies will be beneficial in this position.

This is an onsite opportunity located in Vadodara, Gujarat, India.

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

goa

On-site

As an Intern at Vumonic, your day-to-day responsibilities will involve extracting and parsing email receipts in HTML and PDF formats to identify key data points. You will develop scripts to convert unstructured data into structured formats such as JSON and CSV, and implement regex, NLP, and AI techniques to enhance data extraction accuracy. Collaboration with the data team will be essential as you work together to refine parsing logic and automate processes. You will also write SQL queries for storing, retrieving, and manipulating structured data, and use R for data cleaning, analysis, and visualization when needed.

An integral part of your role will be to explore and integrate AI/ML-based approaches to improve data extraction and validation, and to stay updated with the latest advancements in AI, NLP, and data parsing technologies. Testing, validating, and optimizing data pipelines for scalability will also be within your scope of work.

Vumonic is a provider of global data and market intelligence services, enabling companies to make data-driven decisions in domains like strategy, marketing, sales, and investments to achieve a higher ROI. As a rapidly growing startup specializing in data analytics, Vumonic offers a flat hierarchy and a dynamic, engaging work environment.
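The core task in this internship, pulling key fields out of semi-structured receipt text with regex, can be sketched in a few lines. The receipt layout, field names, and pattern below are invented for illustration; real receipts vary by merchant and need many such patterns plus NLP fallbacks:

```python
import re

# Hypothetical receipt layout: an order id line followed later by a total line.
RECEIPT_PATTERN = re.compile(
    r"Order\s*#\s*(?P<order_id>\w+).*?"
    r"Total:\s*(?P<currency>[$€₹])(?P<amount>[\d,]+\.\d{2})",
    re.DOTALL,  # receipts span multiple lines
)

def parse_receipt(text: str) -> dict:
    """Extract key fields from plain-text receipt content into a structured dict."""
    match = RECEIPT_PATTERN.search(text)
    if match is None:
        return {}
    fields = match.groupdict()
    fields["amount"] = float(fields["amount"].replace(",", ""))
    return fields

sample = "Thanks for shopping!\nOrder # A1B2C3\nItems: 2\nTotal: $1,499.00"
print(parse_receipt(sample))
```

Returning an empty dict on no match keeps the pipeline moving; unparsed receipts can be routed to a review queue instead of raising.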

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

karnataka

On-site

Join the Agentforce team in AI Cloud at Salesforce and make a real impact with your software designs and code! This position requires technical skills, outstanding analytical and influencing skills, and extraordinary business insight. It is a multi-functional role that requires building alignment and communication with several engineering organizations. We work in a highly collaborative environment, and you will partner with a cross-functional team of data scientists, software engineers, machine learning engineers, UX experts, and product managers to build upon Agentforce, our cutting-edge new AI framework. We value execution, clear communication, feedback, and making learning fun.

Your impact - you will:
- Architect, design, implement, test, and deliver highly scalable AI solutions: agents, AI copilots/assistants, chatbots, AI planners, and RAG solutions
- Be accountable for defining and driving software architecture and enterprise capabilities (scalability, fault tolerance, extensibility, maintainability, etc.)
- Independently design sophisticated software systems for high-end solutions, while working in a consultative fashion with other senior engineers and architects in AI Cloud and across the company
- Determine overall architectural principles, frameworks, and standards to craft vision and roadmaps
- Analyze and provide feedback on product strategy and technical feasibility
- Drive long-term design strategies that span multiple sophisticated projects, and deliver technical reports and performance presentations to customers and at industry events
- Actively communicate with, encourage, and motivate all levels of staff
- Be a subject matter expert for multiple products, while writing code and working closely with other developers, PM, and UX to ensure features are delivered to meet business and quality requirements
- Troubleshoot complex production issues and interface with support and customers as needed

Required Skills:
- 12+ years of experience building highly scalable Software-as-a-Service applications/platforms
- Experience building technical architectures that address complex performance issues
- Ability to thrive in dynamic environments and work on cutting-edge projects that often come with ambiguity; an innovation/startup mindset and the ability to adapt
- Deep knowledge of object-oriented programming and experience with at least one object-oriented programming language, preferably Java
- Proven ability to mentor team members to support their understanding and growth of software engineering architecture concepts and aid in their technical development
- High proficiency in at least one high-level programming language and web framework (Node.js, Express, Hapi, etc.)
- Proven understanding of web technologies such as JavaScript, CSS, HTML5, XML, JSON, and/or Ajax
- Data model design, database technologies (RDBMS & NoSQL), and languages such as SQL and PL/SQL
- Experience delivering or partnering with teams that ship AI products at high scale
- Experience in automated testing, including unit and functional testing using Java, JUnit, JSUnit, and Selenium
- Demonstrated track record of cultivating strong working relationships and driving collaboration across multiple technical and business teams to resolve critical issues
- Experience with the full software lifecycle in highly agile and ambiguous environments
- Excellent interpersonal and communication skills

Preferred Skills:
- Solid experience in API development, API lifecycle management, and/or client SDK development
- Experience with machine learning or cloud technology platforms like AWS SageMaker, Terraform, Spinnaker, EKS, and GKE
- Experience with AI/ML and data science, including predictive and generative AI
- Experience with data engineering, data pipelines, or distributed systems
- Experience with continuous integration (CI), continuous deployment (CD), and service ownership
- Familiarity with Salesforce APIs and technologies
- Ability to support and resolve production customer escalations with excellent debugging and problem-solving skills
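The RAG (retrieval-augmented generation) solutions this role covers pair a retriever with an LLM: fetch the documents most relevant to a query, then feed them to the model as context. A toy sketch of just the retrieval step, using bag-of-words cosine overlap as a stand-in for the embedding search a production system would use; all document text and names here are invented:

```python
import math
from collections import Counter

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by term overlap with the query (stand-in for vector search)."""
    q = Counter(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: cosine_similarity(q, Counter(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

docs = [
    "Agentforce lets you build AI agents on Salesforce data",
    "Quarterly revenue grew in the EMEA region",
]
print(retrieve("how do I build AI agents", docs))
```

The top-k documents would then be concatenated into the LLM prompt; swapping the scorer for dense embeddings changes nothing else in this flow.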

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

maharashtra

On-site

Role Summary:
As a Senior IT Analytics Specialist at a 100% cloud-based pharma startup in Mumbai, you will be responsible for driving data-driven decision-making, AI-powered automation, and cloud-based analytics across all business functions. Your primary focus will be on utilizing data from ERP, SFA, DMS, LIMS, HRMS, and Chemist Software to generate actionable insights, predictive analytics, and AI-driven forecasting tools for pharma operations. You should have hands-on experience in BI tools, AI/ML adoption, cloud analytics, API integrations, and data governance. Additionally, you will collaborate with outsourced vendors to ensure seamless data flow, security, and analytics-driven business intelligence.

Key Responsibilities:
- Collaborate with AI & Data Science teams to drive real-time analytics adoption
- Implement AI-driven forecasting tools for pharma sales, inventory, and demand planning
- Develop and support LLM-powered chatbots for customer service, sales insights, and operational automation
- Ensure seamless data lake connectivity for advanced cloud analytics and BI tools (Power BI, Qlik, Tableau)
- Serve as the single point of contact for all data analytics vendors and AI partners
- Negotiate SLAs, contracts, and performance benchmarks for outsourced IT analytics services
- Oversee system performance, data accuracy, and security updates for all analytics platforms
- Focus on data visualization, defining KPIs for each function, and ensuring analytics serve as a business enabler

Desired Candidate Profile

Must-Have Qualifications:
- 6-8 years of experience in IT analytics, cloud BI, and AI-driven decision-making
- Expertise in data pipelines, ETL workflows, and API integrations for enterprise systems
- Strong knowledge of BI tools (Power BI, Qlik, Tableau), SQL, and Python/R for data analytics
- Experience in cloud-based analytics (AWS, Azure, GCP) and data governance
- Knowledge of AI-driven insights, predictive modeling, and NLP-driven analytics tools

Good to Have:
- Experience in pharma, healthcare, or regulated environments
- Familiarity with data privacy laws (HIPAA, GDPR, DPDP Act India)
- Certifications in AWS, Azure, ITIL, CISSP, or AI/ML technologies

Required Qualification:
- Bachelor of Engineering/Bachelor of Technology (B.E./B.Tech.) in IT/CS/E&CE, or Bachelor of Computer Applications (B.C.A.)

Please note that this position is based in Mumbai and offers the opportunity to work in a 100% cloud-based pharma setup as part of a giant Indian conglomerate.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

You are a skilled QA / Data Engineer with 3-5 years of experience, joining a team focused on ensuring the quality and reliability of data-driven applications. Your expertise lies in manual testing and SQL, with additional knowledge of automation and performance testing being highly valuable. Your responsibilities include performing thorough testing and validation to guarantee the integrity of the applications.

Must-have skills:
- Extensive experience in manual testing within data-centric environments.
- Strong SQL skills for data validation and querying.
- Familiarity with data engineering concepts such as ETL processes, data pipelines, and data warehousing.
- Experience with Geo-Spatial data.
- A solid understanding of QA methodologies and best practices for software and data testing.
- Excellent communication skills.

Good-to-have skills:
- Experience with automation testing tools and frameworks such as Selenium and JUnit for data pipelines.
- Knowledge of performance testing tools such as JMeter and LoadRunner for evaluating data systems.
- Familiarity with data engineering tools and platforms such as Apache Kafka, Apache Spark, and Hadoop.
- Understanding of cloud-based data solutions (AWS, Azure, Google Cloud) and their testing methodologies.
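To illustrate the kind of SQL-driven data validation this role calls for, the sketch below reconciles row counts, aggregates, and keys between a source table and its ETL target. The schema, table names, and data are all invented for the example; a real check would run against the production databases.

```python
import sqlite3

# In-memory database standing in for a real source/target pair (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def validate(conn):
    """Return a dict of validation checks comparing source and target."""
    checks = {}
    # Row-count reconciliation: the target should not drop or duplicate rows.
    src_n, = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()
    tgt_n, = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()
    checks["row_count_match"] = src_n == tgt_n
    # Aggregate reconciliation: totals should agree to the cent.
    src_sum, = conn.execute("SELECT ROUND(SUM(amount), 2) FROM src_orders").fetchone()
    tgt_sum, = conn.execute("SELECT ROUND(SUM(amount), 2) FROM tgt_orders").fetchone()
    checks["amount_sum_match"] = src_sum == tgt_sum
    # Key-level diff: ids present in the source but missing from the target.
    missing = conn.execute(
        "SELECT id FROM src_orders EXCEPT SELECT id FROM tgt_orders"
    ).fetchall()
    checks["missing_ids"] = [row[0] for row in missing]
    return checks

result = validate(conn)
print(result)  # → {'row_count_match': True, 'amount_sum_match': True, 'missing_ids': []}
```

In practice the same three checks (counts, aggregates, key diffs) generalize to any ETL hand-off, which is why they are a common first pass before deeper column-level validation.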

Posted 3 weeks ago

Apply

7.0 - 12.0 years

0 Lacs

maharashtra

On-site

As a Lead Data Engineer, you will leverage your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, data warehousing, data marts, data lakes, big data, cloud (AWS), and data governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

You should have experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a big data environment is necessary, with a preference for Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team.

Regards, Omkar
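The role centers on building and maintaining ETL applications and data pipelines. As a rough, framework-free sketch of the extract-transform-load pattern (real pipelines would read from S3 or Kafka and be orchestrated by Airflow; the file contents, table, and column names here are invented):

```python
import csv
import io
import sqlite3

# Extract: a CSV string stands in for a real source such as S3 or a Kafka topic.
RAW = """id,amount,currency
1,10.00,USD
2,bad,USD
3,7.50,EUR
"""

def extract(text):
    """Parse raw CSV text into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Keep only rows with parseable fields; normalize types."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append((int(row["id"]), float(row["amount"]), row["currency"]))
        except ValueError:
            rejected.append(row)  # quarantine bad records instead of failing the batch
    return clean, rejected

def load(conn, records):
    """Load cleaned records into the target table and return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW))
loaded = load(conn, clean)
print(loaded, len(rejected))  # → 2 1
```

The quarantine-rather-than-fail choice in `transform` mirrors a common production design: a single malformed record should not abort the whole batch, but it should be counted and surfaced.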

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As the Director/Head of Data Engineering for India, you will be responsible for developing and maintaining the data strategy for the Singapore implementation. Your primary goal will be to create a model implementation that can be replicated across the wider PBWM organisation for compliance in other jurisdictions. You will define and execute the data engineering strategy in alignment with business goals and technology roadmaps, and collaborate with the Chief Data Officer/Chief Operating Officer to understand the Critical Data Elements (CDE) and establish controls around them.

Your role will involve designing data models and efficient data pipelines, ensuring data quality and integrity, collaborating with data science and analytics teams, and scaling data solutions. Additionally, you will oversee data security and compliance, continuously learn and implement the latest technologies, manage and train the data engineering team, and implement cloud migration for data with appropriate hydrations. Budgeting, resource allocation, implementing data products, ensuring data reconciliation, and upholding high standards and quality in data are also key aspects of this role.

In this strategic and senior leadership position, you will oversee data strategy, data engineering, data infrastructure, and data management practices within Private Banking and Wealth Management. Your responsibilities will include managing and developing the data team, delivering outstanding customer-focused service, ensuring quality and quantity are equally prioritized, adhering to policies and procedures, and advocating Barclays values and principles. You will lead effective data management, compliance, and analytics to support business goals, enhance customer experiences, and improve operational efficiencies. Recruiting, training, and developing the data engineering team, fostering collaboration and innovation, providing strategic guidance, and defining KPIs aligned with PBWM goals will also be part of your duties.

Collaborating with executive leadership, you will ensure data initiatives support the bank's growth, profitability, and risk management. You will oversee budgeting for data-related initiatives, allocate resources efficiently, and track performance indicators for the data engineering team and infrastructure to drive continuous improvement.

The purpose of your role is to build and maintain systems that collect, store, process, and analyze data to ensure accuracy, accessibility, and security. Your accountabilities will include building and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models.

As a Director, you are expected to manage a business function, contribute to strategic initiatives, provide expert advice, manage resourcing and budgeting, ensure compliance, and monitor external environments. Demonstrating leadership behaviours such as listening, inspiring, aligning, and developing others, along with upholding Barclays Values and Mindset, will be key to excelling in this role.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

ahmedabad, gujarat

On-site

We are looking for a highly experienced Senior AI/ML Engineer with a strong background in cloud infrastructure and IoT to join our dynamic team in Ahmedabad. As an ML Engineer/MLOps Engineer, your primary responsibility will be to lead the design, development, and deployment of AI/ML solutions on cloud platforms, integrate IoT technologies, and drive innovation in intelligent systems. Your role will involve leading the end-to-end development and deployment of AI/ML solutions on cloud platforms such as AWS for applications like predictive analytics, anomaly detection, and intelligent automation. You will also be responsible for integrating IoT sensors, devices, and protocols like MQTT into AI/ML solutions to enable intelligent decision-making, remote monitoring, and control in IoT environments. Additionally, you will architect and optimize cloud infrastructure for AI workloads, design data pipelines for IoT data ingestion, and train machine learning models for various tasks using state-of-the-art algorithms and frameworks like PyTorch. Collaboration with cross-functional teams, including data scientists, software engineers, and product managers, is essential to define project requirements, develop prototypes, and deliver scalable AI/ML solutions that align with business objectives. You will need to implement algorithms and models proficiently using programming languages like Python, R, or Java and leverage relevant libraries and frameworks for computer vision tasks. Furthermore, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or related field, along with 3-5 years of proven experience as an AI/ML Engineer. Strong programming skills in Python, familiarity with AI/ML libraries/frameworks, IoT development, containerization, and serverless computing are crucial requirements for this role. A deep understanding of machine learning algorithms, model training, validation, and deployment best practices is also essential. 
If you are willing to relocate to Ahmedabad and meet the educational and experience requirements mentioned above, we encourage you to apply for this full-time ML Engineer/MLOps Engineer position.
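The listing mentions anomaly detection over IoT telemetry. As a hedged, stdlib-only illustration of the idea (real deployments would use streaming ingestion and a trained model; the readings and threshold below are made up), a z-score filter flags sensor values far from the mean:

```python
import statistics

def zscore_anomalies(readings, threshold=2.5):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return []  # all readings identical: nothing to flag
    return [x for x in readings if abs(x - mean) / stdev > threshold]

# Simulated temperature telemetry with one obvious spike.
telemetry = [21.0, 21.2, 20.9, 21.1, 21.3, 20.8, 21.0, 35.0, 21.2, 21.1]
print(zscore_anomalies(telemetry))  # → [35.0]
```

Note a known limitation of this sketch: the spike itself inflates the mean and standard deviation, so for heavily contaminated data a robust statistic (such as the median absolute deviation) is usually preferred.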

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

The Lead Platform Engineer - Gen AI at Elanco, based in Bengaluru, India, will be a part of the Software Engineering & Platforms team. Reporting to the IT Engineering Associate Director, you will be instrumental in driving the direction of platform and automation capabilities, specifically focusing on generative AI and its implementation at Elanco. Your role will involve collaborating with a diverse team to work on cutting-edge engineering initiatives that ensure secure, reliable, and efficient solutions using the latest technology. As a successful candidate, you must possess a highly motivated and innovative mindset, with the ability to articulate complex technical topics, collaborate with external partners, and ensure the quality delivery of solutions. You will have the opportunity to contribute to the growth of a highly skilled engineering organization and play a key role in shaping the future of GenAI capabilities at Elanco. Your responsibilities will include staying updated on the latest AI research and technologies, contributing to continuous improvement and innovation within the team, identifying opportunities for enhancing application team and developer experience, and working collaboratively with various stakeholders to deliver high-quality technical solutions. Additionally, you will be responsible for building and running GenAI capabilities, supporting distributed teams on AI/ML consumption, and maintaining robust support processes. To qualify for this role, you must have a minimum of 2+ years of hands-on experience in Generative AI and LLMs, along with a total of 8+ years of experience as a Software Engineer. Proficiency in programming languages such as Python, TensorFlow, PyTorch, and other AI/ML frameworks is essential, as is a strong understanding of neural networks, NLP, computer vision, and other AI domains. 
Experience with cloud platforms and AI tools (e.g., Google Cloud, Azure), familiarity with AI technologies and data pipelines, and experience deploying AI solutions in real-world applications are also required. It would be beneficial to have experience with cloud-native design patterns; core technologies like Terraform, Ansible, and Packer; cloud cognitive services; AI/embeddings technologies; modern application architecture methodologies; and API-centric design. Knowledge of authentication and authorization protocols, AI security, model evaluation, and safety will be advantageous.

This role offers an exciting opportunity to work on cutting-edge technology, contribute to the growth of a new engineering organization, and make a significant impact on Elanco's AI capabilities. If you are passionate about innovation, collaboration, and driving tangible outcomes, we encourage you to apply for this role.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

Job Summary: We are seeking a highly motivated and skilled Project Manager to lead and manage AI-driven projects from conception to deployment. The ideal candidate will bridge the gap between business, data science, and engineering teams, ensuring smooth project execution, alignment with business objectives, and timely delivery. This role requires a deep understanding of machine learning (ML) concepts and AI technologies, coupled with strong project management experience.

Key Responsibilities:
- Project Planning & Execution: Lead the planning, execution, and delivery of machine learning and AI projects from ideation through production. Define project scope, goals, deliverables, and timelines in collaboration with stakeholders. Establish clear project roadmaps, milestones, and resource allocation plans.
- Stakeholder Management: Serve as the primary liaison between cross-functional teams, including data scientists, ML engineers, product managers, and business stakeholders. Translate complex AI/ML concepts into business-friendly language for non-technical stakeholders. Manage expectations and ensure alignment between technical capabilities and business objectives.
- Team Leadership & Collaboration: Coordinate cross-functional teams to drive project success, ensuring collaboration between engineers, data scientists, analysts, and business teams. Ensure that team members have clear objectives and are working efficiently toward the project's success.
- Risk Management & Problem Solving: Identify project risks and develop mitigation strategies. Solve project bottlenecks, troubleshoot technical challenges, and ensure continuous progress.
- Quality Control & Compliance: Ensure quality control throughout the ML/AI project lifecycle, including model performance and validation. Ensure adherence to ethical AI practices and compliance with data privacy regulations.
- Reporting & Documentation: Track project progress, provide regular status reports to stakeholders, and present key findings. Maintain detailed project documentation, including timelines, deliverables, and performance metrics.

Qualifications:
- Education & Experience: Bachelor's degree in Computer Science, Data Science, Engineering, or a related field; a Master's degree or higher is a plus. 3+ years of experience in project management, with at least 1-3 years of experience in AI/ML-related projects.
- Technical Skills: Familiarity with machine learning frameworks (e.g., TensorFlow, PyTorch), data science tools (e.g., Jupyter, R, Python), and cloud platforms (AWS, Google Cloud, Azure). Understanding of AI/ML workflows, from data collection and preprocessing to model training, deployment, and monitoring. Basic understanding of algorithms, statistical models, data pipelines, and evaluation metrics.
- Project Management Skills: Proven track record of managing AI or software development projects through agile or traditional project management methodologies. Strong knowledge of project management tools (e.g., Jira, Asana, Trello) and version control systems (e.g., Git).
- Soft Skills: Excellent communication and interpersonal skills, with the ability to lead and inspire cross-functional teams. Strong problem-solving skills and an ability to navigate ambiguity and technical complexity. High level of organization and attention to detail.
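The qualifications above mention evaluation metrics. As a small illustration of what "basic understanding" looks like in practice, precision and recall for a binary classifier can be computed directly from predictions (the labels below are made up for the sketch; real projects would typically use scikit-learn):

```python
def precision_recall(y_true, y_pred):
    """Compute precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of flagged items, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of true positives, how many were found
    return precision, recall

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
p, r = precision_recall(y_true, y_pred)
print(round(p, 2), round(r, 2))  # → 0.75 0.75
```

For a project manager, the practical point is the trade-off these two numbers encode: raising a model's decision threshold usually raises precision and lowers recall, and which side matters more is a business decision, not a modeling one.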

Posted 3 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

karnataka

On-site

About Us: Thoucentric is the consulting arm of Xoriant, a prominent digital engineering services company with 5000+ employees. We are headquartered in Bangalore with a presence across multiple locations in India, the US, UK, Singapore, and Australia. As the consulting business of Xoriant, we help clients with business consulting, program and project management, digital transformation, product management, and process and technology solutioning and execution, including analytics and emerging tech, cutting across functional areas such as supply chain, finance and HR, and sales and distribution across the US, UK, Singapore, and Australia. Our unique consulting framework allows us to focus on execution rather than pure advisory. We work closely with marquee names in the global consumer and packaged goods (CPG) industry, new-age tech, and the start-up ecosystem.

Xoriant (the parent entity) started in 1990 and is a Sunnyvale, CA headquartered digital engineering firm with offices in the USA, Europe, and Asia, backed by ChrysCapital, a leading private equity firm. Our strengths are now combined with Xoriant's capabilities in AI and data, cloud, security, and operations services, proven over 30 years. We have been certified as a "Great Place to Work" by AIM and ranked among the "50 Best Firms for Data Scientists to Work For". We have an experienced consulting team of over 450+ world-class business and technology consultants based across six global locations, supporting clients through their expert insights, entrepreneurial approach, and focus on delivery excellence. We have also built point solutions and products through Thoucentric Labs using AI/ML in the supply chain space.

Job Description: At Thoucentric, we work on various problem statements. The most popular ones are:
- Building capabilities that address a market need, based on our ongoing research efforts.
- Solving a specific use case for a current or potential client based on challenges on the ground.
- Developing new systems that help us be a better employer and a better partner to clients.

All of these need the best minds working on them day-to-day, and we do exactly that! Your contribution to organization development is as important as outward-facing consulting. We are invested in both employee growth and client success!

Responsibilities:
- Manage end-to-end data pipelines, ensuring seamless flow and integrity of data from diverse sources to analytical systems.
- Repurpose existing business logic to integrate with new data sources, optimizing efficiency and consistency across different datasets.
- Implement robust data governance practices to maintain data quality standards and facilitate reliable analysis and reporting.
- Conduct thorough data validation procedures to ensure the accuracy and reliability of analytical outputs.
- Take ownership of designated data domains, overseeing all aspects of data management and analysis to drive informed decision-making.
- Perform end-to-end analysis across all digital touchpoints, including data gathering from large and complex data sets, data processing, and analysis.
- Conduct in-depth analysis of user behaviour, customer journeys, and other relevant metrics to understand the effectiveness of digital initiatives and identify areas for improvement.
- Present findings from analytics and research and make recommendations to business teams.

Requirements (must-have; Data Analyst, 0-2 years of experience):
- Good with SQL for data analysis and building end-to-end data pipelines; ability to write complex queries and understanding of database concepts.
- Strong analytical problem-solving skills and an aptitude for learning quickly.
- Expert in data analysis and presentation skills.
- Exceptional communication and collaboration skills.

Education: Bachelor's with post-graduation in Management Science and related fields; 0-2 years of relevant experience in analytics organizations of large corporates or in consulting companies in analytics roles.

Benefits - What a consulting role at Thoucentric will offer you:
- Opportunity to define your career path, not one enforced by a manager.
- A great consulting environment with a chance to work with Fortune 500 companies and startups alike.
- A dynamic but relaxed and supportive working environment that encourages personal development.
- Be part of one extended family: we bond beyond work through sports, get-togethers, common interests, etc.
- Work in a very enriching environment with an open culture, flat organization, and excellent peer group.
- Be part of the exciting growth story of Thoucentric!

Required Skills: SQL, advanced SQL
Practice Name: Digital | Work Mode: Hybrid | Industry: Consulting
Corporate Office: Thoucentric, The Hive, Mahadevapura, Bangalore, Karnataka, India - 560048
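The role asks for SQL-driven analysis of user behaviour across digital touchpoints. As a toy illustration of the kind of query involved (the clickstream schema and data below are invented), a daily-active-users-per-channel KPI is a single grouped aggregation:

```python
import sqlite3

# Toy clickstream table standing in for a real digital-touchpoint dataset.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, event_date TEXT, channel TEXT);
    INSERT INTO events VALUES
        (1, '2024-01-01', 'web'), (2, '2024-01-01', 'app'),
        (1, '2024-01-02', 'web'), (3, '2024-01-02', 'app'),
        (2, '2024-01-02', 'app'), (1, '2024-01-02', 'app');
""")

# Daily active users per channel: COUNT(DISTINCT ...) deduplicates repeat visits.
rows = conn.execute("""
    SELECT event_date, channel, COUNT(DISTINCT user_id) AS dau
    FROM events
    GROUP BY event_date, channel
    ORDER BY event_date, channel
""").fetchall()
print(rows)
```

The `DISTINCT` inside the aggregate is the important detail: user 1 appears three times on 2024-01-02 but should count once per channel, which is exactly the deduplication a DAU metric requires.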

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

ahmedabad, gujarat

On-site

Key Responsibilities:

Technology Leadership & Strategy
- Define and execute the technical vision, ensuring alignment with business objectives.
- Oversee architecture decisions, the technology stack, and software development best practices.
- Ensure security, scalability, and compliance of all technology solutions.
- Guide technical teams in best practices for .NET, MEAN, MERN, and cloud-based development.

Product & Engineering Management
- Lead the engineering and product teams, setting priorities and ensuring timely delivery of high-quality products.
- Manage the full software development lifecycle (SDLC) from ideation to deployment.
- Collaborate with stakeholders to translate business needs into technical solutions.
- Manage the architecture and development of microservices-based solutions in .NET, MEAN, and MERN stacks.
- Establish and enforce engineering best practices, CI/CD, DevOps, and agile methodologies.

Cloud & DevOps Strategy (Azure & CI/CD)
- Oversee Azure cloud infrastructure, ensuring high availability and cost optimization.
- Implement and manage CI/CD pipelines for automated builds, testing, and deployments.
- Establish DevOps best practices, including containerization (Docker, Kubernetes) and infrastructure as code (Terraform, ARM templates).
- Ensure compliance with security best practices and regulatory requirements.

Data Engineering & Analytics
- Lead the architecture and implementation of data pipelines, ETL processes, and big data solutions.
- Ensure proper data governance, security, and compliance.
- Implement AI/ML and analytics-driven solutions to enhance product offerings.
- Optimize data warehousing strategies using Azure Data Services, SQL, and NoSQL databases.

Team Development & Leadership
- Recruit, mentor, and develop a high-performing technology team.
- Foster a culture of innovation, accountability, and continuous improvement.
- Ensure strong collaboration between engineering, product, UX/UI, and other business units.

Scalability & Infrastructure
- Optimize system architecture for performance, scalability, and resilience.
- Ensure effective infrastructure management, including cloud services (AWS/Azure/GCP).
- Drive automation and efficiency in software deployment and monitoring.

Stakeholder Communication & Collaboration
- Act as a bridge between business and technology, ensuring alignment of goals.
- Present technical strategies and updates to senior leadership and key stakeholders.
- Work closely with sales, customer success, and marketing teams to support go-to-market strategies.

Innovation & Emerging Technologies
- Keep up with industry trends, emerging technologies, and regulatory requirements.
- Evaluate and implement new technologies to enhance product capabilities and efficiency.

Required Qualifications:
- Experience: 10+ years in software development, with 5+ years in a leadership role.
- Technical Expertise: Strong experience in enterprise SaaS and cloud-native application development.
- Leadership: Proven experience in managing engineering teams and driving product innovation.
- Product Mindset: Experience in product lifecycle management, from ideation to execution.
- Industry Knowledge: Familiarity with healthcare, pharma, or compliance-driven industries is a plus.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Location: Ahmedabad

Posted 3 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

karnataka

On-site

As a skilled AI expert, you will partner with Product Managers, Engineers, and other key stakeholders to deeply understand business requirements and translate them into actionable technical roadmaps. You will identify and prioritize AI use cases aligned with organizational goals, develop a scalable and sustainable implementation roadmap, and conduct ROI analysis for on-prem LLM deployments.

Your role will involve creating sophisticated software designs driven by AI-powered experiences, focusing on performance, scalability, security, reliability, and ease of maintenance. You will define and develop complex enterprise applications through AI agentic frameworks, ensuring responsiveness, responsibility, traceability, and reasoning. Utilizing modeling techniques like UML and Domain-Driven Design, you will visualize intricate relationships between components and ensure seamless integrations. Leading large-scale platform projects, you will deliver no-code workflow management, HRMS, collaboration, search engine, document management, and other services for employees. Championing automated testing, continuous integration/delivery pipelines, MLOps, and agile methodologies across multiple teams will also be a key aspect of your role.

To excel in this position, you should hold a B.Tech/M.Tech/PhD in Computer Science with specialization in AI/ML. A proven track record of leading large-scale digital transformation projects is required, along with a minimum of 3+ years of hands-on experience in building AI-based applications using agentic frameworks. With a minimum of 12-14 years of experience in software design and development, you should have expertise in designing and developing applications and workflows using the ACE framework. Your skillset should include hands-on experience in developing AI agents based on heterogeneous frameworks such as LangGraph, AutoGen, Crew AI, and others.

You should also be proficient in selecting and fine-tuning LLMs for enterprise needs and designing efficient inference pipelines for system integration. Expertise in Python programming, developing agents/tools in an AI agentic framework, and building data pipelines for structured and unstructured data is essential. Additionally, experience in leveraging technologies like RAG (Retrieval Augmented Generation), vector databases, and other tools to enhance AI models is crucial. Your ability to quickly learn and adapt to the changing technology landscape, combined with past experience in the .NET Core ecosystem and front-end development featuring Angular/React/JavaScript/HTML/CSS, will be beneficial. Hands-on experience managing full-stack web applications built upon Graph/RESTful APIs and microservice-oriented architectures, and familiarity with large-scale data ecosystems, are also required. Additionally, you should be skilled in platform telemetry capture, ingestion, and intelligence derivation, with a track record of effectively mentoring peers and maintaining exceptional attention to detail throughout the SDLC. Excellent verbal and written communication abilities, as well as outstanding presentation and public speaking talents, are necessary to excel in this role.

Please note: Beware of recruitment scams.
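The listing mentions RAG and vector databases. The retrieval half of RAG reduces to nearest-neighbour search over embeddings; the toy sketch below shows that step with hand-made three-dimensional vectors (in a real system the vectors would come from an embedding model and live in a vector database, and the document texts here are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy document "embeddings": vector plus source text, keyed by a made-up document id.
docs = {
    "leave_policy": ([0.9, 0.1, 0.0], "Employees accrue 1.5 leave days per month."),
    "expense_policy": ([0.1, 0.9, 0.1], "Expenses above $500 need manager approval."),
    "it_helpdesk": ([0.0, 0.2, 0.9], "Raise IT tickets through the internal portal."),
}

def retrieve(query_vec, k=1):
    """Return the top-k document texts ranked by cosine similarity to the query."""
    ranked = sorted(docs.items(), key=lambda kv: cosine(query_vec, kv[1][0]), reverse=True)
    return [text for _, (_, text) in ranked[:k]]

# A query embedding close to the expense-policy vector; the retrieved text would then
# be injected into the LLM prompt as grounding context (the generation step of RAG).
context = retrieve([0.2, 0.8, 0.2])
print(context)  # → ['Expenses above $500 need manager approval.']
```

Everything a vector database adds on top of this (approximate-nearest-neighbour indexes, filtering, sharding) exists to make exactly this ranking fast at scale.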

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Azure Data Engineer Junior at dotSolved, you will design, implement, and manage scalable data solutions on Azure. Your primary focus will be on building and maintaining data pipelines, integrating data from various sources, and ensuring data quality and security. Proficiency in Azure services such as Data Factory, Databricks, and Synapse Analytics is essential as you optimize data workflows for analytics and reporting purposes. Collaboration with stakeholders is a key aspect of this role, ensuring alignment with business goals and performance standards.

Your responsibilities will include:
- Designing, developing, and maintaining data pipelines and workflows using Azure services.
- Implementing data integration, transformation, and storage solutions to support analytics and reporting.
- Ensuring data quality, security, and compliance with organizational and regulatory standards.
- Optimizing data solutions for performance, scalability, and cost efficiency.
- Collaborating with cross-functional teams to gather requirements and deliver data-driven insights.

This position is based in Chennai and Bangalore, offering you the opportunity to work in a dynamic and innovative environment where you can contribute to the digital transformation journey of enterprises across various industries.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Lead Consultant - Data Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products.
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers.
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services.
- Build and implement machine learning models and prototype solutions for proof-of-concept.
- Scale existing ML models into production on a variety of cloud platforms.
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams.
- Design and develop data pipelines: create efficient data pipelines to collect, process, and store large volumes of data from various sources.
- Implement data solutions: develop and implement scalable data solutions using technologies like Hadoop, Spark, and SQL databases.
- Ensure data quality: monitor and improve data quality by implementing validation processes and error handling.
- Collaborate with teams: work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Optimize performance: continuously optimize data systems for performance, scalability, and cost-effectiveness.
- Experience in a GenAI project.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in computer science engineering or information technology, or a BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus.
- Integration: APIs, microservices, and ETL/ELT patterns.
- DevOps (good to have): Ansible, Jenkins, ELK.
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Airflow, Step Functions, Control-M, etc.
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: AWS, GCP, Azure, and cloud-native.
- Analytics and ML tooling: SageMaker, ML Studio.
- Execution paradigms: low-latency/streaming and batch.

Preferred Qualifications / Skills:
- Data platforms: Big Data (Hadoop, Spark, Hive, Kafka, etc.) and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.).
- Visualization tools: Power BI, Tableau.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
