
267 Data Pipelines Jobs - Page 11

JobPe aggregates listings for easy access; you apply directly on the original job portal.

3 - 8 years

5 - 9 Lacs

Kochi

Work from Office

We are looking for a skilled professional with 3 to 8 years of experience to join our team as an EY Data Engineer in Bengaluru. The ideal candidate will have a strong background in data engineering, analytics, and reporting.

### Roles and Responsibilities
- Collaborate with cross-functional teams to design and implement data solutions.
- Develop and maintain large-scale data pipelines using tools such as Azure Data Factory and Azure Synapse.
- Design and implement data models and architectures to support business intelligence and analytics.
- Work with stakeholders to understand business requirements and develop solutions that meet their needs.
- Ensure data quality and integrity by implementing data validation and testing procedures.
- Optimize data storage and retrieval processes for improved performance and efficiency.

### Job Requirements
- Strong knowledge of data modeling, architecture, and visualization techniques.
- Experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
- Proficiency in programming languages such as Python, Java, and SQL.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Strong understanding of data governance principles and practices.
- A B.Tech/B.E. degree is required; a higher professional or master's qualification is preferred.
- Active membership in related professional bodies or industry groups is preferred.
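The "data validation and testing" duty above can be sketched in miniature. This is an illustrative, dependency-free example; the field names and rules are invented, not from the posting.

```python
# Illustrative pipeline validation step: split a batch into valid rows and
# rows that fail basic checks before loading. Field names are hypothetical.

def validate_rows(rows, required_fields=("id", "amount")):
    """Return (valid, rejected) for a batch of dict rows."""
    valid, rejected = [], []
    for row in rows:
        errors = [f for f in required_fields if row.get(f) in (None, "")]
        if not errors and not isinstance(row.get("amount"), (int, float)):
            errors.append("amount:not_numeric")
        (rejected if errors else valid).append(row)
    return valid, rejected

batch = [
    {"id": 1, "amount": 99.5},
    {"id": 2, "amount": None},    # missing value -> rejected
    {"id": 3, "amount": "oops"},  # non-numeric -> rejected
]
good, bad = validate_rows(batch)
```

In a real Azure Data Factory or Synapse pipeline, a step like this would typically run as a notebook or data-flow activity, with rejected rows routed to a quarantine table.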

Posted 2 months ago

Apply

10 - 15 years

20 - 25 Lacs

Kolkata

Work from Office

We are looking for a skilled Solution Architect with 10 to 15 years of experience to join our team in Bengaluru. The role involves designing and implementing scalable, reliable, and high-performing data architecture solutions.

### Roles and Responsibilities
- Design and develop data architecture solutions that meet business requirements.
- Collaborate with stakeholders to identify needs and translate them into technical data solutions.
- Provide technical leadership and support to software development teams.
- Define and implement data management policies, procedures, and standards.
- Ensure data quality and integrity through data cleansing and validation.
- Develop and implement data security and privacy policies, ensuring compliance with regulations such as GDPR and HIPAA.
- Design and implement data migration plans from legacy systems to the cloud.
- Build data pipelines and workflows using Azure services such as Azure Data Factory, Azure Databricks, and Azure Stream Analytics.
- Develop and maintain data models and database schemas aligned with business requirements.
- Evaluate and select appropriate data storage technologies, including Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Storage.
- Troubleshoot data-related issues and provide technical support to data users.
- Stay current with trends and developments in data architecture and recommend improvements.
- Coordinate with multiple teams to ensure smooth operations.

### Job Requirements
- Proven experience as a Technical/Data Architect with over 10 years of product and solutions development experience.
- Hands-on experience with software/product architecture, design, development, testing, and implementation.
- Excellent communication, problem-solving, organizational, and leadership skills, with the ability to work effectively in a team environment.
- Experience with Agile development methodology and strategic development/deployment methodologies.
- Understanding of source control (Git/VSTS), continuous integration/continuous deployment, and information security.
- Hands-on experience with cloud-based (Azure) product and platform development and implementation.
- Experience designing and working with data lakes, data warehouses, and Azure-based ETL tools.
- Expertise in Azure data analytics, with a thorough understanding of the Azure data platform tools.
- Hands-on experience with Azure services such as Data Factory, Databricks, Synapse, Data Lake Storage Gen2, Stream Analytics, Azure Spark, Azure ML, SQL Server, and Cosmos DB.
- Hands-on experience in information management and business intelligence projects, handling large client data sets across transfer, ingestion, processing, analysis, and visualization.

Posted 2 months ago

Apply

9 - 13 years

14 - 19 Lacs

Kolkata

Work from Office

We are looking for a skilled professional with strong AI-enabled automation skills and an interest in applying AI in the process automation space. The ideal candidate will have 9 to 13 years of relevant experience. This position is based in Bengaluru.

### Roles and Responsibilities
- Develop and implement AI-enabled automation solutions aligned with business objectives.
- Design and deploy Proofs of Concept (POCs) and Points of View (POVs) across industry verticals, demonstrating the potential of AI-enabled automation applications.
- Ensure seamless integration of optimized solutions into the overall product or system.
- Collaborate with cross-functional teams to understand requirements and integrate solutions into cloud environments.
- Educate team members on best practices and stay current with technology advancements to bring innovative solutions to projects.
- Work closely with stakeholders to identify opportunities for process improvement and implement changes using AI technologies.

### Job Requirements
- Proficiency in Python and frameworks such as PyTorch, TensorFlow, and Hugging Face Transformers.
- Strong foundation in ML algorithms, feature engineering, and model evaluation.
- Experience with deep learning: neural networks, RNNs, CNNs, LSTMs, Transformers (BERT, GPT), and NLP.
- Experience with GenAI technologies such as LLMs (GPT, Claude, LLaMA), prompting, and fine-tuning.
- Knowledge of retrieval-augmented generation (RAG) and knowledge-graph RAG.
- Experience with multi-agent orchestration, memory, and tool integrations.
- Experience implementing MLOps practices and tools (CI/CD for ML, containerization, orchestration, model versioning, and reproducibility).
- Experience with cloud platforms (AWS, Azure, GCP) for scalable ML model deployment.
- Good understanding of data pipelines, APIs, and distributed systems.
- Ability to build observability into AI systems: latency, drift, and performance metrics.
- Strong written and verbal communication, presentation, client service, and technical writing skills in English for both technical and business audiences.
- Strong analytical, problem-solving, and critical thinking skills.
- Ability to work under tight timelines on multiple project deliveries.
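The retrieval-augmented generation (RAG) requirement above reduces to a simple pattern: retrieve relevant context, then assemble it into a prompt. A toy sketch using word overlap in place of embeddings; the documents and scoring are illustrative only, and real systems use a vector store.

```python
# Toy RAG retrieval: rank documents by word overlap with the query,
# then stuff the best match into a prompt template.

def retrieve(query, docs, k=1):
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

docs = [
    "Invoices are processed nightly by the finance batch job.",
    "The VPN requires multi-factor authentication to connect.",
]
context = retrieve("how are invoices processed", docs)[0]
prompt = (
    "Answer using only this context:\n"
    f"{context}\n\n"
    "Question: how are invoices processed?"
)
```

Swapping the overlap score for cosine similarity over embeddings, and the list for a vector database, gives the production shape of the same pattern.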

Posted 2 months ago

Apply

8 - 11 years

14 - 19 Lacs

Thiruvananthapuram

Work from Office

We are looking for a skilled professional with 8 to 11 years of industry experience to lead the migration of our data analytics environment from Teradata to Snowflake, focusing on performance and reliability. The ideal candidate will have strong technical expertise in big data engineering and hands-on experience with Snowflake.

### Roles and Responsibilities
- Lead the migration of data analytics environments from Teradata to Snowflake, emphasizing performance and reliability.
- Design and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse.
- Develop and migrate existing on-premises ETL routines to cloud services.
- Collaborate with senior leaders to understand business goals and contribute to workstream delivery.
- Design and optimize model code for faster execution.
- Work with cross-functional teams to ensure seamless integration of data analytics solutions.

### Job Requirements
- Minimum 8 years of experience as an architect on analytics solutions.
- Strong technical experience with Snowflake, including modeling, schema, and database design.
- Experience integrating with third-party tools, ETL, and DBT tools.
- Proficiency in programming languages such as Java, Scala, or Python.
- Excellent written and verbal communication skills, with the ability to explain complex technical concepts.
- Flexible, proactive working style with strong personal ownership of problem resolution.
- A computer science degree or equivalent is required.
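One small, mechanical piece of a Teradata-to-Snowflake migration like the one described above is DDL type translation. A hedged sketch; the mapping below is illustrative and deliberately incomplete, and a real migration validates every type, precision, and scale against both vendors' documentation.

```python
# Illustrative Teradata -> Snowflake type translation. Incomplete by design;
# length/precision handling is omitted for brevity.

TYPE_MAP = {
    "BYTEINT": "NUMBER(3,0)",      # Snowflake has no BYTEINT; NUMBER covers the range
    "INTEGER": "INTEGER",          # an alias of NUMBER(38,0) in Snowflake
    "VARCHAR": "VARCHAR",
    "TIMESTAMP": "TIMESTAMP_NTZ",  # pick an explicit time-zone behavior up front
}

def translate_type(td_type):
    """Map a Teradata base type name to a Snowflake type, passing through unknowns."""
    base = td_type.upper().split("(")[0]
    return TYPE_MAP.get(base, td_type)
```

Passing unknown types through unchanged (rather than guessing) keeps the failure visible at DDL-apply time instead of silently corrupting a column definition.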

Posted 2 months ago

Apply

3 - 6 years

14 - 19 Lacs

Bengaluru

Work from Office

We are looking for a skilled Data Analyst with 3 to 6 years of experience to join our team in Bengaluru. The ideal candidate will be responsible for transforming data into actionable insights and supporting decision-making in a global organization focused on pricing and commercial strategy.

### Roles and Responsibilities
- Analyze and interpret structured and unstructured data using statistical and quantitative methods to generate actionable insights and ongoing reports.
- Design and implement data pipelines and processes for data cleaning, transformation, modeling, and visualization using tools such as Power BI, SQL, and Python.
- Collaborate with stakeholders to define requirements, prioritize business needs, and translate problems into analytical solutions.
- Develop, maintain, and enhance scalable analytics solutions and dashboards that support pricing strategy and commercial decision-making.
- Identify opportunities for process improvement and operational efficiency through data-driven recommendations.
- Communicate complex findings clearly and concisely to both technical and non-technical audiences.

### Job Requirements
- Proven experience as a data analyst, business analyst, data engineer, or in a similar role.
- Strong analytical skills, with the ability to collect, organize, analyze, and present large datasets accurately.
- Foundational knowledge of statistics, including concepts such as distributions, variance, and correlation.
- Skill in documenting processes and presenting findings to both technical and non-technical audiences.
- Hands-on experience with Power BI for designing, developing, and maintaining analytics solutions.
- Proficiency in both Python and SQL, with strong programming and scripting skills.
- Skill in using Pandas, T-SQL, and Power Query M for querying, transforming, and cleaning data.
- Hands-on experience in data modeling for both transactional (OLTP) and analytical (OLAP) database systems.
- Strong visualization skills using Power BI and Python libraries such as Matplotlib and Seaborn.
- Experience defining and designing KPIs and aligning data insights with business goals.
- A bachelor's degree in a STEM field relevant to data analysis, data engineering, or data science is required.
- 3 to 6 years of experience in data analysis, data engineering, or a closely related field, ideally within a professional services environment.

### Additional Info
The company offers a dynamic work environment and opportunities for growth and development.

Posted 2 months ago

Apply

3 - 7 years

14 - 19 Lacs

Hyderabad

Work from Office

We are looking for a highly skilled and experienced Data Engineer with 3 to 7 years of experience to join our team in Bengaluru. The ideal candidate will have a strong background in building data pipelines on Google Cloud Platform (GCP) and hands-on experience with BigQuery, the GCP SDK, and API scripting.

### Roles and Responsibilities
- Design, develop, and implement data pipelines using BigQuery and the GCP SDK.
- Build and orchestrate Data Fusion pipelines for data migration from various databases.
- Develop scripts in Python and write technical architecture documentation.
- Implement monitoring architecture and test GCP services.
- Apply Agile and DevOps practices, including CI/CD pipelines and test-driven frameworks.
- Collaborate with cross-functional teams to deliver high-quality solutions.

### Job Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 3 years of experience in data engineering, preferably with GCP.
- Strong understanding of application architecture and programming languages.
- Experience with AWS, Azure, and/or GCP, with a proven track record of building complex infrastructure programmatically using IaC tooling or vendor libraries.
- Strong scripting and programming skills in Python and Linux shell.
- Experience with Agile and DevOps concepts, CI/CD pipelines, and test-driven frameworks.
- Experience working with CMMI, Agile, or SAFe methodologies.
- Experience creating technical architecture documentation.
- Experience with Linux OS internals, administration, and performance optimization.
- Google Professional Cloud Data Engineer certification is desirable.
- Proactive team player with good written and spoken English.
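Orchestrating pipelines, as this role describes, ultimately means running tasks in dependency order. A minimal sketch with the standard library's topological sorter; the task names are invented, and no GCP APIs are called here.

```python
# Dependency-ordered task scheduling with graphlib (Python 3.9+).
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (hypothetical names).
deps = {
    "load_bq": {"extract_src"},
    "validate": {"extract_src"},
    "report": {"load_bq", "validate"},
}
order = list(TopologicalSorter(deps).static_order())
# extract_src must run first; report must run last
```

Orchestrators such as Cloud Composer or Data Fusion perform this same ordering, plus retries, scheduling, and parallel execution of independent tasks.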

Posted 2 months ago

Apply

3 - 7 years

9 - 14 Lacs

Kochi

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our team, with 3 to 7 years of experience in modern data ecosystems. The ideal candidate will have hands-on proficiency in Informatica CDI, Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and Databricks.

### Roles and Responsibilities
- Provide daily application management support for the full data stack, addressing service requests, incidents, enhancements, and changes.
- Lead and coordinate resolution of complex data integration and analytics issues through thorough root cause analysis.
- Collaborate with technical and business stakeholders to support and optimize data pipelines, models, and dashboards.
- Maintain detailed documentation, including architecture diagrams, troubleshooting guides, and test cases.
- Remain flexible for shift-based work or on-call duties depending on client needs and critical business periods.
- Ensure seamless operation of data platforms and timely resolution of incidents.

### Job Requirements
- Bachelor's degree in Computer Science, Engineering, Data Analytics, or a related field, or equivalent work experience.
- Strong understanding of data governance, performance tuning, and cloud-based data architecture best practices.
- Excellent stakeholder collaboration skills to translate business needs into scalable technical solutions.
- Solid understanding of data pipeline management and optimization techniques.
- Experience integrating data from various sources, including ERP, CRM, POS, and third-party APIs.
- Familiarity with DevOps/CI-CD pipelines in a data engineering context.
- Certifications such as Informatica Certified Developer, Microsoft Certified: Azure Data Engineer Associate, and Databricks Certified Data Engineer are preferred.
- A passionate, proactive problem solver with a strong client orientation, eager to learn and grow in a fast-paced, global delivery environment.

### Additional Info
A chance to work alongside a world-class, multidisciplinary team delivering data excellence to global businesses.

Posted 2 months ago

Apply

2 - 7 years

9 - 13 Lacs

Kochi

Work from Office

We are looking for a highly skilled and experienced Azure Data Engineer with 2 to 7 years of experience to join our team. The ideal candidate will have expertise in Azure Synapse Analytics, PySpark, Azure Data Factory, ADLS Gen2, SQL DW, T-SQL, and other relevant technologies.

### Roles and Responsibilities
- Design, develop, and implement data pipelines using Azure Data Factory or Azure Synapse Analytics.
- Develop and maintain data warehouses or data lakes using various tools and technologies.
- Build workflows and pipelines in Azure Synapse Analytics to support business intelligence and analytics.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Ensure data quality and integrity by implementing data validation and testing procedures.
- Troubleshoot and resolve technical issues related to data engineering and analytics.

### Job Requirements
- Hands-on experience with Azure Data Factory or Azure Synapse Analytics is required.
- Experience handling data in datastores such as Azure SQL, T-SQL, and SQL DW is necessary.
- Ability to work with various data sources, including flat files, JSON, and databases.
- Strong analytical, interpersonal, and collaboration skills.
- Fair knowledge of Spark, Python, and data warehousing concepts.
- Experience with CI/CD and build automation for deployment is preferred.

Posted 2 months ago

Apply

4 - 7 years

6 - 10 Lacs

Kolkata

Work from Office

We are looking for a highly skilled and experienced Senior Data Scientist to join our team in Bengaluru. The ideal candidate will have 4 to 7 years of experience in data science, with a strong background in machine learning, deep learning, and natural language processing.

### Roles and Responsibilities
- Develop and implement innovative AI solutions using Python and other programming languages.
- Collaborate with cross-functional teams to design and deploy scalable data pipelines.
- Conduct complex data analysis and provide actionable insights to stakeholders.
- Design and develop predictive models to drive business growth and improvement.
- Work closely with solution architects on deploying AI POCs and scaling them up to production-level applications.
- Extract data from complex PDF and Word documents, including form entities, tables, and information comparison.

### Job Requirements
- Excellent academic background in data science, business analytics, statistics, engineering, operational research, or a related field.
- Strong proficiency in Python, with excellent coding skills and experience deploying open-source models.
- Experience in machine learning, deep learning, and natural language processing.
- Good understanding of SQL/NoSQL databases and their manipulation components.
- Ability to coordinate multiple projects and initiatives simultaneously through effective prioritization and organization.
- Proactive, organized, and self-sufficient, with the ability to prioritize and multitask.
- Knowledge of the firm's reporting tools and processes.
- Demonstrated project management experience.

### Additional Info
- A team with commercial acumen, technical experience, and enthusiasm to learn new things in a fast-moving environment.
- An opportunity to be part of a market-leading, multidisciplinary team of 7,200+ professionals in the only integrated global assurance business worldwide.
- Opportunities to work with EY GDS Assurance practices globally, with leading businesses across a range of industries.

Posted 2 months ago

Apply

10 - 18 years

12 - 22 Lacs

Pune, Bengaluru

Hybrid

Hi, we are hiring for the role of AWS Data Engineer with one of the leading organizations, for Bangalore and Pune.

Experience: 10+ years
Location: Bangalore and Pune
CTC: Best in the industry

Job Description (technical skills):
- PySpark coding skills
- Proficiency in AWS data engineering services
- Experience designing data pipelines and data lakes

If interested, kindly share your resume at nupur.tyagi@mounttalent.com

Posted 2 months ago

Apply

6 - 8 years

30 - 40 Lacs

Gurugram

Work from Office

Job Title: Senior Product Manager / Principal Product Manager, Platform & Data Products (GenAI)
Location: Gurgaon
Experience: 6–8 years

About the Role:
We are looking for a highly motivated and strategic Product Manager with 6–8 years of experience building platform products and data-driven solutions, and at least 6–12 months of hands-on experience with Generative AI (GenAI). You will own the end-to-end product lifecycle, from ideation and strategy to execution and go-to-market, for core data and AI platform components, enabling internal teams and external partners to build powerful, scalable solutions. You will own the product roadmap for platform and data products, ensuring alignment with business strategy.

Key Responsibilities:
- Work closely with engineering, data science, design, and business teams to build scalable APIs, SDKs, and platform capabilities.
- Define, prioritize, and deliver product features leveraging GenAI (e.g., RAG pipelines, LLM-based services, agentic workflows).
- Partner with customers, stakeholders, and GTM teams to understand user needs and translate them into platform capabilities.
- Design user journeys and technical workflows for data ingestion, transformation, access, and governance.
- Articulate product value propositions to both technical and non-technical stakeholders.
- Launch B2B or SaaS products, including self-serve onboarding, usage analytics, and growth loops.
- Apply product-led growth (PLG) strategies and tools to drive user engagement and retention.
- Drive experimentation and measurement to validate product decisions using data and feedback loops.
- Ensure strong documentation and developer experience for all platform interfaces and APIs.

Requirements:
- 6–8 years of experience in product management, with at least 3+ years building platforms or data products (APIs, developer tools, infrastructure, or internal data platforms).
- 6–12 months of experience shipping GenAI features or products (LLMs, prompt engineering, embeddings, fine-tuning, RAG, agent-based systems).
- Strong understanding of data architecture, data pipelines, governance, and developer experience.
- Excellent communication skills, with a track record of working with cross-functional teams in agile environments.
- Experience with cloud platforms (AWS, GCP, or Azure), data lakes, vector databases (e.g., Pinecone, Weaviate), and modern AI/ML toolkits.
- Comfortable writing product specs and user stories, and collaborating with engineers on technical implementation.

Nice to Have:
- Experience in go-to-market planning, including pricing, packaging, and positioning of platform or SaaS products.
- Exposure to customer-facing roles, such as working with sales, pre-sales, or customer success to support product adoption and feedback loops.
- Background in creating sales enablement collateral, product demos, and customer onboarding journeys.

Posted 2 months ago

Apply

9 - 12 years

30 - 35 Lacs

Hyderabad

Work from Office

What you will do:
In this vital role, we are seeking a highly experienced and strategic Specialist IS Architect (Search) to lead the design and governance of search and knowledge discovery architecture across the organization. As a Software Engineer specializing in Data Engineering, you will be responsible for designing, developing, and optimizing data processing systems and applications. Your work will directly impact how we collect, process, and utilize data, enabling the company to make data-driven decisions. You will collaborate with cross-functional teams, including data scientists, product managers, and business stakeholders, to ensure that our data infrastructure is robust, scalable, and efficient.

Responsibilities:
- Design frameworks for indexing, metadata enrichment, semantic understanding, and query optimization across structured and unstructured data.
- Define and maintain architectural blueprints and roadmaps for search platforms (e.g., Elasticsearch, OpenSearch, vector search, or LLM-enhanced systems).
- Build and integrate information systems to meet the company's needs.
- Work with distributed systems, databases, and cloud technologies to build and maintain data lakes, data warehouses, and other data architectures.
- Collaborate closely with product owners, architects, business SMEs, and engineers to develop and deliver high-quality solutions, enhancing and maintaining integrations across clinical systems.
- Design and architect the next-generation metrics engine on modern infrastructure to support operational analytics, leveraging cloud technologies.
- Assess the systems architecture currently in place and work with technical staff to recommend improvements.
- Analyze functional and technical requirements of applications, translating them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure software quality and reliability.
- Integrate systems and platforms to ensure seamless data flow, functionality, and interoperability.
- Resolve technical problems as they arise, and provide technical guidance and mentorship to junior developers.
- Assess the business impact of technical choices, and provide updates to stakeholders on product development processes, costs, and budgets.
- Work closely with IT professionals within the company to ensure hardware is available for projects and working properly.
- Maintain a current understanding of best practices regarding system security measures.
- Take ownership of complex software projects from conception to deployment; manage software delivery scope, risk, and timeline.
- Participate in both front-end and back-end development using cloud technology, and develop innovative solutions using generative AI technologies; rapidly prototype, quickly translating concepts into working code.
- Conduct code reviews to ensure code quality and alignment with best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Stay updated on the latest trends and advancements, and work closely with product teams, business teams, and other key partners.

Basic Qualifications:
- Degree in computer science and engineering preferred, with 9-12 years of software development experience.
- Strong hands-on experience with data processing frameworks such as Apache Spark, Hadoop, and Apache Kafka.
- Experience with cloud platforms such as AWS, Google Cloud, or Azure, and services such as S3, Redshift, and BigQuery; understanding of the pros and cons of various cloud services under well-architected design principles.
- Hands-on experience with full-stack software development: React, Redux, RESTful API development, Swagger/OpenAPI, TypeScript, FastAPI, Python, JavaScript, SQL/NoSQL, Databricks/RDS, ETL, Tableau, Apache Kafka, and data pipelines.
- Strong programming skills in Python, Java, and Scala.
- Experience with data streaming frameworks (Apache Kafka, Flink).
- Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills.

Preferred Qualifications:
- Proficiency in multiple programming languages (e.g., Python, Java, Scala, JavaScript, React).
- Experience with API integration, serverless, and microservices architecture.
- Experience with Databricks, PySpark, Spark, SQL, ETL, and Kafka.
- Experience with the AWS and Azure platforms, including building and deploying code.
- Experience with PostgreSQL/MongoDB, and with vector databases for large language models, Databricks, or RDS.
- Experience with web site development, including localization processes that adapt content to cultural and linguistic contexts.
- Experience with DevOps CI/CD build and deployment pipelines.
- Experience with Agile software development methodologies.
- Strategic thinking: contributing to the overall strategic direction of the software development process.
- Experience in API and end-to-end testing as part of test-driven development.

Good to Have:
- Willingness to work on AI applications.
- Experience with popular large language models, and with the LangChain or LlamaIndex frameworks.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
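The stream-processing skills this role asks for (Kafka, Flink) often come down to windowed aggregation. A dependency-free sketch of a tumbling-window count; the event timestamps are synthetic, and a real deployment would consume from a Kafka topic instead of a list.

```python
# Tumbling-window event counts: fixed, non-overlapping windows keyed by
# the window's start timestamp (in seconds).
from collections import Counter

def tumbling_counts(events, window_secs=60):
    """Count (timestamp, payload) events per fixed time window."""
    counts = Counter()
    for ts, _payload in events:
        counts[ts // window_secs * window_secs] += 1
    return dict(counts)

events = [(5, "a"), (59, "b"), (61, "c"), (130, "d")]
# windows: [0,60) has 2 events, [60,120) has 1, [120,180) has 1
```

Flink's tumbling event-time windows implement this same bucketing, with the added machinery of watermarks for late-arriving events.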

Posted 2 months ago

Apply

12 - 16 years

25 - 35 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Hybrid

We're looking for an experienced Data Engineer Architect with expertise in AWS technologies to join our team in India. If you have a passion for analytics and a proven track record of designing and implementing complex data solutions, we want to hear from you.

Location: Noida / Gurgaon / Bangalore / Mumbai / Pune

Your Future Employer: Join a dynamic and inclusive organization at the forefront of technology, where your expertise will be valued and your career development will be a top priority.

Responsibilities:
- Designing and implementing robust, scalable data pipelines and architectures using AWS technologies.
- Collaborating with cross-functional teams to understand data requirements and develop solutions to meet business needs.
- Optimizing data infrastructure and processes for improved performance and efficiency.
- Providing technical leadership and mentorship to junior team members, and driving best practices in data engineering.

Requirements:
- 12+ years of experience in data engineering, with a focus on AWS technologies.
- Strong proficiency in analytics and data processing tools such as SQL, Spark, and Hadoop.
- Proven track record of designing and implementing large-scale data solutions.
- Experience leading and mentoring teams, and driving technical best practices.
- Excellent communication skills and the ability to collaborate effectively with stakeholders at all levels.

What's in it for you:
- Competitive compensation and benefits package.
- Opportunity to work with cutting-edge technologies and make a real impact on business outcomes.
- Career growth and development in a supportive and inclusive work environment.

Reach us: If this opportunity aligns with your career progression plans, please share your updated profile at isha.joshi@crescendogroup.in

Disclaimer: Crescendo Global specializes in senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Note: We receive many applications daily, so it is difficult to respond to every candidate. Please assume your profile has not been shortlisted if you do not hear back from us within one week. Your patience is highly appreciated.

Profile keywords: Data Engineer, Architect, AWS, Analytics, SQL, Spark, Hadoop, Kafka, Crescendo Global.

Posted 2 months ago

Apply

15 - 24 years

15 - 30 Lacs

Bengaluru

Remote

Immediate opening for an AI Architect (Generative AI).

Location: Remote
Experience: 15 to 23 years

Quantaleap INC. is seeking a highly experienced AI Architect with deep expertise in Generative AI to drive innovative solution design and architecture for enterprise use cases. This role requires a strategic thinker with a strong foundation in AI/ML technologies and a proven ability to lead high-impact AI initiatives across diverse domains.

Key Responsibilities:
- Architect end-to-end AI solutions leveraging Generative AI technologies (LLMs, diffusion models, transformers, etc.).
- Evaluate and recommend cutting-edge tools, frameworks, and models suitable for enterprise-grade deployments.
- Collaborate with cross-functional teams on RFP responses, technical proposals, and client presentations.
- Lead the development of scalable AI solutions from POC to production across multiple industry verticals.
- Guide technical teams on best practices in AI model development, fine-tuning, and deployment.

Must-Have Skills:
- 15+ years of overall experience with a strong focus on AI/ML architecture.
- In-depth knowledge of Generative AI technologies, including LLMs (GPT, BERT), GANs, and foundation models.
- Experience deploying AI models in cloud-native and on-premises environments.
- Strong understanding of MLOps, data pipelines, and model lifecycle management.
- Proven track record in presales, RFP responses, and enterprise solutioning.
- Excellent stakeholder management and client-facing communication skills.

Interested candidates can share their updated resume at anitha.mudaliyar@quantaleap.com

Posted 2 months ago

Apply

6 - 10 years

15 - 19 Lacs

Bengaluru

Work from Office

As a Principal Data Engineer on the Marketplace team, you will analyse and interpret complex datasets to generate insights that directly influence business strategy and decision-making. You will apply advanced statistical analysis and predictive modelling techniques to identify trends, predict future outcomes, and assess data quality. These insights will drive data-driven decisions and strategic initiatives across the organization.

The Marketplace team builds the services where our customers go to purchase pre-configured software installations on the platform of their choice. The challenges span the entire stack: back-end distributed services operating at cloud scale, e-commerce transactions, and the web apps that users interact with. This is the perfect role for someone experienced in designing distributed systems; writing and debugging code across an entire stack (UI, APIs, databases, cloud infrastructure services); championing operational excellence; mentoring junior engineers; and driving development process improvements in a start-up-style environment.

Career Level - IC4
### Responsibilities
As a Principal Data Engineer, you will be at the forefront of Oracle's data initiatives, playing a pivotal role in transforming raw data into actionable insights. Collaborating with data scientists and business stakeholders, you will design scalable data pipelines, optimize data infrastructure, and ensure the availability of high-quality datasets for strategic analysis. The role goes beyond data engineering, requiring hands-on statistical analysis and predictive modeling: you will use techniques such as regression analysis, trend forecasting, and time-series modeling to extract meaningful insights from data, directly supporting business decision-making.
Design and optimize database structures to ensure scalability, performance, and reliability within Oracle ADW and OCI environments, including maintaining schema integrity, managing database objects, and implementing efficient table structures that support seamless reporting and analytical needs.
Build and manage data pipelines that automate the flow of data from diverse sources into Oracle databases, using ETL processes to transform data for analysis and reporting.
Conduct data quality assessments, identify anomalies, and validate the accuracy of data ingested into our systems. Working alongside data governance teams, establish metrics to measure data quality and implement controls to uphold data integrity, ensuring reliable data for stakeholders.
Mentor junior team members and share best practices in data analysis, modeling, and domain expertise.
### Basic Qualifications
7+ years of experience in data engineering and analytics, with a strong background in designing scalable database architectures, building and optimizing data pipelines, and applying statistical analysis to deliver strategic insights across complex, high-volume data environments.
Deep knowledge of big data frameworks such as Apache Spark, Apache Flink, Apache Airflow, Presto, and Kafka, and of data warehouse solutions.
Experience working with other cloud platform teams and accommodating requirements from those teams (compute, networking, search, storage).
Excellent written and verbal communication skills, with the ability to present complex information clearly and concisely to all audiences.
### Preferred Qualifications
Solid understanding of statistical methods: hypothesis testing, data distributions, regression analysis, and probability.
Proficiency in Python for data analysis and statistical modeling; experience with libraries such as pandas, NumPy, and SciPy.
Knowledge of methods and techniques for data quality assessment, anomaly detection, and validation; skills in defining data quality metrics, creating data validation rules, and implementing controls to monitor and uphold data integrity.
Familiarity with visualization tools (e.g., Tableau, Power BI, Oracle Analytics Cloud) and libraries (e.g., Matplotlib, Seaborn) to convey insights effectively.
Strong communication skills for collaborating with stakeholders and translating business goals into technical data requirements; ability to contextualize data insights in business terms and present findings to non-technical stakeholders in a meaningful way.
Ability to cleanse, transform, and aggregate data from various sources, ensuring it is ready for analysis.
Experience with relational database management and design, specifically in Oracle environments (e.g., Oracle Autonomous Data Warehouse, Oracle Database); skills in designing, maintaining, and optimizing database schemas for efficiency, scalability, and reliability.
Advanced SQL skills for complex queries, indexing, stored procedures, and performance tuning.
Experience with ETL tools such as Oracle Data Integrator (ODI) or other data integration frameworks.
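The data-quality assessment work this listing describes can be made concrete. Below is a minimal, hypothetical sketch (column names and the 3-sigma threshold are invented for illustration; in practice such checks would run inside an ODI job or an Airflow task against the warehouse):

```python
from statistics import mean, stdev

def quality_report(rows, required=("order_id", "amount")):
    """Toy data-quality check: null scan plus z-score outlier flagging."""
    issues = []
    # Null scan over required columns.
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                issues.append((i, col, "null"))
    # Flag amounts more than 3 standard deviations from the mean.
    amounts = [r["amount"] for r in rows if r.get("amount") is not None]
    mu, sigma = mean(amounts), stdev(amounts)
    for i, row in enumerate(rows):
        a = row.get("amount")
        if a is not None and sigma and abs(a - mu) / sigma > 3:
            issues.append((i, "amount", "outlier"))
    return issues

rows = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": 101.0},
    {"order_id": None, "amount": 99.0},
    {"order_id": 4, "amount": 98.5},
]
print(quality_report(rows))  # → [(2, 'order_id', 'null')]
```

The returned issue list is the kind of signal a governance team would aggregate into the data-quality metrics the role mentions.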

Posted 2 months ago

Apply

6 - 11 years

11 - 15 Lacs

Bengaluru

Work from Office

Job Description

About Oracle and OHAI
Building on our Cloud momentum, Oracle has formed a new organization: Oracle Health & AI. This team focuses on product development and product strategy for Oracle Health while building out a complete platform supporting modernized, automated healthcare. This is a net-new line of business, built with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world-class engineering center with a focus on excellence.

About ODA
Within the larger Oracle Health & AI organization, Oracle Digital Assistant (ODA) is an assistant platform that enables developers to create their own skills and digital assistants using the Conversational AI and Generative AI capabilities exposed to them in a low-code paradigm. The ODA team has powered conversational AI experiences for all internal Oracle teams and several external customers for the past 7 years and is evolving rapidly to provide Generative AI solutions for the healthcare and enterprise domains. We operate in an agile environment with a substantial charter, high visibility, and support from senior leadership.

About the team
We are looking for ML scientists, data engineers, software development engineers, product/program managers, and ML engineers with a strong background in machine learning to build industry-leading solutions. You will work with huge volumes of data to solve real-world problems in the healthcare domain, and your solutions will be productized as features in our Health AI products. These are new teams being built in the India Center from scratch; the work spans new feature development for CDA specialty expansion and innovation, language and region expansion, software development across front-end, back-end, test automation, and mobile, ML engineering, MLOps, LLMOps, applied ML sciences, and ML research, with a strong focus on GenAI capabilities. There is a heavy emphasis on Large Language Models (LLMs) and Generative AI; your contributions will be pivotal in delivering our new Generative AI-powered solutions for healthcare and enterprise customers.
### Qualifications
Ph.D. degree preferred.
5+ years of experience.
Experience leading multiple projects leveraging LLMs, GenAI, and prompt engineering.
Exposure to real-world MLOps: deploying models into production and adding features to products.
Knowledge of working in a cloud environment.
Strong understanding of LLMs, GenAI, prompt engineering, and Copilot.

Career Level - IC4
### Responsibilities
Strong expertise in Python programming and one of the deep learning frameworks (PyTorch, MXNet, TensorFlow, Keras), plus an understanding of LLMs, LLMOps, and Generative AI, in order to deliver to specification and complete assignments without assistance by exercising judgment and taking appropriate action.
Knowledge of classification, prediction, recommender systems, time-series forecasting, anomaly detection, optimization, graph ML, and NLP; hands-on development of ML models using these techniques.
Ability to use existing libraries (ML, deep learning, reinforcement learning) as well as design algorithms from the ground up; able to build data pipelines and feature-engineering pipelines for robust models.
Combine depth in ML sciences, programming expertise, and mathematical understanding of techniques to deliver state-of-the-art ML solutions across the healthcare domain.
Drive the selection of methods and techniques; demonstrate strong program management and the ability to multi-task effectively.
Mentor and lead junior data scientists; contribute to peer reviews, team learning, and meeting product objectives, and develop best practices for the organization.
Strong research record, demonstrated through publications, preferred; scientific thinking and the ability to invent, with prior experience creating intellectual property through patents desirable.
Identify and investigate new technologies, prototype and test solutions for product features, and design and validate patterns that deliver a compelling experience.
Take responsibility for technical problem-solving to meet product objectives; resolve complex business problems in creative and effective ways.
Strong experience with LLM systems and LLMOps; prior experience with prompt engineering and RAG systems.
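To ground the "design algorithms ground-up" skill this listing asks for, here is a deliberately tiny example, unrelated to any Oracle product: a nearest-centroid classifier written from scratch in pure Python (the feature values and labels are invented), illustrating the classification techniques named above in their simplest form:

```python
from collections import defaultdict
from math import dist

def fit_centroids(samples):
    """Compute one centroid per class from (features, label) pairs."""
    grouped = defaultdict(list)
    for features, label in samples:
        grouped[label].append(features)
    return {
        label: tuple(sum(col) / len(vecs) for col in zip(*vecs))
        for label, vecs in grouped.items()
    }

def predict(centroids, features):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda lbl: dist(centroids[lbl], features))

train = [
    ((1.0, 1.0), "low"), ((1.2, 0.8), "low"),
    ((8.0, 9.0), "high"), ((9.0, 8.5), "high"),
]
model = fit_centroids(train)
print(predict(model, (8.5, 9.2)))  # → high
```

A production model would of course use a deep learning framework, but the fit/predict separation carries over directly.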

Posted 2 months ago

Apply

8 - 13 years

15 - 30 Lacs

Bengaluru

Work from Office

Design, develop, and maintain scalable ETL pipelines, data lakes, and hosting solutions using Azure tools. Ensure data quality, performance optimization, and compliance across hybrid and cloud environments.
### Required Candidate Profile
Data engineer with experience in Azure data services, ETL workflows, scripting, and data modeling. Strong collaboration with analytics teams and hands-on pipeline deployment using best practices.
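The extract-transform-load shape this profile describes can be sketched with in-memory stand-ins (the CSV data and region names are invented; in Azure the same three steps would typically live in an Azure Data Factory pipeline or a Databricks job):

```python
import csv
import io

RAW = """date,region,sales
2024-01-01,south,1200
2024-01-01,north,
2024-01-02,south,900
"""

def extract(text):
    """Extract: parse CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing sales and cast types."""
    return [
        {"date": r["date"], "region": r["region"], "sales": int(r["sales"])}
        for r in rows if r["sales"]
    ]

def load(rows):
    """Load: aggregate per region as a stand-in for a warehouse write."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0) + r["sales"]
    return totals

print(load(transform(extract(RAW))))  # → {'south': 2100}
```

Keeping the three stages as separate functions is what makes the pipeline testable and incrementally deployable, which is the "best practices" point in the profile.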

Posted 2 months ago

Apply