
13,124 ETL Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 7.0 years

8 - 9 Lacs

Pune

Work from Office

Source: Naukri

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages such as Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer (see the sketch below)
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
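For orientation, here is a minimal, hedged PySpark sketch of the extract-transform-load pattern this posting describes; the paths, column names, and formats are hypothetical placeholders, not part of the original listing.

```python
# Minimal PySpark ETL sketch: extract -> transform -> load.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV records from a source repository.
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types, drop incomplete rows, derive a daily aggregate.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_id", "amount"])
)
daily_totals = orders.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Load: write the result as Parquet for downstream consumers.
daily_totals.write.mode("overwrite").parquet("/data/curated/daily_totals")
```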

Posted 20 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

Develop, test, and support future-ready data solutions for customers across industry verticals, including end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, and information management and their associated technologies. Communicate risks and ensure they are understood.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 6+ years of related experience
- Experience in modelling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Strong expertise in writing T-SQL code
- Well-versed in data warehouse schemas and OLAP techniques

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate
- Strong team player/leader, able to lead data transformation projects with multiple junior data engineers
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization
- Ability to communicate complex business problems and technical solutions

Posted 20 hours ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Strong experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools: write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 8 years overall and 5+ years relevant experience
- Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems
- Work with different data formats, including structured, semi-structured, and unstructured data

Preferred technical and professional experience:
- Effective communication and presentation skills
- Industry expertise / specialization

Posted 20 hours ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Source: Naukri

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams and writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks
- The ideal candidate has Databricks, Data Lake, and Python programming skills, along with experience deploying to Databricks
- Familiarity with Azure Data Factory

Preferred technical and professional experience:
- Good communication skills
- 3+ years of experience with ADF/Databricks/Data Lake
- Ability to communicate results to technical and non-technical audiences

Posted 20 hours ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Work with the broader team to build, analyze, and improve AI solutions. You will also work with our software developers in consuming different enterprise applications.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5-7 years of experience, with sound knowledge of Python and of using ML-related services
- Proficient in Python with a focus on data analytics packages
- Strategy: Analyse large, complex data sets and provide actionable insights to inform business decisions (see the sketch below)
- Strategy: Design and implement data models that help identify patterns and trends
- Collaboration: Work with data engineers to optimize and maintain data pipelines
- Perform quantitative analyses that translate data into actionable insights and support analytical, data-driven decision-making
- Identify and recommend process improvements to enhance the efficiency of the data platform
- Develop and maintain data models, algorithms, and statistical models

Preferred technical and professional experience:
- Experience with conversation analytics
- Experience with cloud technologies
- Experience with data exploration tools such as Tableau
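As a hedged illustration of the exploratory analysis this role describes, here is a minimal pandas sketch that loads a data set, profiles it, and surfaces a simple trend; the file name and columns are hypothetical.

```python
# Minimal pandas analysis sketch; file and column names are hypothetical.
import pandas as pd

# Load a large, complex data set (here, a CSV of customer interactions).
df = pd.read_csv("interactions.csv", parse_dates=["created_at"])

# Profile: shape, missing values, and basic statistics.
print(df.shape)
print(df.isna().sum())
print(df.describe(include="all"))

# Surface a simple trend: weekly interaction volume per channel.
weekly = (
    df.set_index("created_at")
      .groupby("channel")
      .resample("W")
      .size()
      .rename("interactions")
      .reset_index()
)
print(weekly.head())
```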

Posted 20 hours ago

Apply

8.0 - 13.0 years

10 Lacs

Hyderabad

Work from Office

Source: Naukri

Skill required: Marketing Operations - Campaign Management
Designation: Bus & Technology Delivery Assoc Manager
Qualifications: Any Graduation
Years of Experience: Minimum 10+ years
Language Ability: English (Domestic) - Expert

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: Help balance increased marketing complexity and diminishing marketing resources. Drive marketing performance with deep functional and technical expertise, while accelerating time-to-market and operating efficiencies at scale through Data and Technology, Next Generation Content Services, Digital Marketing Services & Customer Engagement, and Media Growth Services. The role requires Digital Marketing Ads & Promotion creation/design: the planning, execution, tracking, and analysis of direct marketing campaigns. These tasks span the entire lifecycle of a marketing campaign, from inception to launch to evaluation of results.

What are we looking for:
- 8+ years of experience in operations and people management
- Experience managing digital marketing or technical support teams
- Deep understanding of Google Ad Manager or similar ad tech platforms
- Strong financial acumen: ability to track budgets, profitability, and forecasting
- Excellent client management and communication skills
- High-level stakeholder engagement across internal and external teams
- Advanced proficiency in data analysis tools (Excel, Sheets, Looker Studio, etc.)
- Proven ability to drive strategic initiatives and process improvements
- Expertise in workforce planning, shift optimization, and productivity tracking
- Certification in Digital Marketing is preferred

Roles and Responsibilities:
- Own and lead program delivery across multiple regions and shifts
- Maintain end-to-end accountability for operational excellence, SLA adherence, and quality
- Serve as the primary point of contact for client escalations, meetings, and feedback loops
- Analyze financials to ensure profitability, cost control, and investment planning
- Collaborate closely with client stakeholders to align on KPIs and roadmap initiatives
- Mentor team leads and mid-level managers to build a leadership pipeline
- Guide the team on prioritization of escalations, process gaps, and automation opportunities
- Drive quarterly planning, the innovation pipeline, and strategic goals for the program
- Partner with QA, MIS, and Comms to ensure cohesive program success
- Lead transformation and automation initiatives to increase efficiency and client value

Qualification: Any Graduation

Posted 20 hours ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office

Source: Naukri

Report on and analyze finance and operational data from multiple sources to drive decision-making. Utilize Power BI for data visualization (DAX, M language). Normalize and manage various data sets for accuracy and future impact assessment. Collaborate with teams to enhance dashboard automation.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 3-5 years of experience in reporting and data analytics
- Advanced Power BI skills
- Advanced SQL knowledge; Python knowledge for automation is a plus
- Strong analytical skills with a focus on data quality

Preferred technical and professional experience:
- Good communication skills
- Understanding of the data model and how the database works (primary and foreign keys, relations)
- Able to read and edit HTML/CSS code
- Basics of SQL
- Salesforce Marketing Cloud knowledge

Posted 20 hours ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge in data engineering
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness

Professional & Technical Skills:
- Must-have skills: proficiency in the Databricks Unified Data Analytics Platform
- Experience with data integration and ETL tools
- Strong understanding of data modeling and database design principles
- Familiarity with cloud platforms and services related to data storage and processing
- Knowledge of programming languages such as Python or Scala for data manipulation

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 20 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Join us as a Client Analytics Associate. Take on a new challenge in Data & Analytics and help us shape the future of our business. You'll be helping to manage the analysis of complex data to identify business issues and opportunities, and supporting the delivery of high-quality business solutions. We're committed to mapping a career path that works for you, with a focus on helping you build new skills and engage with the latest ideas and technologies in data analytics. We're offering this role at associate level.

What you'll do: As a Data & Analytics Analyst, you'll be planning and providing high-quality analytical input to support the development and implementation of innovative processes and problem resolution. You'll be capturing, validating, and documenting business and data requirements, making sure they are in line with key strategic principles. We'll look to you to interrogate, interpret, and visualise large volumes of data to identify, support, and challenge business opportunities and identify solutions.

You'll also be:
- Performing data extraction, storage, manipulation, processing, and analysis
- Conducting and supporting options analysis, identifying the most appropriate solution
- Helping to maintain full traceability and linkage of business requirements to analytics outputs
- Seeking opportunities to challenge and improve current business processes, ensuring the best result for the customer
- Creating and executing quality assurance at various stages of the project to validate the analysis, ensure data quality, and identify and resolve data inconsistencies

The skills you'll need: You'll need a background in business analysis tools and techniques, along with the ability to influence through communications tailored to a specific audience. Additionally, you'll need the ability to use core technical skills.

You'll also demonstrate:
- Strong analytical and problem-solving abilities
- A keen eye for detail in your work
- Strong proficiency in T-SQL (writing complex queries, stored procedures, views, functions) using SQL Server
- Experience with SSIS (SQL Server Integration Services), building and maintaining ETL pipelines
- Experience designing and developing interactive Tableau dashboards and reports, with the ability to translate business requirements into effective visualizations

Posted 21 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Vestas is a major player in wind technology and a driving force in the development of the wind power industry. Vestas' core business comprises the development, manufacture, sale, marketing, and maintenance of wind turbines. Come and join us at Vestas!

Digital Solutions & Development > Digital Architecture & Data & AI, Data Domains & AI > Data Domain - Tech Area

Responsibilities:
- Create and maintain scalable data pipelines for analytics use cases, assembling large, complex data sets that meet functional and non-functional business requirements
- Develop logical and physical data models, using the optimal data model structure for data warehouse and data mart designs to support analytical needs
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
- Collaborate with technology and platform management partners to optimize data sourcing and processing rules and ensure appropriate data quality
- Hands-on role (100%): building data solutions using best practices and architecture recommendations

Qualifications:
- Bachelor's / Master's in engineering (degree in Computer Science, IT, Engineering, or similar)
- Work experience as a Data Engineer in a Data & Analytics team, with 3+ years of relevant experience and 6-10 years of overall experience
- Data engineering experience: advanced working SQL knowledge and experience building and maintaining scalable ETL/EL data pipelines to support continuing increases in data volume and complexity
- Enterprise experience in business intelligence/analytics teams supporting the design, development, and maintenance of the backend data layer for BI/ML solutions
- Deep understanding of data structures and data models, to design and develop data solutions that ensure data availability, security, and accessibility

Competencies:
Tools/Technologies/Frameworks:
- Expertise in working with various data warehouse solutions and constructing data products using technologies such as Snowflake, Databricks, and the Azure data engineering stack (storage accounts, key vaults, MS SQL, etc.) is mandatory
- Strong work experience in SQL/stored procedures and relational modeling to build the data layer for BI/analytics is mandatory
- Extensive hands-on data modelling experience with cloud data warehouses and data structures; hands-on experience with one of the ETL/EL tools such as dbt, Azure Data Factory, or SSIS is an advantage
- Proficiency in code management / version control tools such as Git and DevOps

Business/Soft Skills:
- Strong data/software engineering fundamentals; experience in an Agile/Scrum environment preferred
- Ability to communicate with stakeholders across different geographies and to collaborate with analytics and data science teams to match technical solutions with customer business requirements
- Familiarity with business metrics such as KPIs, PPIs, and other indicators
- Curious and passionate about building value-creating and innovative data solutions

What We Offer:
- An opportunity to impact climate change and the future of the next generations through data, analytics, cloud, and machine learning
- A steep learning curve. We are building a strong team of data engineers with both broad and deep knowledge, so everyone will have somebody to learn from, and we will invest in continuous learning, knowledge sharing, and upskilling
- Strong relationships. We will strive to build an environment of mutual trust and a tightly knit team, where we can support and inspire each other to deliver great impact for Vestas
- The opportunity to shape your role. We have been asked to scale and deliver data and insights products; the rest is up to us
- A healthy work-life balance, and a commitment to fostering a diverse and inclusive workplace environment where everyone can thrive and bring their unique perspectives and skills to the team

Overall, we offer you the opportunity to make a difference and work in a multicultural international company, where you can improve your skills and grow professionally to reach new heights.

Additional Information: Your primary workplace will be Chennai. Please note: we do amend or withdraw our jobs and reserve the right to do so at any time, including prior to the advertised closing date. Please apply on or before 16th July 2025.

BEWARE - RECRUITMENT FRAUD: It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process

DEIB Statement: At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future.

About Vestas: Vestas is the energy industry's global partner on sustainable energy solutions. We are specialised in designing, manufacturing, installing, and servicing wind turbines, both onshore and offshore. Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field. With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution - today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to creating and sustaining an environment that acknowledges and harvests different experiences, skills, and perspectives. We also aim to give everyone equal access to opportunity. To learn more about our company and life at Vestas, we invite you to visit our website at www.vestas.com and follow us on our social media channels. We also encourage you to join our Talent Universe to receive notifications on new and relevant postings.

Posted 21 hours ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Position: Solution Architect
Location: Chennai / Bangalore / Kuala Lumpur
Experience: 8+ years
Employment Type: Full-time

Job Overview
Join Moving Walls, a trailblazer in the Out-of-Home (OOH) advertising and AdTech ecosystem, as a Solution Architect. This pivotal role places you at the heart of our innovative journey, designing and implementing scalable, efficient, and transformative solutions for our award-winning platforms like LMX and MAX. With a focus on automating and enhancing media transactions, you'll enable a seamless connection between media buyers and sellers in a rapidly evolving digital-first landscape. As a Solution Architect, you will bridge the gap between business objectives and technical execution, working in an Agile environment with POD-based execution models to ensure ownership and accountability. You will drive initiatives that revolutionize the way data and technology shape OOH advertising.

Why Join Us?
● Innovative Vision: Be part of a team committed to "Creating the Future of Outernet Media", where every solution impacts global markets across Asia, ANZ, Africa, and more.
● Cutting-edge Projects: Work on features like programmatic deal automation, data-driven audience insights, and dynamic campaign management for platforms connecting billions of ad impressions.
● Collaborative Culture: Collaborate with multidisciplinary teams, including Sales, Product Management, and Engineering, to craft solutions that are customized and impactful.

What You'll Do:
● Architect scalable and innovative solutions for AdTech products, ensuring alignment with organizational goals and market needs.
● Collaborate with cross-functional teams to gather, analyze, and translate business requirements into technical designs.
● Lead the development of programmatic solutions, dynamic audience segmentation tools, and integrations for global markets.
● Enhance existing products by integrating advanced features like dynamic rate cards, bid management, and inventory mapping.
● Advocate for best practices in system design, ensuring the highest standards of security, reliability, and performance.

What You Bring:
● A strong technical background with hands-on experience in cloud-based architectures, API integrations, and data analytics.
● Proven expertise in working within an Agile environment and leading POD-based teams to deliver high-impact results.
● Passion for AdTech innovation and the ability to navigate complex, fast-paced environments.
● Excellent problem-solving skills, creativity, and a customer-centric mindset.

Key Responsibilities
1. Solution Design:
○ Develop end-to-end solution architectures for web, mobile, and cloud-based platforms using the specified tech stack.
○ Translate business requirements into scalable and reliable technical solutions.
2. Agile POD-Based Execution:
○ Collaborate with cross-functional POD teams (Product, Engineering, QA, and Operations) to deliver iterative and focused solutions.
○ Ensure clear ownership of deliverables within the POD, fostering accountability and streamlined execution.
○ Contribute to defining and refining the POD stages to ensure alignment with organizational goals.
3. Collaboration and Stakeholder Management:
○ Work closely with product, engineering, and business teams to define technical requirements.
○ Lead technical discussions with internal and external stakeholders.
4. Technical Expertise:
○ Provide architectural guidance and best practices for system integrations, APIs, and microservices.
○ Ensure solutions meet non-functional requirements like scalability, reliability, and security.
5. Documentation:
○ Prepare and maintain architectural documentation, including solution blueprints and workflows.
○ Create technical roadmaps and detailed design documentation.
6. Mentorship:
○ Guide and mentor engineering teams during development and deployment phases.
○ Review code and provide technical insights to improve quality and performance.
7. Innovation and Optimization:
○ Identify areas for technical improvement and drive innovation in solutions.
○ Evaluate emerging technologies to recommend the best tools and frameworks.

Required Skills and Qualifications
● Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
● Proven experience as a Solution Architect or in a similar role.
● Expertise in programming languages and frameworks: Java, Angular, Python, C++.
● Proficiency in AI/ML frameworks and libraries such as TensorFlow, PyTorch, Scikit-learn, or Keras.
● Experience deploying AI models in production, including optimizing for performance and scalability.
● Understanding of deep learning, NLP, computer vision, or generative AI techniques.
● Hands-on experience with model fine-tuning, transfer learning, and hyperparameter optimization.
● Strong knowledge of enterprise architecture frameworks (TOGAF, Zachman, etc.).
● Expertise in distributed systems, microservices, and cloud-native architectures.
● Experience in API design, data pipelines, and integration of AI services within existing systems.
● Strong knowledge of databases: MongoDB, SQL, NoSQL.
● Proficiency in working with large-scale datasets, data wrangling, and ETL pipelines.
● Hands-on experience with CI/CD pipelines for AI development.
● Version control systems like Git, and experience with ML lifecycle tools such as MLflow or DVC.
● Proven track record of leading AI-driven projects from ideation to deployment.
● Hands-on experience with cloud platforms (AWS, Azure, GCP) for deploying AI solutions.
● Familiarity with Agile methodologies, especially POD-based execution models.
● Strong problem-solving skills and the ability to design scalable solutions.
● Excellent communication skills to articulate technical solutions to stakeholders.

Preferred Qualifications
● Experience in e-commerce, AdTech, or OOH (Out-of-Home) advertising technology.
● Knowledge of tools like Jira and Confluence, and Agile frameworks like Scrum or Kanban.
● Certification in cloud technologies (e.g., AWS Solutions Architect).

Tech Stack
● Programming Languages: Java, Python, or C++
● Frontend Framework: Angular
● Database Technologies: MongoDB, SQL, NoSQL
● Cloud Platform: AWS
● Familiarity with data processing tools like Pandas and NumPy, and big data frameworks (e.g., Hadoop, Spark)
● Experience with cloud platforms for AI (AWS SageMaker, Azure ML, Google Vertex AI)
● Understanding of APIs, microservices, and containerization tools like Docker and Kubernetes

Share your profile at kushpu@movingwalls.com

Posted 22 hours ago

Apply

12.0 years

0 Lacs

Chandigarh

On-site

Requirements / Essential Job Functions and Responsibilities:
- 12+ years of overall experience
- Minimum 3 to 4 years of experience leading Data Engineer teams that develop enterprise-grade data processing pipelines on Google Cloud
- Has led at least one project of medium to high complexity migrating ETL pipelines and data warehouses to the cloud
- 3 to 5 years of the most recent experience should be with premium consulting companies
- In-depth hands-on expertise with Google Cloud Platform services, especially BigQuery, Dataform, Dataplex, etc. (see the sketch below)
- Exceptional communication skills to converse equally well with data engineers, technology, and business leadership
- Ability to leverage GCP knowledge in other cloud environments
- Forecasting, classification, and regression techniques

Job Types: Full-time, Permanent, Fresher
Pay: ₹400,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person
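For context, here is a minimal, hedged sketch of querying BigQuery from Python with the official google-cloud-bigquery client, the kind of task this role centers on; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery sketch using the official google-cloud-bigquery client.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-analytics-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and iterate over the resulting rows.
for row in client.query(query).result():
    print(row["order_date"], row["total_amount"])
```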

Posted 22 hours ago

Apply

10.0 years

3 - 5 Lacs

Cochin

On-site

Introduction: We are looking for candidates with 10+ years of experience in a data architect role.

Responsibilities include:
- Design and implement scalable, secure, and cost-effective data architectures using GCP
- Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage
- Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP
- Ensure data architecture aligns with business goals, governance, and compliance requirements
- Collaborate with stakeholders to define the data strategy and roadmap
- Design and deploy BigQuery solutions for optimized performance and cost efficiency
- Build and maintain ETL/ELT pipelines for large-scale data processing
- Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration (see the sketch below)
- Implement best practices for data security, privacy, and compliance in cloud environments
- Integrate machine learning workflows with data pipelines and analytics tools
- Define data governance frameworks and manage data lineage
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems
- Optimize cloud infrastructure for scalability, performance, and reliability
- Mentor junior team members and ensure adherence to architectural standards
- Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager)
- Ensure high availability and disaster recovery solutions are built into data systems
- Conduct technical reviews, audits, and performance tuning for data solutions
- Design solutions for multi-region and multi-cloud data architectures
- Stay updated on emerging technologies and trends in data engineering and GCP
- Drive innovation in data architecture, recommending new tools and services on GCP

Certifications: Google Cloud certification is preferred.

Primary Skills:
- 7+ years of experience in data architecture, with at least 3 years in GCP environments
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services
- Strong experience in data warehousing, data lakes, and real-time data pipelines
- Proficiency in SQL, Python, or other data processing languages
- Experience with cloud security, data governance, and compliance frameworks
- Strong problem-solving skills and the ability to architect solutions for complex data environments
- Google Cloud certification (Professional Data Engineer, Professional Cloud Architect) preferred
- Leadership experience and the ability to mentor technical teams
- Excellent communication and collaboration skills
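To make the real-time integration bullet concrete, here is a small hedged sketch of publishing an event to Cloud Pub/Sub with the Python client; the project, topic, and payload are hypothetical placeholders.

```python
# Minimal Cloud Pub/Sub publish sketch; project, topic, and payload
# are hypothetical placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-analytics-project", "order-events")

# Publish a JSON-encoded event; publish() returns a future that
# resolves to the server-assigned message id.
event = {"order_id": "o-123", "amount": 42.5}
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print("Published message id:", future.result())
```

A downstream Dataflow job or Cloud Function subscribed to the topic would then pick the event up for real-time processing.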

Posted 22 hours ago

Apply

9.0 - 12.0 years

5 - 10 Lacs

Thiruvananthapuram

On-site

9 - 12 Years | 1 Opening | Kochi, Trivandrum

Role description
Job Title: Data Architect / Cloud Data Specialist

Role Overview: Leverage your expertise in data architecture and cloud technologies (AWS, Azure, GCP) to design, develop, and implement data extraction, transformation, and reporting solutions. Drive architecture for small to mid-size projects, support teams in developing proofs of concept (POCs), and ensure alignment with business workflows and data strategies.

Key Responsibilities:
- Architect and implement ETL/data warehouse solutions, data pipelines, and cloud data tools
- Understand business workflows and design data acquisition, transformation, and modelling strategies
- Develop data models, dashboards, and business intelligence solutions
- Define data governance standards, including naming conventions, security, backup, and recovery
- Guide estimation, troubleshooting, and POC development for customer-specific solutions
- Collaborate with project teams to ensure data architecture aligns with business goals and regulatory compliance
- Mentor team members and promote knowledge sharing and best practices
- Stay current with emerging technologies and obtain relevant certifications (AWS/Azure/GCP/Big Data/ML)

Skills & Experience:
- Strong experience with database architecture, ETL, data warehousing, and cloud platforms (AWS, Azure, GCP)
- Proficient in data workflow design, data modelling, and BI tools
- Familiarity with big data technologies, scripting languages (Python, Java), and SQL
- Knowledge of project management and agile practices, and the ability to support medium-sized projects
- Analytical mindset with strong problem-solving skills and attention to detail
- Certifications in cloud or big data technologies preferred

Skills: Database Architecture, AWS, Azure, GCP

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 22 hours ago

Apply

5.0 - 10.0 years

0 Lacs

Cochin

On-site

Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries, including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

Data Engineer
Locations: Kochi/Chennai/Coimbatore/Mumbai/Pune/Hyderabad

Job Overview: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will have deep expertise in Azure Databricks and Python, and experience building scalable data pipelines. Familiarity with Data Fabric architectures is a plus. You'll work closely with data scientists, analysts, and business stakeholders to deliver robust data solutions that drive insights and innovation.

Key Responsibilities:
- Design, build, and maintain large-scale, distributed data pipelines using Azure Databricks and PySpark
- Design, build, and maintain large-scale, distributed data pipelines using Azure Data Factory
- Develop and optimize data workflows and ETL processes in Azure cloud environments
- Write clean, maintainable, and efficient Python code for data engineering tasks
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Monitor and troubleshoot data pipelines for performance and reliability issues
- Implement data quality checks and validations, and ensure data lineage and governance
- Contribute to the design and implementation of a Data Fabric architecture (desirable)

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 5-10 years of experience in data engineering or related roles
- Expertise in Azure Databricks, Delta Lake, and Spark
- Strong proficiency in Python, especially in a data processing context
- Experience with Azure Data Lake, Azure Data Factory, and related Azure services
- Hands-on experience building data ingestion and transformation pipelines
- Familiarity with CI/CD pipelines and version control systems (e.g., Git)

Good to Have:
- Experience with or understanding of Data Fabric concepts (e.g., data virtualization, unified data access, metadata-driven architectures)
- Knowledge of modern data warehousing and lakehouse principles
- Exposure to tools like Apache Airflow, dbt, or similar
- Experience working in agile/scrum environments
- DP-500 and DP-600 certifications

What We Offer:
- Competitive salary and performance-based bonuses
- Flexible work arrangements
- Opportunities for continuous learning and career growth
- A collaborative, inclusive, and innovative work culture

www.orioninc.com

Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Candidate Privacy Policy: Orion Systems Integrators, LLC and its subsidiaries and affiliates (collectively, "Orion," "we" or "us") are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) ("Notice") explains: what information we collect during our application and recruitment process and why we collect it; how we handle that information; and how to access and update that information. Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.

Posted 22 hours ago

Apply

0 years

0 Lacs

Greater Bengaluru Area

On-site

Source: LinkedIn

We are looking for a skilled ETL pipeline support engineer to join our DevOps team. In this role, you will ensure the smooth operation of production ETL pipelines and will be responsible for monitoring and troubleshooting existing pipelines. The role requires a strong understanding of SQL and Spark, and experience with AWS Glue and Redshift.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Proven experience supporting and maintaining ETL pipelines
- Strong proficiency in SQL and experience with relational databases (e.g., Redshift)
- Solid understanding of distributed computing concepts and experience with Apache Spark
- Hands-on experience with AWS Glue and other AWS data services (e.g., S3, Lambda)
- Experience with data warehousing concepts and best practices
- Excellent problem-solving and analytical skills, and strong communication and collaboration skills
- Ability to work independently and as part of a team

Preferred Skills and Experience:
- Experience with other ETL tools and technologies
- Experience with scripting languages (e.g., Python)
- Familiarity with Agile development methodologies
- Experience with data visualization tools (e.g., Tableau, Power BI)

Roles & Responsibilities:
- Monitor and maintain existing ETL pipelines, ensuring data quality and availability (see the sketch below)
- Identify and resolve pipeline issues and data errors
- Troubleshoot data integration processes; where needed, collaborate with data engineers and other stakeholders to resolve complex issues
- Develop and maintain the necessary documentation for ETL processes and pipelines
- Participate in the on-call rotation for production support
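As a hedged sketch of the day-to-day monitoring this role involves, the snippet below polls recent AWS Glue job runs with boto3 and flags failures; the job name and region are hypothetical.

```python
# Minimal AWS Glue monitoring sketch using boto3.
# Job name and region are hypothetical placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Fetch the most recent runs for a production ETL job.
response = glue.get_job_runs(JobName="prod-orders-etl", MaxResults=10)

for run in response["JobRuns"]:
    state = run["JobRunState"]  # e.g. SUCCEEDED, FAILED, RUNNING
    print(run["Id"], state, run.get("ErrorMessage", ""))
    if state == "FAILED":
        # In practice this would page on-call or open a ticket.
        print("ALERT: job run failed:", run["Id"])
```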

Posted 22 hours ago

Apply

5.0 - 8.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Source: Naukri

About KPI Partners: KPI Partners is a leading provider of technology consulting and solutions, specializing in delivering high-quality services that enable organizations to optimize their operations and achieve their strategic objectives. We are committed to empowering businesses through innovative solutions and a strong focus on customer satisfaction.

Job Description: We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.

Key Responsibilities:
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements, and ensure data quality and accuracy.
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.

Why Join KPI Partners?
- Opportunity to work with a talented and diverse team on cutting-edge projects.
- Competitive salary and comprehensive benefits package.
- Continuous learning and professional development opportunities.
- A culture that values innovative thinking and encourages collaboration.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 22 hours ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant, AWS DataLake!

Responsibilities:
- Knowledge of Data Lake on AWS services, with exposure to creating external tables and Spark programming
- Proficiency in Python programming
- Writing effective and scalable Python code for automation, data wrangling, and ETL (see the sketch after this posting)
- Designing and implementing robust applications, and building automations using Python
- Debugging applications to ensure low latency and high availability
- Writing optimized custom SQL queries
- Experienced in team and client handling
- Strong documentation skills for systems, design, and delivery
- Integrate user-facing elements into applications
- Knowledge of external tables and Data Lake concepts
- Able to allocate tasks, collaborate on status exchanges, and drive things to successful closure
- Implement security and data protection solutions
- Must be capable of writing SQL queries for validating dashboard outputs
- Must be able to translate visual requirements into detailed technical specifications
- Well versed in handling Excel, CSV, text, JSON, and other unstructured file formats using Python
- Expertise in at least one popular Python framework (like Django, Flask, or Pyramid)
- Good understanding of and exposure to Git, Bamboo, Confluence, and Jira
- Strong with DataFrames and ANSI SQL using pandas
- Team player with a collaborative approach and excellent communication skills

Qualifications we seek in you!
Minimum Qualifications:
- BE/B Tech/MCA
- Excellent written and verbal communication skills
- Good knowledge of Python and PySpark

Preferred Qualifications/Skills:
- Strong ETL knowledge of any ETL tool is good to have
- Knowledge of AWS cloud and Snowflake is good to have
- Knowledge of PySpark is a plus

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jun 16, 2025, 12:21:32 AM
Unposting Date: Dec 13, 2025, 4:21:32 AM
Master Skills List: Digital
Job Category: Full Time
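A small hedged sketch of the Python data wrangling this role calls for: reading CSV and JSON inputs with pandas and consolidating them. All file and column names are hypothetical.

```python
# Minimal pandas data-wrangling sketch; file and column names are
# hypothetical placeholders.
import pandas as pd

# Read heterogeneous input formats into DataFrames.
csv_df = pd.read_csv("transactions.csv")
json_df = pd.read_json("customers.json")

# Join on a shared key, clean up missing values, and write a
# consolidated extract (writing .xlsx requires the openpyxl package).
merged = csv_df.merge(json_df, on="customer_id", how="left")
merged["amount"] = merged["amount"].fillna(0.0)
merged.to_excel("consolidated_report.xlsx", index=False)
```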

Posted 22 hours ago

Apply

3.0 - 10.0 years

5 - 18 Lacs

India

On-site

Overview: We are looking for a skilled GCP Data Engineer with 3 to 10 years of real hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs
- Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services
- Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements
- Design data models suitable for both transactional and big data environments, supporting machine learning workflows
- Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services (see the sketch below)
- Develop and implement data and semantic interoperability specifications
- Work closely with business teams to define and scope requirements
- Analyze existing systems to identify appropriate data sources and drive continuous improvement
- Implement and continuously enhance automation for data ingestion and data transformation
- Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines
- Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management

Skills and Qualifications:
- Overall 3-10 years of hands-on experience as a Data Engineer, with at least 2-3 years of direct GCP data engineering experience
- Strong SQL and Python development skills are mandatory
- Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies
- Demonstrated knowledge of and experience with Google Cloud BigQuery is a must
- Experience with Dataproc and Dataflow is highly preferred
- Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks
- Extensive experience with SQL across various database platforms
- Experience with any BI tool is also preferred
- Experience in data mapping and data modeling
- Familiarity with data analytics tools and best practices
- Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX shell
- Practical experience with Google Cloud services, including but not limited to: BigQuery, Bigtable; Cloud Dataflow, Cloud Dataproc; Cloud Storage, Pub/Sub; Cloud Functions, Cloud Composer; Cloud Spanner, Cloud SQL
- Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark)
- Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
- Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP
- GCP Data Engineer certification is highly preferred

Job Type: Full-time
Pay: ₹500,298.14 - ₹1,850,039.92 per year
Benefits: Health insurance
Schedule: Rotational shift
Work Location: In person
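Since the posting highlights Dataflow, here is a hedged, minimal Apache Beam batch pipeline in Python; the same code can target Dataflow via pipeline options, and the file names and record layout are hypothetical.

```python
# Minimal Apache Beam sketch; runs locally with the DirectRunner and
# can target Dataflow via pipeline options. Paths and columns are
# hypothetical placeholders.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("orders.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        # Expect rows of (order_date, order_id, amount).
        | "FilterValid" >> beam.Filter(lambda f: len(f) == 3)
        | "KeyByDate" >> beam.Map(lambda f: (f[0], float(f[2])))
        | "SumPerDay" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda day, total: f"{day},{total}")
        | "Write" >> beam.io.WriteToText("daily_totals")
    )
```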

Posted 22 hours ago

Apply

3.0 years

8 - 9 Lacs

Hyderābād

On-site

You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you.

As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Technology Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities:
- Executes standard software solutions, design, development, and technical troubleshooting
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Designs and develops data pipelines end to end using PySpark, Java, Python, and AWS services (see the sketch below)
- Utilizes container orchestration services, including Kubernetes, and a variety of AWS tools and services
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
- Adds to a team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Hands-on practical experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components like Spark and Spark Streaming
- Proficient in one or more coding languages: Core Java, Python, and PySpark
- Experience with relational and data warehouse databases, and cloud implementation experience with AWS, including:
  - AWS data services: proficiency in Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Airflow (or Lambda + Step Functions + EventBridge), ECS clusters and ECS apps
  - Data de/serialization: expertise in at least two of the formats Parquet, Iceberg, Avro, and JSON
  - AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
- Proficiency in automation and continuous delivery methods

Preferred qualifications, capabilities, and skills:
- Experience with Snowflake is nice to have
- Solid understanding of agile methodologies such as CI/CD, application resiliency, and security
- In-depth knowledge of the financial services industry and its IT systems
- Practical cloud-native experience, preferably with AWS
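To ground the Spark Streaming requirement, here is a hedged, minimal Structured Streaming sketch in PySpark; the schema, input directory, and sink are hypothetical placeholders.

```python
# Minimal Spark Structured Streaming sketch; paths and schema are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Read a stream of JSON event files landing in a directory.
events = (
    spark.readStream
         .schema("event_id STRING, amount DOUBLE, ts TIMESTAMP")
         .json("/data/incoming/events/")
)

# Keep valid events and count them per one-minute window.
counts = (
    events.filter(F.col("amount") > 0)
          .groupBy(F.window("ts", "1 minute"))
          .count()
)

# Write the running counts to the console sink with checkpointing.
query = (
    counts.writeStream
          .outputMode("complete")
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/streaming-sketch")
          .start()
)
query.awaitTermination()
```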

Posted 22 hours ago

Apply

6.0 years

0 Lacs

Hyderābād

On-site

Job Information
Date Opened: 06/16/2025
Job Type: Full time
Industry: IT Services
City: BENGALURU, HYDERABAD
State/Province: BENGALURU
Country: India
Zip/Postal Code: BENGALURU

Job Description
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in the hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.

Work You’ll Do
As an ODI developer, you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building an ETL solution both on premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients: conduct workshops, understand business requirements, and identify business problems to solve with integrations; lead and build proofs of concept to showcase the value of ODI versus other platforms; socialize solution designs and enable knowledge transfer; drive train-the-trainer sessions to drive adoption of ODI; and partner with clients to drive outcomes and deliver value
Collaborate with cross-functional teams: understand source applications and how they can be integrated; analyze data sets to understand functional and business context; create data warehousing data models and integration designs; understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), and Project to Complete (PTC); and communicate development status and risks to key stakeholders
Lead the team to design, build, test, and deploy
Support client needs by delivering ODI jobs and frameworks
Merge, customize, and deploy ODI data models as per client business requirements
Deliver large/medium DWH programs, demonstrating expert core consulting skills, an advanced level of ODI, SQL, and PL/SQL knowledge, and industry expertise to support delivery to clients
Focus on designing, building, and documenting reusable code artifacts
Track, report, and optimize ODI job performance to meet client SLAs
Design and architect ODI projects, including upgrades/migrations to the cloud
Design and implement security in ODI
Identify risks and suggest mitigation plans
Lead the team and mentor junior practitioners
Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts
Perform system analysis, follow technical designs, and work on development activities
Participate in design meetings, daily standups, and backlog grooming
Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities
Review and evaluate designs and project activities for compliance with systems design and development guidelines and standards; provide tangible feedback to improve product quality and mitigate failure risk
Develop the environment strategy, build the environment, and execute migration plans
Validate that the environment meets all security and compliance controls
Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders
Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions

Ideally, you should also have
Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle
Experience in SQL tuning and optimization using explain plans and SQL trace files
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translating them into ETL specifications and high- and low-level design documents

Analytics & Cognitive
Our Analytics & Cognitive team focuses on enabling our clients’ end-to-end journey from on-premises to cloud, with opportunities in the areas of cloud strategy, op model transformation, cloud development, cloud integration and APIs, cloud migration, cloud infrastructure and engineering, and cloud managed services. We help our clients see the transformational capabilities of cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators.

Technical Requirements
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
6+ years of ETL lead/developer experience and a minimum of 3-4 years of experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
A minimum of 2-3 end-to-end DWH implementations
Experience developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows, and ETL scripts
Well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
Packaging components and database operations like aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings
Expertise in developing load plans and scheduling jobs
Experience integrating ODI with multiple source/target systems
Experience in data migration using SQL*Loader and import/export

Consulting Requirements
6-10 years of relevant consulting, industry, or technology experience
Proven experience assessing clients’ workloads and technology landscape for cloud suitability
Experience in defining new architectures and ability to drive projects from an architecture standpoint
Ability to quickly establish credibility and trustworthiness with key stakeholders in the client organization
Strong problem-solving and troubleshooting skills
Strong communicator
Willingness to travel in case of project requirements

Preferred
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R, or UNIX shell scripting
Systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python and/or UNIX shell scripting
Experience working with technical customers
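The technical requirements above mention ETL control tables, error logging, and auditing. As a rough illustration of that bookkeeping pattern (sketched outside of ODI, which would normally implement it inside Knowledge Modules), here is a minimal Python sketch using the python-oracledb driver; the ETL_CONTROL table, its columns, and the connection details are hypothetical.

import oracledb
from datetime import datetime

def run_step(conn, job_name, step_fn):
    """Run one ETL step with control-table bookkeeping (hypothetical schema)."""
    cur = conn.cursor()
    # Mark the job as running in a hypothetical ETL_CONTROL table
    cur.execute(
        "INSERT INTO etl_control (job_name, status, started_at) "
        "VALUES (:1, 'RUNNING', :2)",
        [job_name, datetime.now()],
    )
    conn.commit()
    try:
        rows = step_fn(conn)  # the actual extract/load work
        cur.execute(
            "UPDATE etl_control SET status = 'SUCCESS', rows_loaded = :1, ended_at = :2 "
            "WHERE job_name = :3 AND status = 'RUNNING'",
            [rows, datetime.now(), job_name],
        )
    except Exception as exc:
        # Error logging: record the failure so it can be audited and replayed
        cur.execute(
            "UPDATE etl_control SET status = 'FAILED', error_msg = :1, ended_at = :2 "
            "WHERE job_name = :3 AND status = 'RUNNING'",
            [str(exc)[:4000], datetime.now(), job_name],
        )
        raise
    finally:
        conn.commit()

# Hypothetical usage:
# conn = oracledb.connect(user="etl", password="...", dsn="dbhost/orclpdb1")
# run_step(conn, "LOAD_ORDERS", load_orders)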

Posted 22 hours ago

Apply

8.0 years

28 - 30 Lacs

Hyderābād

On-site

Experience: 8+ years
Budget: 30 LPA (including variable pay)
Location: Bangalore, Hyderabad, Chennai (hybrid)
Shift Timing: 2 PM - 11 PM

ETL Development Lead (8+ years)
Lead and mentor a team of Talend ETL developers, providing technical direction and guidance on ETL/data integration development
Design complex data integration solutions using Talend and AWS
Collaborate with stakeholders to define project scope, timelines, and deliverables
Contribute to project planning, risk assessment, and mitigation strategies
Ensure adherence to project timelines and quality standards
Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies
Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components
Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files)
Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality
Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes
Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions
Perform unit testing and participate in system integration testing of ETL processes
Monitor and maintain Talend environments, including job scheduling and performance tuning
Document technical specifications, data flow diagrams, and ETL processes
Stay up to date with the latest Talend features, best practices, and industry trends
Participate in code reviews and contribute to the establishment of development standards
Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components
Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT)
Strong SQL skills for data querying and manipulation
Experience with data profiling, data quality checks, and error handling within ETL processes
Familiarity with job scheduling tools and monitoring frameworks
Excellent problem-solving, analytical, and communication skills
Ability to work independently and collaboratively within a team environment
Basic understanding of AWS services, e.g., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB
Understanding of AWS data integration services, e.g., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions

Preferred Qualifications:
Experience leading and mentoring a team of 8+ Talend ETL developers
Experience working with US healthcare customers
Bachelor's degree in Computer Science, Information Technology, or a related field
Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate
Experience with AWS data and infrastructure services
A basic understanding of Terraform and GitLab is required
Experience with scripting languages such as Python or shell scripting
Experience with agile development methodologies
Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform
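Among the duties above are data profiling, data quality checks, and error handling within ETL processes. The sketch below illustrates one common post-load check, reconciling row counts and null rates between a source and a target table. In a Talend job this logic would typically live in components such as tAggregateRow or tAssert rather than standalone Python, and the connection details, table names, and columns here are hypothetical.

import psycopg2

def table_stats(cur, table):
    # `table` is a trusted identifier here; never interpolate user input this way
    cur.execute(
        f"SELECT COUNT(*), COUNT(*) FILTER (WHERE customer_id IS NULL) FROM {table}"
    )
    return cur.fetchone()

with psycopg2.connect(host="example-host", dbname="dw", user="etl", password="...") as conn:
    with conn.cursor() as cur:
        src_rows, src_nulls = table_stats(cur, "staging.orders")
        tgt_rows, tgt_nulls = table_stats(cur, "warehouse.orders")

if src_rows != tgt_rows:
    raise RuntimeError(f"Row count mismatch: source={src_rows}, target={tgt_rows}")
if tgt_nulls:
    raise RuntimeError(f"{tgt_nulls} null customer_id values reached the target")
print("Data-quality checks passed")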
Job Type: Full-time
Pay: ₹2,800,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 22 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Salesforce Architect
Location: Hyderabad (Onsite)
Experience Required: 10+ Years
Employment Type: Full-time

Job Summary:
We are seeking a highly skilled and experienced Salesforce Architect to lead the design, development, and implementation of scalable and high-performance Salesforce solutions. The ideal candidate will have deep expertise in Salesforce architecture, strong technical and leadership skills, and a proven track record of delivering enterprise-level CRM solutions.

Key Responsibilities:
Lead end-to-end architecture and solution design for Salesforce implementations
Collaborate with business stakeholders, product owners, and developers to define Salesforce solutions that align with business goals
Design scalable, secure, and high-performing solutions across Sales Cloud, Service Cloud, Experience Cloud, and other Salesforce platforms
Evaluate existing Salesforce implementations and provide recommendations for improvement and optimization
Create and maintain architectural documentation, including data models, integration patterns, and system diagrams
Guide development teams on best practices and enforce coding standards
Support and guide integrations with external systems using REST/SOAP APIs, middleware, and ETL tools
Conduct code reviews, performance tuning, and quality assurance
Stay updated on new Salesforce features and functionality and provide recommendations for process improvements

Required Skills & Qualifications:
8+ years of experience in Salesforce platform development and architecture
Proven experience as a Salesforce Technical or Solution Architect
Expertise in Sales Cloud, Service Cloud, Lightning Framework, and Apex development
Strong knowledge of Salesforce data modeling, sharing rules, workflows, and validation rules
Hands-on experience with integrations (REST/SOAP APIs, middleware like MuleSoft or Dell Boomi)
Familiarity with DevOps tools like Copado, Gearset, Jenkins
Experience with agile development methodologies
Salesforce Architect certifications preferred (e.g., Application Architect, System Architect, Technical Architect)

Preferred Qualifications:
Salesforce Certified Application/System/Technical Architect
Experience with CI/CD tools and processes
Strong communication and stakeholder management skills
Experience in mentoring and leading technical teams
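Since the role involves guiding integrations with external systems over REST, the following is a minimal, illustrative Python sketch of calling the Salesforce REST API to run a SOQL query. The instance URL, API version, and the way the access token is obtained are assumptions; a production integration would typically authenticate via an OAuth 2.0 flow, often through middleware such as MuleSoft.

import requests

INSTANCE = "https://example.my.salesforce.com"  # hypothetical org URL
ACCESS_TOKEN = "<obtained via OAuth 2.0>"       # placeholder, never hard-code

# Run a SOQL query through the Salesforce REST API (version assumed)
resp = requests.get(
    f"{INSTANCE}/services/data/v60.0/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": "SELECT Id, Name FROM Account LIMIT 10"},
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])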

Posted 22 hours ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site


Are you…
Interested in working for an international and diverse company?
Looking to use your troubleshooting skills?
Interested in developing your career in a leading packaging and printing industry?
If so, read on!

Esko (https://www.esko.com/en/company/about-esko), a Veralto company, is a global provider of integrated software and hardware solutions that accelerate the go-to-market process of packaged goods. Our products empower teams to support and manage the packaging design and print processes for brand owners, retailers, pre-media and trade shops, manufacturers, and converters, providing the most innovative, integrated platform and comprehensive portfolio of tools that intelligently digitize, connect, automate, and accelerate the concept-to-market processes for every packaged product.

You will be part of a flexible, family-friendly organization that cares about its people just as it cares about the environment. We recognize that people come with a wealth of experience and talent. Diversity of experience and skills combined with passion is a key to innovation and excellence. Therefore, we encourage people from all backgrounds to apply to our positions.

The Senior Solution Architect is responsible for the end-to-end design of enterprise software implementations: providing coaching on assessing business impact versus effort, collaborating with other senior solution architects to define the overall framework and standards for our solutions, and monitoring and evaluating the outcome and effectiveness of the solutions on business operations. You will work in a hybrid model (quarterly visits to the Bangalore office).

In This Role, a Typical Day Will Look Like:
Understand and design integrations
Write custom scripts, APIs, and middleware solutions
Create and provide data migration requirements
Apply strong problem-solving and troubleshooting skills

The Essential Requirements Of The Job Include:
Lead the end-to-end design of enterprise software implementations with an emphasis on business impact and value
Provide coaching on assessing business impact versus effort and prioritizing functional requests based on impact-effort analysis
Collaborate with other senior solution architects to define the overall framework and standards for our solutions
Create and implement software systems using our product, best practices, and services, ensuring they meet requirements for scalability, performance, and maintainability
Monitor and evaluate the outcome and effectiveness of the solutions on business operations
Identify, manage, and mitigate technical risks and issues, and communicate them to the relevant stakeholders
Ensure the quality and consistency of the solutions, adhere to our SILC standards and best practices, and help improve our quality practices
Stay updated with the latest product releases and emerging solutions, and help create training materials for juniors
Provide thought leadership and consultancy to senior management, customers, and internal teams
Mentor and coach solution architects and other team members
Play the role of Scrum master to monitor projects
Ensure quality assurance and QMS compliance by following standard operating procedures (SOPs), best practices, and UX guidelines
Provide guidance, feedback, and training to the solution architects and development teams
Break down high-level objectives into specific software development tasks and follow through on their execution
High-level understanding and review of epics and user stories
Break down user stories into development tasks and estimate those stories
Create feasibility analyses and provide accurate effort estimates
Take ownership to develop plans and deliver to meet the schedule and quality expectations of the product owner
Write code that is easy to understand and fits with existing patterns
Conduct peer code reviews and perform unit and integration testing
Implement unit testing, API documentation, integration testing, and the deployment approach, and maintain programming documentation, including code comments and user guides
Analyze, debug, and solve complex technical issues and customer escalations
Provide status updates and produce high-quality deliverables in a timely manner
Communicate and interact effectively with team members, the product owner, QA, support, and other departments within Esko
Expertise in Esko products and services, as well as industry standards and best practices (relating to WebCenter, MediaBeacon, Automation Engine)
Packaging domain experience is considered a plus

The Essential Requirements Of The Job Include:
8-11 years of total experience, with a minimum of 4 years of development experience
Strong in technologies like JavaScript, ETL, SQL, APIs, web services, and Jira
Strong communication skills

At Esko (https://jobs.veralto.com/global/en/esko), a Veralto Company (https://www.veralto.com/who-we-are/), innovation comes in every color and never in the same package. Join Esko and see how diversity of people and thought fuels a career journey like no other. Create unique technology solutions for the packaging value chain, bring new ideas to life, make and influence decisions, and experience career growth, rewards, and recognition in our global Packaging & Color organizations.

Esko is proud to be a Product Quality & Innovation company in Veralto (NYSE: VLTO). Imagine a world where everyone has access to clean water, safe food and medicine, and trusted essential goods. That is the tomorrow Veralto is creating today. Veralto is a $5B global leader in essential technology solutions made up of over 16,000 associates across our Water Quality and Product Identification segments, all united by a powerful purpose: Safeguarding the World’s Most Vital Resources.

At Veralto, we value diversity and the existence of similarities and differences, both visible and not, found in our workforce, workplace and throughout the markets we serve. Our associates, customers and shareholders contribute unique and different perspectives as a result of these diverse attributes. If you’ve ever wondered what’s within you, there’s no better time to find out.

Unsolicited Assistance
We do not accept unsolicited assistance from any headhunters or recruitment firms for any of our job openings. All resumes or profiles submitted by search firms to any employee at any of the Veralto companies (https://www.veralto.com/our-companies/), in any form without a valid, signed search agreement in place for the specific position, approved by Talent Acquisition, will be deemed the sole property of Veralto and its companies. No fee will be paid in the event the candidate is hired by Veralto and its companies because of the unsolicited referral.

Veralto and all Veralto Companies are committed to equal opportunity regardless of race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity, or other characteristics protected by law.
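As a generic illustration of the scripting and data-migration work this role mentions (and explicitly not the API of WebCenter, MediaBeacon, or Automation Engine), here is a short Python sketch that pages through a hypothetical source REST endpoint and stages records to CSV for import. Every URL and field name below is an assumption.

import csv
import requests

SOURCE_URL = "https://example-source/api/v1/assets"  # hypothetical endpoint

def fetch_all(url):
    """Page through a hypothetical paginated REST endpoint."""
    page = 1
    while True:
        resp = requests.get(url, params={"page": page, "per_page": 100})
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return
        yield from batch
        page += 1

# Stage the records to CSV for a downstream import step
with open("assets_staging.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name", "modified"])
    writer.writeheader()
    for asset in fetch_all(SOURCE_URL):
        writer.writerow({k: asset.get(k) for k in ("id", "name", "modified")})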

Posted 22 hours ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The strategy of GTS focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

Role Overview
As a Data Engineering Lead, you will be responsible for overseeing and guiding the data engineering team in developing, optimizing, and maintaining our data infrastructure. You will play a critical role in ensuring the seamless integration and flow of data across the organization, enabling data-driven decision-making and analytics.

Key Responsibilities
Data Integration: Coordinate with various teams to ensure seamless data integration across the organization's systems.
ETL Processes: Develop and implement efficient data transformation and ETL (Extract, Transform, Load) processes.
Performance Optimization: Optimize data flow and system performance for enhanced functionality and efficiency.
Data Security: Ensure adherence to data security protocols and compliance standards to protect sensitive information.
Infrastructure Management: Oversee the development and maintenance of the data infrastructure, ensuring scalability and reliability.
Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven initiatives.
Innovation: Stay updated with the latest trends and technologies in data engineering and implement best practices.

Qualifications
Experience: Proven experience in data engineering, with a strong background in leading and managing teams.
Technical Skills: Proficiency in programming languages such as Python, Java, and SQL, along with experience in big data technologies like Hadoop, Spark, and Kafka.
Data Management: In-depth understanding of data warehousing, data modeling, and database management systems.
Analytical Skills: Strong analytical and problem-solving skills with the ability to handle complex data challenges.
Communication: Excellent communication and interpersonal skills, capable of working effectively with cross-functional teams.
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Why Join Us?
Work on cutting-edge data projects and contribute to the organization's data strategy.
Collaborative and innovative work environment that values creativity and continuous learning.
If you are a strategic thinker with a passion for data engineering and leadership, we would love to hear from you. Apply now to join our team and make a significant impact on our data-driven journey.

Joining us is more than saying “yes” to making the world a healthier place. It’s discovering a career that’s challenging, supportive and inspiring. Where a culture driven by excellence helps you not only meet your goals, but also create new ones. We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace and thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.
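As an illustration of the streaming side of the qualifications above (Kafka alongside Hadoop and Spark), here is a minimal Python sketch of a consume-transform-load step using the kafka-python client. The topic, broker address, consumer group, and event fields are hypothetical.

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "device-telemetry",                  # hypothetical topic
    bootstrap_servers=["broker-1:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="etl-loader",
)

for message in consumer:
    event = message.value
    # Transform: keep only well-formed readings and normalize units
    if "reading_c" not in event:
        continue
    row = {
        "device_id": event.get("device_id"),
        "reading_f": event["reading_c"] * 9 / 5 + 32,
        "ts": event.get("ts"),
    }
    print(row)  # stand-in for a real load step (warehouse insert, S3 write, ...)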

Posted 22 hours ago

Apply