12.0 - 15.0 years
9 - 13 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool
Good-to-have skills: NA
Minimum experience: 12 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, and engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work together to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and understanding.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Building Tool.
- Strong understanding of data architecture principles and best practices.
- Experience with data integration techniques and tools.
- Familiarity with cloud-based data platforms and services.
- Ability to troubleshoot and resolve data-related issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Building Tool.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 3 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Mumbai, Mangaluru
Hybrid
6 months-3 years of IT experience. Knowledge of BigQuery, SQL, or similar tools. Aware of ETL and data warehouse concepts. Good oral and written communication skills. Great team player, able to work efficiently with minimal supervision. Should have good knowledge of Java or Python to conduct data cleansing.

Preferred:
- Good communication and problem-solving skills
- Experience with Spring Boot would be an added advantage
- Apache Beam development with Google Cloud Bigtable and Google BigQuery is desirable
- Experience in Google Cloud Platform (GCP)
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow)
- Knowledge of microservices, Pub/Sub, Cloud Run, Cloud Functions

Roles and Responsibilities:
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost in large-scale data lakes.
- Write complex, highly optimized queries across large data sets and build data processing layers.
- Interact closely with Data Engineers to identify the right tools for delivering product features by performing POCs.
- Collaborative team player who interacts with business, BAs, and other Data/ML engineers.
- Research new use cases for existing data.
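For illustration, here is a minimal sketch of the kind of Beam batch job this posting describes: the Beam Python SDK reading raw CSV from Cloud Storage and writing to BigQuery. The bucket, table, and schema names are hypothetical placeholders; running it on Dataflow would just mean passing `--runner=DataflowRunner` in the pipeline options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Hypothetical CSV layout: id,name,amount
    parts = line.split(",")
    return {"id": int(parts[0]), "name": parts[1], "amount": float(parts[2])}

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | "ReadRaw" >> beam.io.ReadFromText("gs://example-landing/orders.csv",
                                         skip_header_lines=1)
     | "Parse" >> beam.Map(parse_row)
     | "KeepPositive" >> beam.Filter(lambda r: r["amount"] > 0)
     | "WriteBQ" >> beam.io.WriteToBigQuery(
           "example-project:analytics.orders",  # placeholder table
           schema="id:INTEGER,name:STRING,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```

The same pipeline shape extends to streaming by swapping the text source for a Pub/Sub read and enabling streaming in the options.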
Posted 3 weeks ago
4.0 - 6.0 years
5 - 15 Lacs
Bengaluru
Work from Office
About Apexon: Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, and our deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital

Job Title: Databricks ETL Developer
Experience: 4-6 Years
Location: Hybrid, preferably in Bangalore

Job Description: We are seeking a skilled Databricks ETL Developer with 4 to 6 years of experience in building and maintaining scalable data pipelines and transformation workflows on the Azure Databricks platform.

Key Responsibilities:
- Design, develop, and optimize ETL pipelines using Azure Databricks (Spark).
- Ingest data from various structured and unstructured sources (Azure Data Lake, SQL DBs, APIs).
- Implement data transformation and cleansing logic in PySpark or Scala.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
- Ensure data quality, performance tuning, and error handling in data workflows.
- Schedule and monitor ETL jobs using Azure Data Factory or Databricks Workflows.
- Participate in code reviews and maintain coding best practices.

Required Skills:
- Hands-on experience with Azure Databricks and Spark (PySpark/Scala).
- Strong ETL development experience handling large-scale data.
- Proficient in SQL and working with relational databases.
- Familiarity with Azure Data Lake, Data Factory, and Delta Lake.
- Experience with version control tools like Git.
- Good understanding of data warehousing concepts and data modeling.

Preferred:
- Experience in CI/CD for data pipelines.
- Exposure to BI tools like Power BI for data validation.

Our Commitment to Diversity & Inclusion: Did you know that Apexon has been Certified™ by Great Place To Work®, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK? Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We take affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law.
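As a rough sketch of the transformation-and-cleansing work this role describes, the PySpark snippet below reads a raw file from Azure Data Lake, applies basic cleansing, and writes a Delta table. The storage path and table names are invented placeholders, not Apexon's actual environment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw CSV landed in Azure Data Lake (the abfss path is a placeholder)
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://landing@examplelake.dfs.core.windows.net/orders/"))

# Cleanse: trim keys, drop rows missing identifiers, normalize dates, dedupe
clean = (raw
         .withColumn("customer_id", F.trim(F.col("customer_id")))
         .dropna(subset=["order_id", "customer_id"])
         .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
         .dropDuplicates(["order_id"]))

# Persist as a Delta table, partitioned by date for downstream consumers
(clean.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .saveAsTable("curated.orders"))
```

In practice a job like this would be wired into Azure Data Factory or a Databricks Workflow for scheduling and monitoring, as the responsibilities above note.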
You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)

Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:
- Group Health Insurance covering a family of 4
- Term Insurance and Accident Insurance
- Paid Holidays & Earned Leaves
- Paid Parental Leave
- Learning & Career Development
- Employee Wellness
Posted 3 weeks ago
5.0 - 10.0 years
10 - 16 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
Designation: Senior Data Engineer
Experience: 5+ Years
Location: Navi Mumbai (Juinagar) - WFO; immediate joiners preferred.
Interview: Face-to-face (only 1-day process)

Job Description: We are looking for an experienced and results-driven Senior Data Engineer to join our Data Engineering team. In this role, you will design, develop, and maintain robust data pipelines and infrastructure that enable efficient data flow across our systems. As a senior contributor, you will also help define best practices, mentor junior team members, and contribute to the long-term vision of our data platform. You will work closely with cross-functional teams to deliver reliable, scalable, and high-performance data systems that support critical business intelligence and analytics initiatives.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree is a plus.
- 5+ years of experience in data warehousing, ETL development, and data modeling.
- Strong hands-on experience with one or more databases: Snowflake, Redshift, SQL Server, Oracle, Postgres, Teradata, BigQuery.
- Proficiency in SQL and scripting languages (e.g., Python, Shell).
- Deep knowledge of data modeling techniques and ETL frameworks.
- Excellent communication, analytical thinking, and troubleshooting skills.

Preferred Qualifications:
- Experience with modern data stack tools like dbt, Fivetran, Stitch, Looker, Tableau, or Power BI.
- Knowledge of data lakes, lakehouses, and real-time data streaming (e.g., Kafka).
- Agile/Scrum project experience and version control using Git.

Sincerely,
Sonia TS
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly skilled Ab Initio Developer with over 6 years of total experience and at least 4 years of relevant experience. Your primary responsibilities will include leading the design, development, and optimization of ETL processes using Ab Initio's Graphical Development Environment (GDE). It is essential to ensure data accuracy, consistency, and availability throughout the data integration workflows.

You will be tasked with building, maintaining, and optimizing data integration workflows to facilitate seamless data flow across various systems and platforms. Your expertise in designing intricate data transformations, data cleansing, and data enrichment logic within Ab Initio graphs will be critical. Utilizing Ab Initio's metadata capabilities for documenting data lineage, transformations, and data definitions is essential to ensure transparency and compliance.

Monitoring and optimizing Ab Initio ETL processes for efficiency, scalability, and performance will be part of your routine, and you must address and resolve any bottlenecks that arise. Developing robust error handling and logging mechanisms to track and manage ETL job failures and exceptions is crucial to maintain the integrity of data processes.

Collaboration with cross-functional teams, including data engineers, data analysts, data scientists, and business stakeholders, is necessary to understand requirements and ensure successful delivery of data integration projects. You will use version control systems such as Git to manage Ab Initio code and collaborate effectively with team members, and you will create and maintain comprehensive documentation of Ab Initio graphs, data integration processes, best practices, and standards for the team.

You will also be responsible for investigating and resolving complex ETL-related issues, providing support to team members and users, and conducting root cause analysis when problems arise. Overall, as an Ab Initio Developer, you will be a vital part of the data engineering team, contributing to the design, development, and maintenance of data integration and ETL solutions using Ab Initio's suite of tools.
Posted 3 weeks ago
3.0 - 12.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a Cloud DB Engineer, you will be responsible for designing and developing data pipelines to collect, transform, and store data from various sources in order to support analytics and business intelligence. Your role will involve integrating data from multiple sources, including databases, APIs, and third-party tools, to ensure consistency and accuracy across all data systems.

You will also be tasked with designing, implementing, and optimizing both relational and non-relational databases to facilitate efficient storage, retrieval, and processing of data. Data modeling will be a key aspect of your responsibilities: you will develop and maintain data models that represent data relationships and flows, ensuring structured and accessible data for analysis.

In addition, you will design and implement Extract, Transform, Load (ETL) processes to clean, enrich, and load data into data warehouses or lakes. Monitoring and optimizing the performance of data systems, including database query performance, data pipeline efficiency, and storage utilization, will be crucial to your role.

Collaboration is essential, as you will work closely with data scientists, analysts, and other stakeholders to understand data needs and ensure that data infrastructure aligns with business objectives. You will implement data quality checks and governance processes to maintain data accuracy, completeness, and compliance with relevant regulations. Furthermore, you will create and maintain comprehensive documentation for data pipelines, models, and systems to ensure transparency and efficiency in data management processes.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Skills: Technology - Data Management - Data Integration - Ab Initio
Preferred Skills: Technology - Data Management - Data Integration - Ab Initio
Posted 3 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional Skills: Technology - Data Management - Data Integration - Talend
Preferred Skills: Technology - Data Management - Data Integration - Talend
Posted 3 weeks ago
7.0 - 10.0 years
10 - 12 Lacs
Hyderabad
Work from Office
We are seeking a SQL Developer with 7+ years of experience in enterprise-level database management and excellent communication skills. This role involves handling large datasets (5-10M records) with a focus on performance tuning, deadlock management, and schema design for high-volume data systems. Responsibilities include writing and optimizing SQL code, ETL transformations using SSIS, and managing database transactions. The candidate will also work with MySQL, AWS Data Warehouse, and PostgreSQL, while ensuring scalability and efficiency for tera/petabyte-scale data. Prior experience with data warehouses, conceptualizing schemas, and working with multi-client SaaS solutions is essential. Expertise in tools like AWS QuickSight is preferred. Knowledge of data science and statistics is a plus.
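Since the posting highlights deadlock management on high-volume SQL Server systems, here is one illustrative pattern (not necessarily what this team uses): retrying a transaction when SQL Server chooses it as the deadlock victim (native error 1205), via pyodbc with exponential backoff. The connection string, table, and SQL are placeholders.

```python
import time
import pyodbc

# Placeholder connection string; a real one points at the target SQL Server
CONN_STR = ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=example-host;DATABASE=warehouse;Trusted_Connection=yes")

def execute_with_deadlock_retry(sql, params=(), retries=3, backoff=0.5):
    """Run a statement, retrying when SQL Server picks this session as the
    deadlock victim (native error 1205)."""
    for attempt in range(retries):
        conn = pyodbc.connect(CONN_STR)
        try:
            conn.cursor().execute(sql, params)
            conn.commit()
            return
        except pyodbc.Error as exc:
            conn.rollback()
            # pyodbc surfaces the native error code in the message text
            if "1205" in str(exc) and attempt < retries - 1:
                time.sleep(backoff * (2 ** attempt))  # exponential backoff
            else:
                raise
        finally:
            conn.close()

# Example: a batched update that touches a hot table
execute_with_deadlock_retry(
    "UPDATE dbo.orders SET status = ? WHERE batch_id = ?", ("PROCESSED", 42))
```

Keeping transactions short and batching updates, as sketched here, also reduces how often deadlocks occur in the first place.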
Posted 3 weeks ago
12.0 - 14.0 years
25 - 30 Lacs
Chennai
Work from Office
The Solution Architect - Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
- Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability.
- Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
- Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.
Posted 3 weeks ago
3.0 - 5.0 years
6 - 9 Lacs
Chandigarh
Work from Office
We're looking for a hands-on ETL & BI Engineer to design, build, and maintain robust data pipelines on Google Cloud Platform and turn that trusted data into compelling, actionable reports in Power BI. You'll partner with data architects, analysts, and BI developers to ensure timely delivery of clean, well-modeled data into BigQuery, and translate it into high-impact dashboards and metrics.

Key Responsibilities:
1. Data Ingestion & Landing: Architect and manage landing zones in Cloud Storage for raw feeds. Handle batch and streaming input in Parquet, Avro, CSV, JSON, ORC.
2. ETL Pipeline Development: Develop and orchestrate ETL workflows with Cloud Data Fusion (including Wrangler). Perform data cleansing, imputation, type conversions, joins/unions, pivots.
3. Data Modeling & Semantic Layer: Design star- and snowflake-schema fact and dimension tables in BigQuery. Define and document the semantic layer to support Power BI datasets.
4. Load & Orchestration: Load curated datasets into BigQuery zones (raw, staging, curated). Implement orchestration via scheduled queries, Cloud Composer/Airflow, or Terraform-driven pipelines.
5. Performance, Quality & Monitoring: Tune SQL queries and ETL jobs for throughput, cost-efficiency, and reliability. Implement automated data-quality checks, logging, and alerting.

Required Skills & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years building ETL pipelines on GCP (Cloud Data Fusion, Cloud Storage, BigQuery)
- Solid SQL expertise, including query optimization in BigQuery
- Strong grasp of dimensional modeling (star/snowflake schemas)
- Experience managing Cloud Storage buckets and handling diverse file formats
- Familiarity with orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries)
- Excellent problem-solving skills, attention to detail, and collaborative mindset

Preferred (Nice to Have):
- Experience with other GCP data services (Dataflow, Dataproc)
- Power BI skills: data modeling, report development, DAX calculations, and performance tuning
- Python scripting for custom transformations or orchestration
- Understanding of CI/CD best practices (Git, Terraform, deployment pipelines)
- Knowledge of data governance frameworks and metadata management
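To make the raw/staging/curated load concrete, here is a hypothetical sketch of one orchestration step: a Python task (as could run in Cloud Composer) that uses the google-cloud-bigquery client to MERGE a staging extract into a curated dimension table. The project, dataset, and column names are assumptions for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project id

# Upsert the staged extract into the curated customer dimension
merge_sql = """
MERGE `example-project.curated.dim_customer` AS d
USING `example-project.staging.customers` AS s
ON d.customer_id = s.customer_id
WHEN MATCHED THEN
  UPDATE SET name = s.name, city = s.city, updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (customer_id, name, city, updated_at)
  VALUES (s.customer_id, s.name, s.city, CURRENT_TIMESTAMP())
"""

client.query(merge_sql).result()  # result() blocks until the job finishes
```

The same statement could equally be registered as a BigQuery scheduled query; the Python wrapper is just one of the orchestration options the posting lists.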
Posted 3 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Hybrid
Hiring Now: Data Engineer / ETL Developer (Talend + Power BI)
Location: Bengaluru | Experience: 1-3 Years | Immediate Joiners Preferred (15 days)

About the Role: We are looking for a Data Engineer / ETL Developer passionate about designing and developing robust ETL pipelines and data models using Talend Enterprise and delivering insights through Power BI.

Key Responsibilities:
- Analyze and understand business requirements to provide end-to-end BI solutions.
- Design and implement scalable ETL pipelines using Talend/Informatica.
- Integrate data from Oracle, MSSQL, file systems, FTP, REST APIs, and more.
- Develop data models, document solutions, and build data catalogs.
- Optimize ETL processes, troubleshoot issues, and lead team deliverables.
- Drive data-driven decision-making via reporting and analysis.

Must-Have Skills:
- 1+ years in Talend Enterprise/Open Studio, including tools like TAC, TMC, and Talend API.
- Strong SQL and ETL pipeline development.
- Experience with Java or Python.
- Solid understanding of data modeling and database design.

Good-to-Have Skills:
- Experience with Power BI (dashboards, DAX, visuals).
- Exposure to Talend Data Catalog is a plus.

Soft Skills:
- Analytical thinker with strong communication skills.
- Self-motivated and result-driven.
- Comfortable working in cross-functional, multicultural teams.
Posted 3 weeks ago
4.0 - 8.0 years
17 - 25 Lacs
Gurugram
Work from Office
Role & responsibilities:
- Build and enhance the core data platform, including ETL pipelines and warehouse layers
- Work with large volumes of structured and semi-structured data across PostgreSQL, MySQL, and MongoDB
- Design and maintain analytics-ready datasets to support operational, financial, and compliance reporting
- Write and optimize complex SQL queries to process billions of records efficiently
- Develop transformation logic and data workflows using Python or Groovy
- Ensure data quality, reliability, and performance across on-prem and AWS cloud environments
- Collaborate with engineering, analytics, and product teams to solve business problems using data
- Implement data validation, audit, and alerting mechanisms to maintain platform stability (see the sketch after this list)
- Drive data exploration to identify patterns, gaps, and opportunities for performance improvement
- (For Lead role) Provide technical direction and mentorship across the team, ensuring best practices in design and scalability

Preferred candidate profile:
- 4-8 years of experience in data engineering or backend systems
- Strong skills in SQL and Python/Groovy, and working with PostgreSQL, MySQL, MongoDB
- Hands-on with ETL pipelines, data warehousing, and AWS/cloud platforms
- Experience handling structured and semi-structured data
- Self-driven, flexible, and preferably from a startup or fast-paced environment
- For Lead role: prior team collaboration or mentoring experience is a plus
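The validation-and-alerting bullet above could look something like the following sketch: a set of SQL checks that should each return zero rows on healthy data, run against PostgreSQL with psycopg2, with failures handed off to whatever alerting channel the platform uses. The table names and DSN are hypothetical.

```python
import psycopg2

CHECKS = {
    # check name -> SQL that should return 0 rows when the data is healthy
    "orders_missing_customer": """
        SELECT o.order_id FROM orders o
        LEFT JOIN customers c ON c.customer_id = o.customer_id
        WHERE c.customer_id IS NULL LIMIT 100
    """,
    "negative_amounts": "SELECT order_id FROM orders WHERE amount < 0 LIMIT 100",
}

def run_checks(dsn):
    failures = {}
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            rows = cur.fetchall()
            if rows:
                failures[name] = len(rows)
    return failures  # hand off to an alerting hook (email, Slack, pager, etc.)

if __name__ == "__main__":
    failed = run_checks("dbname=platform user=etl")  # placeholder DSN
    if failed:
        print(f"ALERT: data quality checks failed: {failed}")
```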
Posted 3 weeks ago
7.0 - 10.0 years
15 - 17 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
Greetings!!! This is in regard to a job opportunity for an ETL Developer with Datamatics Global Services Ltd.

Position: ETL Developer
Website: https://www.datamatics.com/
Job Location: Mumbai
****Contract for 3 months****

Job Description:
- 5 years of experience
- Minimum 3 years of experience in Talend & DataStage development
- Expertise in designing and implementing Talend & DataStage ETL jobs
- Strong analytical and problem-solving skills
- Design, develop, and maintain Talend integration solutions
- Collaborate with business stakeholders and IT teams to gather requirements and recommend solutions
- Create and maintain technical documentation
- Perform unit testing and troubleshoot issues
Posted 3 weeks ago
8.0 - 13.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Role: Lead SQL Server Developer
Location: Bangalore (Ashok Nagar)
Experience: 8-10 years of prior experience, including 3+ years of team lead experience.
Education: Bachelor's/Master's Degree in Technology.
Salary: Negotiable
Job Type: Full Time (On Role)
Mode of Work: Work from Office

Job Description - What you will be doing:
- Working on Microsoft SQL Server 2012 and above server-side development.
- Designing schemas that are usable across multiple customers.
- Designing and developing T-SQL (Transact-SQL) stored procedures, functions, triggers, and SSIS packages.
- Developing underlying data models and databases.
- Developing, managing, and maintaining the data dictionary and/or metadata.
- Performance tuning and query optimization.
- Ensuring compliance with standards and conventions in developing programs.
- Analysing and resolving complex issues without oversight from other people.
- Performing quality checks on reports and exports to ensure exceptional quality.
- Working in a Scrum environment.

Preferred Skills:
- Knowledge of tools like Qlik/Qlik Sense and/or other data visualization tools.
- Understanding of .NET code/jQuery experience is a plus.
- Knowledge of the Microsoft reporting tool (SSRS).
- Experience in database administration activities.

Interested candidates kindly share your CV and the below details to usha.sundar@adecco.com:
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years' experience
4) Notice Period
5) Offer in hand
6) Reason for Change
7) Present Location
Posted 3 weeks ago
3.0 - 5.0 years
8 - 12 Lacs
Mumbai
Work from Office
Role Summary:
- Development of functions, stored procedures, and packages
- Development using external tables, bulk statement processing, dynamic statement execution, bind variables, ref cursors, and PL/SQL object types
- SQL statement tuning, reviewing explain plans, and utilizing optimizer hints
- Dealing with large volumes of data (millions/billions of rows) and partitioned tables
- Integration with ETL processes and experience in ETL tools
- Coding applications using best practices and documentation
- Knowledge of Java/J2EE would be an added advantage
- Responsible for unit testing
- Contribute to design improvement and product enhancement
- Demonstrate the ability to understand unique requirements and implement them
- Should be a self-learner, able to work independently and manage tasks in hand

Skills:
- Excellent skills in relational database design
- Knowledge of Oracle, MSSQL, MySQL, MariaDB
- Extract, Transform, Load (ETL) concepts and technologies
- Data warehousing tools, patterns, and processes
- Knowledge of a scripting language (added advantage)
- Web servers: Apache Tomcat, JBoss, WebLogic, and any additional web server
- Knowledge of Java/J2EE frameworks (added advantage)
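Two of the PL/SQL techniques named above, bind variables and ref cursors, are easy to show from the Python side with the python-oracledb driver. The sketch below is illustrative only: the schema, credentials, and the `reporting_pkg.get_monthly_totals` procedure are invented placeholders.

```python
import datetime
import oracledb

# Placeholder credentials/DSN for illustration only
conn = oracledb.connect(user="etl_user", password="secret", dsn="dbhost/orclpdb1")

with conn.cursor() as cur:
    # Bind variables (:region, :cutoff) keep the statement text stable so the
    # parsed plan can be reused instead of hard-parsing on every call
    cur.execute(
        "SELECT order_id, amount FROM orders "
        "WHERE region = :region AND order_date >= :cutoff",
        region="APAC",
        cutoff=datetime.date(2024, 1, 1),
    )
    for order_id, amount in cur:
        print(order_id, amount)

    # Consume a ref cursor returned by a (hypothetical) PL/SQL procedure
    ref = cur.var(oracledb.DB_TYPE_CURSOR)
    cur.callproc("reporting_pkg.get_monthly_totals", [2024, ref])
    for row in ref.getvalue():
        print(row)

conn.close()
```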
Posted 3 weeks ago
3.0 - 8.0 years
0 - 1 Lacs
Chennai
Remote
Seeking an experienced Ab Initio trainer to deliver online training at CourseJet. Must have strong ETL and data warehousing expertise. Flexible schedule. Good remuneration. Apply now!
Posted 3 weeks ago
4.0 - 7.0 years
8 - 18 Lacs
Bengaluru
Remote
Role & Responsibilities - Must-Have Skills:
1. Data Transformation & Correction: Proven experience executing complex data migrations, implementing data corrections, and performing large-scale data transformations with accuracy and efficiency.
2. SQL Mastery: Over 5 years of hands-on experience writing advanced, high-performance T-SQL across diverse platforms, including Microsoft SQL Server.
3. ETL/ELT Development: Demonstrated expertise in architecting, developing, and maintaining robust, scalable ETL/ELT pipelines in enterprise-grade environments.
4. Scripting & Workflow Orchestration: Proficient in scripting languages such as Python, with practical knowledge of orchestration frameworks like Apache Airflow (see the DAG sketch after this list).
5. CI/CD & Version Control: Deep understanding of Git-based workflows and best practices, with experience building and managing automated CI/CD pipelines for database deployments.
6. Customer Engagement: Adept at working directly with clients to gather requirements, communicate technical solutions clearly, and ensure timely project delivery.

Work Timings: 2 PM - 11 PM India Standard Time (IST)
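As a sketch of the Airflow orchestration called out in item 4, here is a minimal DAG that chains extract, transform, and load steps nightly. The DAG id, schedule, and task bodies are placeholders; a real pipeline would swap in the actual migration or correction logic.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source rows")    # placeholder: e.g., query the source system

def transform():
    print("apply corrections")   # placeholder: e.g., fix known bad records

def load():
    print("write to warehouse")  # placeholder: e.g., bulk load the target

with DAG(
    dag_id="nightly_data_correction",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",     # nightly at 02:00
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```

A DAG file like this also slots naturally into the Git-based CI/CD workflow in item 5, since it is deployed as versioned code.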
Posted 4 weeks ago
4.0 - 6.0 years
7 - 15 Lacs
Noida, Bhubaneswar, Greater Noida
Work from Office
- 4-6 yrs in data migration, ETL, and integration
- SQL, Python/Shell scripting
- Experience in GCP, AWS, Azure
- RDBMS & NoSQL (PostgreSQL, MongoDB, etc.)
- ETL tools (Talend, NiFi, Glue)
- Airflow, Kafka, dbt, Dataflow
- Strong problem-solving & communication skills
Posted 4 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements to system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends

Technical and Professional Skills: Technology - Data Management - Data Integration - Informatica
Preferred Skills: Technology - Data Management - Data Integration - Informatica
Posted 4 weeks ago
5.0 - 9.0 years
15 - 27 Lacs
Hyderabad/ Secunderabad, Pune
Hybrid
Software Engineer (COE Retail - India)

Job Description: Hands-on technical role to support US, India, and Canada based engineering and audit teams through client data conversion and the development, delivery, and support of audit tools, audit data, and reports. The ideal candidate combines experience with Microsoft Access application development (VBA), project management, handling of very large datasets (preferably SQL Server), and business understanding to mine data and automate workflows focused on identifying anomalies and errors in our clients' business transactions, whilst possessing excellent written and verbal communication skills.

Essential Functions:
- Perform complex data mining, forensic analysis, and aggregation; commingle multiple datasets to develop a streamlined, efficient representation of transaction lifecycles.
- Data mining for advanced analysis of data, presented in reports and/or interactive auditing tools.
- Rapidly implement technical solutions designed to maximize operational efficiency and productivity.
- Perform data validation to ensure accuracy, completeness, and quality of data.
- Perform large-volume data conversion, data cleansing, and production report generation, upholding scheduled data delivery standards.
- Perform full life cycle project ownership, from analysis to development to delivery of audit solutions.
- Contribute to documentation initiatives as needed.
- Contribute to group knowledge and best practices.
- Interact with auditors and other end users, other analysts, management, and corporate departments.

Key Responsibilities:
- Strong analytical/problem-solving skills, strong communication skills, the ability to translate end-user needs into an IT solution, excellent organizational skills, and attention to detail are critical to the success of all candidates.
- Ability to recognize inefficiencies in processes (operational or technical) and design solutions to address these issues.
- Mastery working with large datasets.
- Solutions are imaginative, thorough, and practicable; contributes to the development of new and innovative solutions.
- Demonstrates self-reliance by (on occasion) working without appreciable direction towards long-range goals and objectives; assignments may be self-initiated and agreed upon with management.
- Contributes to the productivity of the organization through the completion of projects associated with departmental objectives.
- May be a primary point of contact with other engineers, vendors, and consultants engaged in supporting the work activities of the group.
- Communicates formally and informally to management; may also conduct briefings and participate in meetings with internal and external partners or collaborators.
- Owns communication between client service delivery teams and analysts.
- Owns performance and quality of applications and data for client support systems.

Requirements:
- BTech/MCA/equivalent training or certification and 5+ years of relevant experience.
- Advanced SQL Server skills: experience developing and/or maintaining applications, SQL query development, and performance tuning (DBA experience also helpful).
- Any programming experience (especially Microsoft VB/VBA) would be an added advantage.
- Extensive experience with relational databases/RDBMS (Microsoft, Oracle, MySQL, PostgreSQL).
- Experience with scripting languages (VBScript, PowerShell) preferred.
- Experience working with ETL processes.
- Any .NET C#, Java, or VB.NET web and/or Windows development would be an advantage.
- Excellent verbal and written communication skills.
- Skilled in Agile and Scrum development methodologies (Kanban).
- Loves problem solving and troubleshooting.
- Comfortable working with team members that are remote and located in the US, India, or other geographies.
- Experience working across multiple clients on multiple projects simultaneously.
- Stakeholder management: collaborates, takes ownership and accountability.
- Ability to work within a matrix organization.
- Strong organizational skills and adaptive capacity for rapidly changing priorities and workloads.
- Experience in project management is a plus.
Posted 4 weeks ago
5.0 - 7.0 years
15 - 20 Lacs
Thiruvananthapuram
Work from Office
JD for SAP BODS Data Engineer

Role Proficiency:
- Strong proficiency in designing, developing, and implementing robust ETL solutions using SAP BusinessObjects Data Services (BODS), with strong EDW experience.
- Strong proficiency in SAP BODS development, including job design, data flow creation, scripting, and debugging.
- Design and develop ETL processes using SAP BODS to extract, transform, and load data from various sources.
- Create and maintain data integration workflows, ensuring optimal performance and scalability.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying and manipulation.
- Familiarity with data modeling concepts and database systems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills for effective collaboration.
- Ability to work independently and manage multiple tasks simultaneously.
- 3+ years of relevant ETL development experience (SAP BODS).

Required Skills: Data Warehousing, SAP BODS, ETL, EDW
Posted 1 month ago
7.0 - 12.0 years
10 - 20 Lacs
Pune
Hybrid
Pune | Data Engineer (ETL Developer), 7+ yrs experience.
Must have: 1. Python, PySpark; 2. Strong SQL coding skills.
Tech stack: Python, PySpark, SQL, Vertica, Oracle. Good to have: AWS.
Posted 1 month ago
6.0 - 10.0 years
15 - 30 Lacs
Kolkata, Mumbai (All Areas)
Work from Office
Experience: 6 to 10 Years
Job Locations: Kolkata & Mumbai
Notice Period: 30 Days
Job Role: ETL Lead

ETL Lead with strong AWS expertise: AWS Glue, Lambda, RDS (e.g., MySQL).

Primary Responsibilities:
- 6 to 9 years of experience in data engineering or ETL development.
- Proven expertise in AWS Glue, Lambda, S3, and RDS (MySQL) for ETL workflows.
- Strong SQL and Python/PySpark development skills.
- Solid understanding of data warehousing concepts and data modeling (star/snowflake schemas).
- Experience delivering data solutions consumed by Power BI dashboards.
- Ability to lead and manage a small team of developers.
- Collaborate with business analysts and Power BI developers to define data requirements and ensure datasets are optimized for reporting performance.
- Monitor, troubleshoot, and optimize ETL job execution and resource usage.
- Create and maintain technical documentation of data flows, schema definitions, and transformation logic.
- Support data refresh schedules and coordinate with teams for data availability in Power BI.
- Excellent communication and stakeholder management skills.
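For a flavor of the Glue-based workflows this role leads, below is a minimal Glue PySpark job skeleton: read from a (hypothetical) Data Catalog table over S3, remap columns, and write curated Parquet for the Power BI layer. The database, table, and bucket names are placeholders.

```python
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw files from S3 via the Glue Data Catalog (names are placeholders)
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="sales_events"
)

# Rename/cast columns into the shape the reporting layer expects
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("event_id", "string", "event_id", "string"),
        ("amt", "double", "amount", "double"),
        ("ts", "string", "event_time", "timestamp"),
    ],
)

# Write curated Parquet back to S3 for Power BI-facing datasets
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/sales/"},
    format="parquet",
)
job.commit()
```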
Posted 1 month ago
4.0 - 6.0 years
7 - 17 Lacs
Hyderabad
Work from Office
About this role: Wells Fargo is seeking a Senior Software Engineer. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow.

In this role, you will:
- Lead moderately complex initiatives and deliverables within technical domain environments
- Contribute to large-scale planning of strategies
- Design, code, test, debug, and document for projects and programs associated with the technology domain, including upgrades and deployments
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Resolve moderately complex issues and lead a team to meet existing client needs or potential new clients' needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
- Lead projects and act as an escalation point; provide guidance and direction to less experienced staff

Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, or education.

Desired Qualifications:
- Analyze, code, test, debug, and document enhancements to data warehouse applications.
- Responsible for the implementation of data flows to connect information security data sources for analytics and business intelligence systems.
- Work within an agile team model to deliver application enhancement releases.
- Self-assign stories from a product backlog and collaborate with the Scrum Master/Kanban Lead, Product Owners, Engineers, and User Acceptance Testers to deliver on business stories.
- Relevant and good experience in relational databases, querying, data warehousing, ETL processes, requirements gathering, and/or decision support tools.
- Hands-on experience in SQL, especially in SQL Server & Teradata environments.
- Work with a variety of data ingestion patterns (NDM files, direct DB connections, and APIs) to receive data in the warehouse.
- Collaborate with peers, colleagues, and managers to resolve technical challenges and achieve goals.
- Ability to provide adequate support for resolution of production issues.
- 3+ years of ETL development experience (Ab Initio)
- 2+ years of Teradata experience
- 2+ years of experience with SQL
- Experience designing and optimizing complex SQL and/or SAS queries
- Experience working in operational reporting
- Experience with Agile Scrum (daily standup, planning, and retrospective meetings) and Kanban
- 2+ years of experience with modern software engineering technologies and tool sets
Posted 1 month ago