4 - 7 years
8 - 12 Lacs
Noida
Work from Office
JOB SUMMARY / PRIMARY PURPOSE OF JOB
The Data Warehouse Developer participates in the design, development, documentation, and maintenance of data movement, transformation, integration, and analytics solutions. This role works across various platforms and tools in both cloud and on-prem environments. This position provides 24x7 support for Starkey global facilities as well as HQ.

JOB RESPONSIBILITIES/RESULTS
- Analyze customer requirements and translate them into technical requirements.
- Code, test, debug, document, and implement ETL solutions.
- Manage delivery to both the functional and non-functional requirements, including performance, scalability, availability, reliability, and security.
- Design and develop solutions that transform and integrate structured and semi-structured data across various platforms, including cloud (SaaS/PaaS/IaaS) and on-premises sources and targets.
- Participate in all Agile ceremonies, including the daily scrum.
- Collaborate with the infrastructure team, DBAs, and other software developers to resolve problems with software products or company software systems.
- Work with other technical engineers across IT to develop ETL solutions, and follow corporate standards for databases, data engineering, and analytics.
- Provide accurate estimates and clearly communicate task status and identified risks.
- Complete all phases of the SDLC, including analysis, design, development, testing, and support, using Scrum/Agile methodologies.
- Demonstrate self-drive, high motivation, and excellent communication skills.
- Other duties/responsibilities as assigned.

JOB REQUIREMENTS
Minimum Education, Certification and Experience Requirements
- Education: 4-year degree or equivalent experience.
- Experience: 4+ years in software development and/or data engineering roles, preferably with 3+ years focused on data engineering.

Knowledge / Technical Requirements
- Expertise in data transformations across disparate platforms.
- Expertise in data manipulation with both structured and semi-structured data.
- Expertise in designing and populating data stores for various purposes, including analytics.
- Expertise in both traditional and newer data architectures (e.g., data lake).
- Hands-on knowledge of cloud-native data tooling, ideally in Azure; GCP is a bonus.
- Hands-on knowledge of traditional ETL tooling, such as Microsoft SSIS, SSRS, SSAS, and Power BI.
- Knowledge of Azure Data Factory is a plus.

Competencies, Skills & Abilities
- Ability to create clear presentations and documentation.
- Self-motivating.
- Builds partnerships with customers.
- Demonstrates credibility.
Posted 2 months ago
5 - 8 years
14 - 20 Lacs
Kolkata
Work from Office
JD for ETL Developer in SAS

Location: Kolkata
Working hours: As per banking hours
Experience: Min 5+ years

Responsibilities:
- Design, develop, and implement ETL solutions using industry best practices and tools, with a strong focus on SAS.
- Develop and maintain SAS programs for data extraction, transformation, and loading.
- Work with source system owners and data analysts to understand data requirements and translate them into ETL specifications.
- Build and maintain data pipelines for the banking database to ensure data quality, integrity, and consistency.
- Perform data profiling, data cleansing, and data validation to ensure accuracy and reliability of data.
- Troubleshoot and resolve the bank's ETL-related issues, including data quality problems and performance bottlenecks.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer, preferably within the banking or financial services industry.
- Strong proficiency in SAS programming for data manipulation and ETL processes.
- Experience with other ETL tools (e.g., Informatica PowerCenter, DataStage, Talend) is a plus.
- Solid understanding of data warehousing concepts, including dimensional modeling (star schema, snowflake schema).
- Experience working with relational databases (e.g., Oracle, SQL Server) and SQL.
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Understanding of regulatory requirements in the banking sector (e.g., RBI guidelines) is an advantage.

Preferred Skills:
- Experience with cloud-based data warehousing solutions (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
- Knowledge of big data technologies (e.g., Hadoop, Spark).
- Experience with agile development methodologies.
- Relevant certifications (e.g., SAS Certified Professional).

Drop your resume at nikita.chaudhary@enlinkit.com or WhatsApp 8879637539.
Posted 2 months ago
5 - 7 years
15 - 18 Lacs
Hyderabad
Work from Office
Job Title: Spark & Delta Lake Developer
Location: Hyderabad
Experience: 3-7 years

Job Description:
We are seeking an experienced Spark & Delta Lake Developer to join our dynamic team. The ideal candidate will have a deep understanding of Spark core concepts, expertise in Delta Lake, and experience designing and implementing efficient data pipelines.

Key Responsibilities:
- Understand and work with Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Apply Spark optimization techniques to enhance performance.
- Leverage Delta Lake features including time travel, schema evolution, and data partitioning (illustrated in the sketch below).
- Design and implement robust data pipelines using Spark, with Delta Lake as the data storage layer.
- Develop solutions using Python, Scala, or Java, and integrate them with the ETL process.
- Perform data ingestion from various sources including flat files, CSV, APIs, and databases.
- Ensure data quality through best practices and data validation techniques.

Required Skills & Experience:
- Strong understanding of data warehouse concepts and data modeling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Experience with DataStage, Prophecy, Informatica, or Ab Initio is a plus.

Qualifications:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- Proficiency in Python, Scala, or Java.
- Hands-on experience with Spark and Delta Lake.

If you're passionate about building scalable data solutions and have the required expertise, apply now! Interested candidates can apply via Naukri or share a resume with kamna@prointegrate.net, or call Kamna (HR) directly on 6362744117.
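As a quick illustration of the Delta Lake features named above, here is a minimal PySpark sketch; the table path, column names, and session configuration are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession

# Spark session with Delta Lake extensions enabled (requires the
# delta-spark package on the classpath).
spark = (
    SparkSession.builder
    .appName("delta-lake-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/tmp/events_delta"  # illustrative table path

# Initial load: write a DataFrame as a Delta table (version 0).
spark.createDataFrame([(1, "signup"), (2, "login")], ["id", "event"]) \
    .write.format("delta").mode("overwrite").save(path)

# Schema evolution: append rows that carry a new column.
spark.createDataFrame([(3, "login", "IN")], ["id", "event", "country"]) \
    .write.format("delta").mode("append") \
    .option("mergeSchema", "true").save(path)

# Time travel: read the table as it was at version 0.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
v0.show()
```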
Posted 2 months ago
7 - 12 years
18 - 30 Lacs
Ahmedabad
Hybrid
Position Summary
We are looking for an experienced Business Intelligence (BI) Developer to join our team! The BI Developer is a highly motivated and experienced professional with wide knowledge of developing, building, and supporting Business Intelligence, Data Warehouse, OLAP, Reporting, Data Mining, and Data Integration solutions. In this hands-on role, the BI Developer will be responsible for building and loading enterprise information using data warehouses and marts, designing cube models, and building dashboards and reports.

Duties of the BI Developer
Tasks of the BI Developer may include, but are not limited to, the following:
- Maintain, support, and enhance the business intelligence data backend, including data warehouses.
- Map configurations and complex data architectures, ensuring documentation meets current and forecasted needs.
- Collaborate and work with end users to ensure that data and reports meet their business needs.
- Design and implement data storage and governance systems.
- Design and monitor end-user reporting tools and systems, ensuring reports are accurate and up to date.
- Coordinate with business managers to translate business requirements into coherent Business Intelligence (BI) reports.
- Generate BI reports, dashboards, and data models with the help of BI tools and MS Office products.
- Manage upgrades, modifications, maintenance, and troubleshooting of BI systems.
- Design, develop, test, and implement data warehouse systems for BI reporting and analysis.
- Prepare and maintain BI reporting and analysis documentation.
- Identify opportunities to improve processes and strategies with technology solutions.
- Identify development needs in order to improve and streamline operations.
- Provide recommendations to enhance BI system performance and capacity.

Experience and Skills Requirements
- Minimum of 5 years' experience working with data warehousing implementations and dimensional modelling.
- Solid understanding of the data warehousing end-to-end development life cycle.
- Strong SQL programming skills and solid understanding of relational databases and data architecture.
- SQL Server skills including SQL object development, performance tuning, data analysis, and integration.
- Practical experience with ETL development and associated tools.
- Experience with enterprise repository tools, data modelling tools, data mapping tools, and data profiling tools.
- Ability to read code and support applications, reports, and processes.
- Experience with Agile sprints, SQL, and HTML.
- Experience with key BI market tools, including MS Power BI.
- Excellent analytical and problem-solving skills.
- Excellent communication, interpersonal, and presentation skills.
- Proven ability to take initiative and be innovative.

Education and Training Requirements
The minimum post-secondary education requirement is a bachelor's degree with concentrated study in Computer Science, Engineering, or other related disciplines. Related BI training or certification would be an asset.

Person Specification
You are a self-starter and a quick learner. You can master multiple technologies to be effective in a complex integrated environment of multiple subsystems and technologies. You are also a seasoned programmer who is not afraid to experiment to come up with the best programming solution or optimization strategy. You possess an open mindset when it comes to new design and always welcome ideas and constructive critiques from your peers and superiors.

https://www.linkedin.com/in/nirav-saujani/
#BIDeveloper #Datascientist #Remotejob #sql
Posted 3 months ago
4 - 6 years
6 - 8 Lacs
Gurgaon
Work from Office
BI Specialist, OBIEE Developer
Beamstacks, MG Road, Gurgaon.

The Business Intelligence (BI) Specialist is responsible for the design, development, implementation, management, and support of mission-critical enterprise BI reporting and Extract, Transform, Load (ETL) processes and environments.
- Exposure to one or more implementations using OBIEE development and administration.
- Must have 6 years' development experience in PL/SQL.
- Experience in developing the OBIEE Repository at all three layers (Physical, Business Model, and Presentation), plus interactive dashboards with drill-down capabilities using global filters and security setups.
- Must have 3 years of experience in data modeling, ETL development (preferably OWB), ETL and BI tools installation and configuration, and Oracle APEX.
- Experience in developing OBIEE Analytics interactive dashboards with drill-down capabilities using global and local filters, OBIEE security setup (users/groups, access/query privileges), configuring OBIEE Analytics metadata objects (Subject Area, Table, Column), and Presentation Services/Web Catalog objects (Dashboards, Pages, Folders, Reports).
- Hands-on development experience with OBIEE (version 11g or higher) and data modelling.
- Experience installing and configuring Oracle OBIEE in multiple life-cycle environments.
- Experience creating system architecture design documentation.
- Experience presenting system architectures to management and technical stakeholders.
- Technical and functional understanding of Oracle OBIEE technologies.
- Good knowledge of OBIEE administration, best practices, and DWBI implementation challenges.
- Understanding and knowledge of data warehousing.
- Must have OBIEE certification on version 11g or higher.
- Experience with ETL tools.
- Experience on HP Vertica.
- Domain knowledge of Supply Chain, Retail, or Manufacturing.
- Developing architectural solutions utilizing OBIEE.
- Work with project management to provide effort estimates and timelines.
- Interact with business and IT team members to move the project forward on a daily basis.
- Lead the development of OBIEE dashboards and reports.
- Work with internal stakeholders and development teams during the project lifecycle.
Posted 3 months ago
4 - 7 years
6 - 9 Lacs
Kolkata
Work from Office
Job Title: ETL Developer / Data Engineer
Location: Remote (India, Ahmedabad preferred)
Shift: UK shift

We are looking for an experienced ETL Developer / Data Engineer with 6+ years of experience in Informatica PowerCenter and DBT to design, develop, and optimize data pipelines for large-scale data processing and analytics.

Key Responsibilities:
- Develop, maintain, and optimize ETL workflows using Informatica PowerCenter.
- Implement data transformations and modeling using DBT.
- Write and optimize SQL queries for data extraction and transformation.
- Ensure data quality, validation, and performance tuning of ETL processes.
- Collaborate with cross-functional teams to understand data requirements and provide solutions.
- Monitor, troubleshoot, and enhance ETL performance for efficiency and reliability.

Requirements:
- 6+ years of experience in ETL development with Informatica PowerCenter.
- Hands-on experience with DBT for data transformation.
- Strong expertise in SQL and query optimization.
- Experience in data modeling and ETL performance tuning.
- Ability to work in a UK shift and collaborate with global teams.

Job Type: Full-time
Schedule: UK shift
Education: Bachelor's (Required)
Experience:
- Informatica PowerCenter: 4 years (Required)
- DBT (Data Build Tool): 3 years (Required)
Work Location: Remote
(ref:hirist.tech)
Posted 3 months ago
5 - 10 years
8 - 14 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Job Responsibilities:

1. ETL Development and Data Processing:
- Design, develop, and implement ETL processes to extract, transform, and load data from various sources into data warehouses or analytics systems.
- Maintain and enhance ETL pipelines to ensure data quality, integrity, and consistency.

2. DevOps and CI/CD Pipeline Management:
- Implement and support CI/CD pipelines for Enterprise Data Warehouse (EDW) code development and deployment using GitLab.
- Manage and configure workflows for code repositories, ensuring seamless code integration and promotion across environments.

3. Data Analysis and Reporting:
- Apply statistical analysis and data mining techniques to uncover trends, patterns, and insights within large datasets.
- Develop and maintain dashboards, reports, and visualizations to communicate key insights to stakeholders.

4. Programming and Scripting:
- Develop and maintain Linux shell scripting, Python, and PL/SQL code for data extraction, transformation, and automation tasks (see the sketch below).
- Create efficient scripts for data replication, migration, and data integration between systems.

5. Quality Assurance and System Optimization:
- Perform quality assurance checks on system installations, configurations, and code to ensure performance and scalability.
- Recommend improvements to optimize performance and reliability of data processes and systems.

6. Business Understanding and Translation:
- Work closely with business stakeholders to understand their goals, processes, and requirements, translating them into technical data solutions.
- Ensure alignment between technical solutions and business objectives to drive strategic decision-making.

7. Data Governance and Security:
- Implement data governance best practices to ensure the security, integrity, and privacy of enterprise data.
- Adhere to data security regulations and policies, ensuring that data is handled with confidentiality and compliance.

8. Collaboration and Stakeholder Communication:
- Collaborate with cross-functional teams, including data analysts, engineers, and business teams, to deliver impactful data solutions.
- Present data findings and actionable insights to stakeholders at various organizational levels, both in writing and verbally.

Required Qualifications:

Experience:
- 8+ years of experience in IT with a focus on Business Intelligence, Data Warehousing, and Analytics systems.
- Proven experience with DevOps methodologies, CI/CD pipelines, and GitLab workflows.
- Proficient in ETL development and creating data workflows in SQL, PL/SQL, Python, and Linux shell scripting.

Technical Skills:
- Strong knowledge of open-source technologies, Linux, and GitLab for codebase management.
- Experience in data replication and data migration processes.
- Hands-on experience with Python and Node.js for programming and automation tasks.

Soft Skills:
- Excellent problem-solving and analytical abilities.
- Strong communication skills with the ability to convey complex technical concepts to non-technical stakeholders.
- Ability to manage multiple projects in a dynamic and fast-paced environment.

Preferred Skills:
- Experience with data governance principles and ensuring data security compliance.
- Familiarity with data quality assurance and optimization techniques.
- Understanding of modern data platforms such as cloud-based data warehouses and data lakes.

Location: Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
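To make the scripting item above concrete, here is a minimal, runnable extract-transform-load sketch in Python. It uses sqlite3 purely as a self-contained stand-in for the real source and target databases (the posting implies Oracle/PL/SQL and a warehouse); all table and column names are assumptions.

```python
import sqlite3

# Stand-in source and target databases; sqlite3 keeps the sketch runnable.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE orders (id INTEGER, amount REAL, status TEXT)")
src.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "OK"), (2, -5.0, "OK"), (3, 80.0, "VOID")],
)

tgt.execute("CREATE TABLE stg_orders (id INTEGER, amount REAL)")

# Extract + transform: keep only valid orders with a positive amount.
rows = src.execute(
    "SELECT id, amount FROM orders WHERE status = 'OK' AND amount > 0"
).fetchall()

# Load into the staging table and commit.
tgt.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
tgt.commit()

print(tgt.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0])  # -> 1
```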
Posted 3 months ago
2 - 7 years
4 - 9 Lacs
Mumbai
Work from Office
Overall Responsibilities:
- Perform business/data analysis to investigate various business problems and propose solutions, working closely with clients and the team.
- Analyze report requirements, perform gap analysis, understand existing systems and data flows, model required changes, and source, enrich, and publish data to Axiom for final reporting.
- Conduct detailed requirements analysis with end users to understand business (reporting) needs.
- Optimize performance and scalability to support large-scale deployments.
- Define new features in conjunction with product management and provide specifications.
- Ensure quality and completeness of the final product through unit testing, documentation, and maintenance as appropriate.

Technical Skills:
Required Skills:
- 4+ years' experience in database development (SQL/data modeling/stored procedures).
- 2+ years of experience in ETL programming/data engineering (tools like Informatica not necessary).
- Experience with job scheduler tools (e.g., Autosys).
- Proficiency in UNIX.

Preferred Skills:
- Experience with Axiom CV10.
- Understanding of U.S. risk and regulatory requirements.
- Familiarity with DevOps tooling.
- Experience with Java/Scala/Python and the Spark framework.
- Exposure to the BFSI (Banking, Financial Services, and Insurance) and finance industry.

Experience:
- Total and relevant experience: 5 to 7 years.
- Proven experience in database development and ETL programming/data engineering.
- Demonstrated ability to work with job scheduler tools and UNIX.

Day-to-Day Activities:
- Perform business/data analysis to investigate business problems and propose solutions.
- Analyze report requirements and perform gap analysis.
- Model required changes and manage data flows for reporting.
- Optimize performance and scalability for large-scale deployments.
- Define new features and provide specifications with product management.
- Ensure quality through unit testing, documentation, and maintenance.
- Collaborate with clients and team members to understand business needs.
- Work with new technologies and apply them towards enterprise-level data solutions.
- Handle multiple tasks and work against deadlines/priorities.

Qualification:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal abilities.
- High interest in the finance business and data analysis; interest in accounting preferred.
- Ability to work effectively in a global team and team development environment.
- Eager to learn and apply new technologies.

Soft Skills:
- High interest in the finance business and data analysis; interest in accounting preferred.
- Ownership/accountability and desire to take on an expanded role over time.
- Ability to multi-task and work against deadlines/priorities.
- Eager to work with new technologies and apply them towards enterprise-level data solutions.
- Capable of working with a global team and thriving in a team development environment.
- Advanced problem-solving skills and the ability to deal with real-world business issues.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture, promoting equality and diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company.
We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice
Posted 3 months ago
4 - 8 years
6 - 10 Lacs
Bengaluru
Work from Office
About The Role:
Synechron is seeking a talented ETL Developer to join our innovative team. The ideal candidate will be passionate about data transformation and possess strong technical skills in ETL tools, relational and NoSQL databases, data warehousing solutions, and programming languages. This role requires a fresher with a strong understanding of data processing and manipulation, and the ability to work in a dynamic, fast-paced environment.

Overall Responsibilities:
- Develop, maintain, and optimize ETL processes using Informatica PowerCenter and other ETL tools.
- Design and implement data warehousing solutions, including architectures like Kimball and Inmon.
- Work with relational databases (MySQL, Oracle, SQL Server) and NoSQL databases (MongoDB, Cassandra).
- Utilize programming languages and scripting for data manipulation and transformation.
- Ensure data accuracy, consistency, and integrity throughout ETL processes.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Stay updated with the latest technological advancements and industry trends.

Technical Skills:
Primary Skills:
- Experience with ETL tools like Informatica PowerCenter.
- Strong understanding of relational databases (MySQL, Oracle, SQL Server) and SQL.
- Familiarity with data warehousing architectures (Kimball, Inmon) and technologies (Amazon Redshift, Google BigQuery, Snowflake).
- Proficiency in scripting and programming languages for data manipulation and transformation.

Secondary Skills:
- Knowledge of NoSQL databases (MongoDB, Cassandra).
- Understanding of big data technologies for large-scale data processing.

Experience:
- Total & relevant experience: 4+ years.

Day-to-Day Activities:
- Develop and maintain ETL processes to support data integration and transformation.
- Implement and optimize data warehousing solutions.
- Work with relational and NoSQL databases to ensure data accuracy and integrity.
- Collaborate with cross-functional teams to gather data requirements and deliver solutions.
- Use programming languages and scripting for data manipulation.
- Stay up to date with the latest technological advancements and industry trends.

Qualifications:
- Any graduate or postgraduate.
- Relevant certifications in ETL tools and data warehousing (preferred).

Soft Skills:
- Strong communication skills, with proficiency in English.
- Excellent problem-solving and critical thinking skills.
- Ability to work independently and collaboratively in a team environment.
- Detail-oriented with a focus on data accuracy and integrity.
- Eager to learn and apply new programming concepts and technologies.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture, promoting equality and diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice
Posted 3 months ago
2 - 4 years
10 - 20 Lacs
Mumbai
Hybrid
As a Deputy Manager in the Data Engineering department, you will be responsible for managing and organizing data structures, designing and implementing scalable ETL processes, and ensuring high data quality and integrity. You will work closely with various teams to understand and fulfill their data requirements, and contribute to the overall data strategy of the organization.
Posted 3 months ago
11 - 14 years
37 - 40 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and implement ETL processes using Ab Initio software.
- Collaborate with business stakeholders and technical teams to gather requirements and define data integration solutions.
- Develop and maintain technical documentation, including design specifications, data mappings, and process flows.
- Perform data analysis and profiling to ensure data quality and consistency.
- Troubleshoot and debug ETL processes, identifying and resolving issues in a timely manner.
- Optimize performance of ETL jobs to meet SLAs and performance requirements.
- Provide technical support and assistance to users, resolving issues and answering questions related to Ab Initio solutions.
- Stay up to date with industry trends and best practices in data integration and ETL development.

Skills and Qualifications:
- 12-18 years of experience in designing and developing ETL solutions using Ab Initio software.
- Strong understanding of data integration concepts and methodologies.
- Proficiency in SQL, Unix/Linux, and shell scripting.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Experience with other data integration tools and technologies is a plus.
- Ab Initio certification is desirable.

Band: P1
Competency: Data & Analytics
Posted 3 months ago
11 - 14 years
35 - 40 Lacs
Pune, Bengaluru
Work from Office
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Iceberg, Delta, or Hudi.
- Desirable to have experience provisioning AWS data analytics resources with Terraform.

Must-Have Skills:
- AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development (a minimal Glue job sketch follows).

Good-to-Have Skills:
- Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
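As referenced above, a minimal AWS Glue job sketch in PySpark. The structure follows standard Glue job boilerplate; the bucket paths and column names are illustrative assumptions.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue boilerplate: resolve job arguments, build the Glue
# context, and initialize the job for commit semantics.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw data from S3, apply a simple transform, write curated output.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")
curated = raw.dropDuplicates(["order_id"]).filter("amount > 0")
(curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/"))

job.commit()
```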
Posted 3 months ago
4 - 5 years
10 - 15 Lacs
Navi Mumbai, Bhopal, Gurgaon
Work from Office
Role & Responsibilities:
- Design and maintain ETL workflows for high-volume SMS traffic.
- Build real-time and batch processing pipelines using Kafka/Spark/Flink/Hadoop.
- Optimize SQL/NoSQL databases for high-speed data retrieval.
- Implement data encryption, masking, and access control for security and compliance.
- Automate data ingestion and processing using Python, Airflow, or shell scripting (see the DAG sketch below).
- ETL tools: design and maintain ETL using Python / Apache NiFi / Airflow / Talend / AWS Glue or any orchestration tool.
- SQL query tuning and database performance optimization.
- OLAP & OLTP systems, ensuring efficient data lakes and data warehousing.
- Big data & streaming: Kafka / Spark / Flink / Hadoop or similar.
- Databases: MySQL / PostgreSQL / MongoDB / Cassandra, Redis or similar.
- Programming: Python, SQL, Scala, Java.

Education:
- Education level: B.Tech / M.Tech / MCA.

Experience:
- 4-5 relevant years.

Preferred Qualifications:
- Experience in cloud-based data services (AWS, GCP, Azure).
- Knowledge of AI-driven data pipelines & real-time fraud detection.
- Certifications in Data Engineering (Google, AWS, etc.).

This will be 100% work from office.
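As referenced in the list above, a minimal Airflow DAG sketch for scheduled ingestion. The DAG id, schedule, and callable body are illustrative assumptions; a real pipeline would read from Kafka or files and write to a staging store.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_sms_batch(**context):
    """Placeholder callable: pull a batch of SMS delivery records from the
    upstream source and stage them for downstream processing."""
    # Real implementations would consume from Kafka/files here; omitted.


with DAG(
    dag_id="sms_traffic_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="*/15 * * * *",  # every 15 minutes
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="ingest_batch", python_callable=ingest_sms_batch)
```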
Posted 3 months ago
3 - 6 years
8 - 18 Lacs
Pune
Hybrid
We are seeking a highly skilled Senior Engineer with expertise in ETL processes, SQL, GCP (Google Cloud Platform), and Python. As a Senior Engineer, you will play a key role in the design, development, and optimization of data pipelines and workflows that drive business insights and analytics. You will collaborate closely with cross-functional teams to ensure data systems are scalable, robust, and efficient.

Key Responsibilities:
- Design & Develop ETL Pipelines: Build, maintain, and optimize scalable ETL workflows to ingest, transform, and load large datasets using best practices.
- Database Management: Write efficient, optimized SQL queries to extract, manipulate, and aggregate data from relational databases. Design and implement database schemas for optimal performance.
- Cloud Infrastructure: Utilize Google Cloud Platform (GCP) services, such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage, to develop and manage cloud-based data solutions (a minimal BigQuery sketch follows this posting).
- Automation & Scripting: Use Python to automate processes, build custom data transformation logic, and integrate with various data systems and services.
- Performance Tuning: Ensure the performance of data pipelines and queries is optimized for speed and cost. Troubleshoot issues, implement best practices, and improve system performance.
- Collaboration & Mentorship: Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements. Provide mentorship and guidance to junior engineers.
- Data Quality & Governance: Ensure high-quality data through validation, error handling, logging, and monitoring of ETL processes. Implement data governance best practices to maintain consistency and integrity.
- Documentation & Reporting: Document data pipeline designs, coding standards, and best practices. Create reports for stakeholders to provide insights into data processing activities.

Required Skills and Qualifications:
- ETL Expertise: Strong experience in building, deploying, and optimizing ETL processes with tools like Apache Airflow, Dataflow, or custom Python scripts.
- SQL Proficiency: Advanced SQL skills with experience in writing complex queries, optimizing performance, and working with large datasets.
- Cloud Platform (GCP): Deep understanding and hands-on experience with Google Cloud Platform, specifically BigQuery, Cloud Storage, Pub/Sub, Dataflow, and other data-related services.
- Python: Proficient in Python, especially for data manipulation, ETL automation, and integration with cloud-based solutions.
- Data Engineering Best Practices: Familiarity with modern data engineering frameworks, version control, CI/CD pipelines, and agile methodologies.
- Problem Solving: Strong analytical and troubleshooting skills with a focus on identifying solutions to complex data engineering challenges.
- Communication: Excellent communication skills; able to work effectively in a team and engage with non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud platforms such as AWS or Azure.
- Knowledge of data lake and data warehouse architectures.
- Familiarity with containerization (Docker, Kubernetes) and orchestration tools.
- Understanding of data privacy, security, and compliance best practices.

Education & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
- 5+ years of experience in data engineering or a related technical field.
- Proven experience in designing and implementing data solutions in a cloud environment.
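As referenced above, a minimal sketch of querying BigQuery from Python with the official google-cloud-bigquery client. The project, dataset, and table names are illustrative assumptions; authentication relies on Application Default Credentials.

```python
from google.cloud import bigquery

# Client picks up credentials and project from the environment.
client = bigquery.Client()

query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `example-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query server-side and stream the result rows.
for row in client.query(query).result():
    print(row["order_date"], row["revenue"])
```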
Posted 3 months ago
6 - 11 years
12 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Dear Candidate,

Greetings for the day!

Job description:
- Responsible for delivering highly available heterogeneous database servers on multiple technology platforms.
- Strong in SQL; knowledge of Python with ETL.
- Lead all database maintenance and tuning activities, ensuring continued availability, performance, and capacity of all database services across every business application and system.
- Expected to consider current practices to develop innovative and reliable solutions that will continuously improve service quality to the business.
- Create/update/maintain the data warehouse for business/product reporting needs.
- Refine physical database design to meet system performance requirements.
- Identify inefficiencies in current databases and investigate solutions.
- Diagnose and resolve database access and performance issues.
- Develop, implement, and maintain change control and testing processes for modifications to databases.
- Ensure all database systems meet business and performance requirements.
- Coordinate and work with other technical staff to develop relational databases and data warehouses.
- Advocate and implement standards, best practices, technical troubleshooting processes, and quality assurance.
- Develop and maintain database documentation, including data standards, procedures, and definitions for the data dictionary.
- Produce ad-hoc queries and develop reports to support business needs.
- Create and maintain technical documentation.
- Perform other management-assigned tasks as required.

Qualifications

Must-Haves:
- Bachelor's or Master's degree in Computer Science, Mathematics, or another STEM discipline.
- 3+ years of experience working with relational databases (e.g., Redshift, PostgreSQL, Oracle, MySQL).
- 2+ years of experience with NoSQL database solutions (e.g., MongoDB, DynamoDB, Hadoop/HBase, etc.).
- 3+ years of experience with ETL/ELT tools (e.g., Talend, Informatica, AWS Data Pipeline; preferably Talend).
- Strong knowledge of data warehousing basics, relational database management systems, and dimensional modelling (star schema and snowflake schema).
- Configuration of ETL ecosystems and regular data maintenance activities such as data loads, data fixes, schema updates, database copies, etc.
- Experienced in data cleansing, enterprise data architecting, data quality, and data governance.
- Good understanding of Redshift database design using distribution style, sorting, and encoding features (see the sketch below).
- Working experience with cloud computing technologies: AWS EC2, RDS, Data Migration Service (DMS), Schema Conversion Tool (SCT), AWS Glue.
- Well versed in advanced query development and design using SQL and PL/SQL, query optimization, and performance tuning of applications on various databases.
- Supporting multiple DBMS platforms in Production/QA/UAT/DEV in both on-premise and AWS cloud environments.

Strong Pluses:
- Experience with database partitioning strategies on various databases (PostgreSQL, Oracle).
- Experience in migrating, automating, and supporting a variety of AWS-hosted (both RDBMS & NoSQL) databases in RDS and EC2 using CFT.
- Experience with the big data technology stack: Hadoop/Spark/Hive/MapR/Storm/Pig/Oozie/Kafka, etc.
- Experience with shell scripting for process automation.
- Experience with source code versioning with Git and Stash.
- Ability to work across multiple projects simultaneously.
- Strong experience in all aspects of the software lifecycle, including design, testing, and delivery.
- Ability to understand and start projects quickly.
- Ability and willingness to work with teams located in multiple time zones.

Regards,
Sushma A
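As referenced in the must-haves, a minimal sketch of a Redshift table design that uses distribution and sort keys, executed from Python via psycopg2. The connection details and schema are assumptions; in practice they would come from configuration or AWS Secrets Manager.

```python
import os

import psycopg2

# Connection parameters are illustrative placeholders.
conn = psycopg2.connect(
    host=os.environ["REDSHIFT_HOST"],
    port=5439,
    dbname="dw",
    user="etl_user",
    password=os.environ["REDSHIFT_PASSWORD"],
)

ddl = """
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    sale_date   DATE,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)   -- co-locate rows that join on customer_id
SORTKEY (sale_date);    -- speeds range-restricted scans on sale_date
"""

# Connection context manager commits the transaction on success.
with conn, conn.cursor() as cur:
    cur.execute(ddl)
```

Choosing the join column as DISTKEY minimizes data shuffling across nodes, while sorting on the date column lets Redshift skip blocks for date-filtered queries.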
Posted 3 months ago
6 - 11 years
6 - 16 Lacs
Hyderabad
Hybrid
We are hiring an ETL Engineer for InsightSoftware International Pvt. Ltd. (Product Development) in Hyderabad. The role requires:
- ETL tool (any)
- SQL query writing
- Data migration
- Azure Cloud (added advantage)
- Strong communication skills

Hybrid work model: work from office 2 days a week.
Shifts: 2:30 PM – 11:30 PM (3 days), 5:30 PM – 2:30 AM (2 days).

If interested, please share your resume along with:
- Current CTC
- Expected CTC
- Notice period

Looking forward to connecting!
Posted 3 months ago
5 - 10 years
15 - 30 Lacs
Chennai, Pune, Bengaluru
Work from Office
We are excited to announce that we are hiring an Ab Initio Developer with a strong focus on Ab Initio, Oracle, and Unix skills. If you have a passion for data and a proven track record in ETL processes, we want to hear from you!

Position: Ab Initio Developer

Key Responsibilities:
- Utilize extensive hands-on experience with the Ab Initio ETL tool, including ICFF, PDL, Meta Programming, Conduct>IT, and Control Center.
- Understand and translate source-to-target ETL mappings effectively.
- Conduct thorough data investigations and analyses to ensure data integrity.
- Communicate functional and technical details clearly to stakeholders.
- Validate UAT results and identify issues proactively.

Qualifications:
- Experience: 4 to 12 years in Ab Initio ETL processes.
- Skills: Proficiency in SQL writing is essential.
- Industry Knowledge: Experience in the Banking/Finance sector is a significant plus.
- Communication: Excellent articulation skills are required to explain complex concepts clearly.
Posted 3 months ago
5 - 10 years
5 - 15 Lacs
Noida
Work from Office
Key Responsibilities:
- Develop and manage ETL workflows using SQL and SSIS (a minimal load-pattern sketch follows).
- Design data models and optimize data warehouse structures.
- Perform data extraction, transformation, and loading across various systems.
- Ensure data quality, consistency, and performance of ETL processes.
- Collaborate with business teams to understand data needs and requirements.

Required Skills:
- Strong experience with SQL and SSIS.
- Proficiency in data warehousing and data modeling.
- Familiarity with cloud-based data systems (e.g., Azure, AWS, GCP) is a plus.
- Knowledge of data integration and data quality techniques.
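As referenced above, a minimal sketch of the SQL side of such a workflow: a T-SQL MERGE upsert into a dimension table, run from Python via pyodbc. In practice SSIS would typically own this step; the connection string, server, and table names are assumptions.

```python
import pyodbc

# Placeholder connection string for a SQL Server warehouse.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dw-server;DATABASE=warehouse;Trusted_Connection=yes;"
)

# Upsert staged customer rows into the dimension table.
merge_sql = """
MERGE dbo.dim_customer AS tgt
USING dbo.stg_customer AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET tgt.name = src.name, tgt.city = src.city
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, city)
    VALUES (src.customer_id, src.name, src.city);
"""

# The connection context manager commits on success.
with conn:
    conn.execute(merge_sql)
```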
Posted 3 months ago
2 - 7 years
4 - 9 Lacs
Chennai, Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Teradata BI
Good-to-have skills: UNIX Shell Scripting, Informatica PowerCenter
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Key Responsibilities:
A: Data warehouse ETL design and development using Teradata and Informatica.
B: Understand ETL requirements and transform them into ETL design and code.
C: Needs to have skills in Informatica.

Technical Experience:
A: Should have good hands-on experience of at least 5 years with ETL design and development using Teradata.
B: Should have strong working knowledge of Teradata.
C: Good to have Informatica and Unix/Python knowledge.
D: Good to have working knowledge of Agile methodology.

Professional Attributes:
A: Candidate should possess strong analytical skills.
B: Candidate should possess strong communication skills.
C: Candidate should have the ability to work under pressure.

Qualifications: 15 years of full-time education
Posted 3 months ago
2 - 7 years
4 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI
Minimum 2 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve working with Teradata BI, collaborating with cross-functional teams, and ensuring the quality and integrity of data.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to identify and prioritize business requirements and translate them into technical solutions.
- Ensure the quality and integrity of data by implementing data validation and testing procedures.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Good-to-have skills: Experience with Teradata BI.
- Strong understanding of database design and data modeling principles.
- Experience with ETL tools and processes.
- Familiarity with Agile development methodologies.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Pune office.
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Teradata BI
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Graduate

Summary: As an Application Lead for Teradata BI, you will be responsible for leading the effort to design, build, and configure applications. You will act as the primary point of contact and work with a team to ensure successful project delivery. Your typical day will involve collaborating with cross-functional teams, managing project timelines, and ensuring the quality of deliverables.

Roles & Responsibilities:
- Lead the effort to design, build, and configure applications using Teradata BI.
- Act as the primary point of contact for the project, collaborating with cross-functional teams to ensure successful project delivery.
- Manage project timelines and ensure the quality of deliverables.
- Provide technical guidance and mentorship to team members, ensuring their professional growth and development.
- Stay updated with the latest advancements in Teradata BI and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: Strong experience in Teradata BI.
- Good-to-have skills: Experience in related technologies such as ETL, Data Warehousing, and Business Intelligence.
- Solid understanding of database concepts and SQL.
- Experience in leading and managing teams.
- Excellent communication and interpersonal skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Teradata BI.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Mumbai office.
Posted 3 months ago
4 - 9 years
6 - 14 Lacs
Bengaluru
Work from Office
About The Role:
- 6+ years of hands-on project experience as a DataStage Developer.
- Strong knowledge of DataStage job analysis and extracting business logic and rules.
- Strong analytical and problem-solving knowledge of DataStage shell scripts.
- Coordinate with ETL developers to help them understand parallel and sequence job logic.
- Experience in CI/CD and DevOps.
- Experience in the software development life cycle (Agile/Scrum).
- Experience in project life-cycle activities, including release/deployment on development projects.
- Experience and desire to work in a global delivery environment.
- Strong communication and analytical skills.
- Ability to work in a team in a diverse, multiple-stakeholder environment.
- Installation, upgrade, configuration, and troubleshooting/problem resolution of DataStage jobs.
- Provide support to DataStage developers/programmers in implementing systems and resolving issues.
- Sound knowledge of DataStage job analysis, including installation, upgrade, configuration, backup/restore, and recovery processes.
- Demonstrated knowledge of and experience with programming languages, specifically including shell script.
- Solid interpersonal and business relationship skills, including the ability to work independently or on a team.
- Sound verbal & written communication skills.

Primary Skills:
- Good knowledge of monitoring & debugging DataStage jobs.
- Good knowledge of SQL and at least one other database.
- Basic knowledge of Unix shell scripting.
Posted 3 months ago
4 - 6 years
6 - 8 Lacs
Hyderabad
Work from Office
Jr Req - Informatica Developer - 4-6 Yrs - Hyderabad - Ravi Kishore - ravikishore.tellapuram@tcs.com - TCS - C2H - 900000
Posted 3 months ago
6 - 9 years
12 - 16 Lacs
Bengaluru
Work from Office
Skills: Data Warehousing/Blending, Cleaning, ETL, Tableau, PostgreSQL
Designation: Senior Software Engineer
Location: Koramangala, Bangalore

Job Description:
SmartStream is embarking on its most exciting journey in recent times. To continue to service and support the world's top banks and asset managers, we are constantly innovating to cement our industry-leading position for the next decade, working with our other product development centres in Bangalore, Mumbai, and Jacksonville. We are building modern software using constantly evolving agile and extreme practices. Independent, autonomous teams build, maintain, and support services, which allows them to develop expertise in technology and business domains. The successful candidate will have the opportunity to influence others and to personally develop in a friendly and welcoming environment. We welcome engineering professionals from a variety of backgrounds to add to our diverse teams in our Bangalore development centre. Java, Hibernate, and Spring are our primary backend technologies, with SmartClient and JavaScript in our frontend.

Job Responsibilities:
- Involved in the complete product lifecycle, from initial requirements definition, design, development, and solution configuration through to deployment.
- Agile approach using Behaviour Driven Development and Continuous Deployment technologies.
- Support ongoing maintenance and fixes of SmartStream's solutions and in-house toolkits.
- Follow SmartStream's development process and quality standards.
- Responsibility for developing scalable and robust solutions which meet the high performance and availability standards of global financial institutions.
- 6+ years' experience developing and implementing enterprise-scale reports and dashboards, including:
  - Working with users in a requirements-analysis role.
  - Extensive experience with data warehouse implementations.
  - Knowledge of logical and physical data modelling concepts (relational and dimensional).
  - Proficiency with the Tableau/Qlik Sense/Power BI product suite.
  - Performance tuning experience related to reporting queries.
  - Understanding of data integration issues (validation and cleaning), and familiarity with complex data and structures.
- Excellent interpersonal (verbal and written) communication skills are required to support working in project environments that include internal, external, and customer teams.
- Requires strong analytical, conceptual, and problem-solving abilities.
- Programming/scripting experience and knowledge of the software development life cycle is preferred.
- Ability to manage multiple priorities, and to assess and adjust quickly to changing priorities.
- Participate in business analysis activities to gather required reporting and dashboard requirements.
- Translate business requirements into specifications that will be used to implement the required reports and dashboards, created from potentially multiple data sources.
- Participate with other specialists to convert legacy reports (primarily Business Objects) to Cognos BI solutions.
- Transition developed reports and dashboards to the Operations & Support team.
- Provide support as required to ensure the availability and performance of developed reports and dashboards for both external and internal users.
- Ensure proper configuration management and change controls are implemented for your sphere of influence.
- Provide technical assistance and cross-training to other team members.
- Provide training and assistance to users for the generation of ad-hoc reports.
- Design and implement technology best practices, guidelines, and repeatable processes.
- Must be able to perform duties with moderate to low supervision.
- Knowledge of big databases and ETL skills is an added advantage.
- Requires leadership qualities to mentor junior members of the team.

Key Skills:
- Tableau Desktop, Tableau Server, SAP Business Objects.
- Practical experience of SQL in relational databases like Oracle/SQL Server, and application server middleware.
- Good communication skills.
- Good analytical skills.

Desirable Skills:
- Effective prioritization.
- Ability to build good relationships with stakeholders.
- Focus on delivery and customer service.
- Team worker.
- Calm, methodical approach to problem solving.

Educational Qualification:
Engineering graduate with a Computer Science / Information Technology background or similar.

Experience:
- 6+ years of software engineering experience.
- Experience developing in a software vendor environment is desirable but not required.
- Financial software experience would be a bonus, but is not expected.
- Needs a can-do attitude to problem management.
- Comfortable in a fast-moving environment with multiple customers and tasks.
- Works well in a busy environment and takes full ownership; should be a go-getter.

Equality Statement:
SmartStream is an equal opportunities employer. We are committed to promoting equality of opportunity and following practices which are free from unfair and unlawful discrimination.
Posted 3 months ago
4 - 9 years
15 - 30 Lacs
Pune, Bengaluru, Gurgaon
Work from Office
Experience:
- 3+ years of experience in building and maintaining data pipelines for diverse data sources and formats using tools such as Informatica PowerCenter and IICS.
- Conducted data quality checks and implemented data validation processes to ensure accuracy and consistency across cloud and on-premises environments.
- Collaborated with data scientists and analysts to provide reliable and timely data for analysis and reporting.
- Participated in setting up data engineering practices and standards for various projects and clients, including cloud migration strategies.
- Designed and optimized complex SQL queries for efficient data retrieval, transformation, and integration with cloud data warehouses like AWS Redshift and Snowflake.
- Implemented data monitoring and alerting systems to proactively address data pipeline issues and ensure continuous data availability.
- Assisted in building cloud-native data solutions on AWS, with a focus on services like S3, Lambda, Redshift, and Athena, ensuring scalability, performance, and cost-effectiveness.
Posted 3 months ago