5.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Summary
We are looking for a Data Engineer to design, build, and manage a scalable Data Lake that integrates data from MySQL and multiple other structured/unstructured data sources. The ideal candidate will work on ETL pipelines, data modeling, and real-time streaming solutions to create a centralized, high-performance data infrastructure for analytics, reporting, and AI-driven insights.

Job Responsibilities

Data Lake Architecture & Implementation
- Design and develop a scalable Data Lake solution integrating MySQL, APIs, flat files, and NoSQL databases.
- Implement data ingestion pipelines using ETL/ELT processes for batch and real-time streaming data.
- Optimize data storage, partitioning, and indexing for high-performance query execution.

ETL & Data Pipeline Development
- Develop and manage ETL workflows to extract, transform, and load data into the Data Lake.
- Automate data cleansing, normalization, deduplication, and transformation processes.
- Ensure efficient data orchestration using Apache Airflow, Prefect, or similar tools (see the orchestration sketch below).

Data Integration & Sources
- Connect and integrate MySQL, PostgreSQL, MongoDB, APIs, third-party platforms, and cloud storage (S3, GCS, Azure Blob) into the Data Lake.
- Implement real-time data streaming using Kafka, Kinesis, or Pub/Sub for event-driven architectures.

Big Data & Cloud Technologies
- Utilize Hadoop, Spark, or Snowflake for distributed data processing.
- Deploy and manage the Data Lake on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataproc).
- Optimize cost and performance of cloud-based data storage and processing.

Data Governance, Security & Compliance
- Implement data governance, lineage, access control, and encryption policies.
- Ensure compliance with GDPR, CCPA, and other data privacy regulations.
- Develop monitoring and alerting mechanisms for data quality, integrity, and security.

Collaboration & Business Insights
- Work closely with Data Scientists, Analysts, and BI teams to provide clean, enriched, and structured data.
- Support machine learning and AI-driven insights by designing optimized data pipelines.
- Define data cataloging, metadata management, and documentation for self-service analytics.

Job Requirements

Educational Qualification and Experience
- 5-8 years of experience in Data Engineering, ETL, and Big Data technologies.
- Strong expertise in SQL, MySQL, PostgreSQL, and NoSQL (MongoDB, Cassandra, DynamoDB, etc.).

Technical Skills
- Hands-on experience with ETL tools (Apache Airflow, Talend, dbt, Glue, etc.).
- Experience with Big Data frameworks (Spark, Hadoop, Snowflake, Redshift, BigQuery).
- Proficiency in Python, Scala, or Java for data engineering workflows.
- Knowledge of Cloud Data Lakes (AWS S3/Glue, GCP BigQuery, Azure ADLS/Synapse).
- Strong experience with data modeling, schema design, and query optimization.
- Experience with Kafka, Kinesis, or Pub/Sub for real-time data processing.
- Exposure to Docker, Kubernetes, and CI/CD for data pipeline automation.
- Knowledge of Delta Lake, Iceberg, or Hudi for transactional data lakes.
- Understanding of ML feature stores and AI-driven data pipelines.

Behavioural Skills
- Strategic thinking
- Planning and organizing
- Interpersonal skills
- Stakeholder management
- People leadership
- Innovation and creativity
- Attention to detail
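As a rough illustration of the orchestration work described above, here is a minimal Apache Airflow DAG that moves one day's batch from MySQL into a lake's raw zone. The connection string, table, and bucket path are hypothetical placeholders, not details from the posting; a sketch under those assumptions, not a definitive implementation.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

MYSQL_URI = "mysql+pymysql://etl:***@mysql-host/sales"   # placeholder connection
LAKE_PATH = "s3://example-lake/raw/orders/{ds}.parquet"  # placeholder raw-zone path


def extract_and_load(ds, **_):
    # Pull one day's rows from MySQL and land them as Parquet in the lake.
    df = pd.read_sql(f"SELECT * FROM orders WHERE order_date = '{ds}'", MYSQL_URI)
    df.to_parquet(LAKE_PATH.format(ds=ds), index=False)


with DAG(
    dag_id="mysql_to_data_lake",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```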
Posted 3 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- The position is based in India and will require the candidate to report directly to the team lead.
- Develop projects from detailed business requirements, work through solutions, and manage execution and rollout of these solutions in the context of an overall consistent global platform.
- Create T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017 (a stored-procedure call is sketched after this description).
- Understand basic data warehousing concepts; design and develop SSIS packages to pull data from various source systems and load to target tables.
- May be required to develop dashboards and reports using SSRS.
- Work on BAU JIRAs and perform some L3 support-related activities whenever required.
- Provide detailed analysis and documentation of processes and flows where necessary.
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems.
- Analyze applications to identify vulnerabilities and security issues, as well as conduct testing and debugging.
- Operate with a limited level of direct supervision, exercising independence of judgement and autonomy.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
- 4+ years of overall IT experience with 2+ years in the Financial Services industry.
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, Autosys.
- 2+ years of experience in any ETL tool, preferably SSIS.
- Some knowledge of Python can be a differentiator.
- Highly motivated; should need minimal hand-holding, with the ability to multitask and work under pressure.
- Strong analytical and problem-solving skills; ability to analyze data for trends and quality checking.
- Good to have: Talend and GitHub knowledge.
- Strong knowledge of database fundamentals and advanced concepts; ability to write efficient SQL and tune existing SQL.
- Experience with a reporting tool (e.g. SSRS, QlikView) is a plus.
- Experience with a job scheduling tool (e.g. Autosys).
- Experience in the finance industry is desired.
- Experience with all phases of the Software Development Life Cycle.
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Education:
- Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
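As a hedged illustration of the T-SQL work above, the following Python snippet calls a SQL Server stored procedure through pyodbc. The server, database, and procedure name (dbo.usp_LoadDailyPositions) are invented for the example.

```python
import pyodbc

# Placeholder connection details; a real deployment would use its own DSN.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlprod01;DATABASE=Finance;Trusted_Connection=yes;"
)

with conn.cursor() as cur:
    # Hypothetical nightly-load procedure taking a business-date parameter.
    cur.execute("EXEC dbo.usp_LoadDailyPositions @BusinessDate = ?", "2024-06-01")
    conn.commit()
```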
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 3 weeks ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Extensive experience in IT data analytics projects; hands-on experience migrating on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Cloud Dataproc, Google Cloud Storage, and Composer. Strong grounding in SQL concepts, Presto SQL, Hive SQL, Python (Pandas, NumPy, SciPy, Matplotlib) and PySpark.

Responsibilities:
- Design and implement scalable data pipelines using Google Cloud Dataflow
- Develop and optimize BigQuery datasets for efficient data analysis and reporting (see the BigQuery sketch below)
- Collaborate with cross-functional teams to integrate data solutions with business processes
- Automate data workflows using Cloud Composer and Apache Airflow
- Ensure data security and compliance with GCP Identity and Access Management
- Mentor junior engineers in best practices for cloud-based data engineering
- Implement real-time data processing with Google Cloud Pub/Sub and Dataflow
- Continuously evaluate and adopt new GCP tools and technologies
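A minimal sketch of the BigQuery work referenced above, using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics")  # hypothetical project

# Aggregate a raw events table into a small reporting result set.
sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `example-analytics.raw.events`
    GROUP BY user_id
"""
job = client.query(sql)          # starts the query job
for row in job.result():         # waits for completion and iterates rows
    print(row.user_id, row.events)
```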
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Mumbai
Work from Office
#JobOpening Data Engineer (Contract | 6 Months)
Location: Hyderabad | Chennai | Remote Flexibility Possible
Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory (a pipeline-trigger sketch follows this posting)
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives including Data Vault and Data Lakes

#RequiredSkills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's in CS, Engineering, Math, or related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities

#NiceToHave
- Experience with Data Lakes and Data Vaults
- QA & UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure, AWS)

#ContractDetails
Role: Data Engineer
Contract Duration: 6 Months
Location Options: Hyderabad / Chennai (Remote flexibility available)
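For the ADF pipeline work above, here is a hedged sketch that triggers and polls a pipeline run with the azure-mgmt-datafactory SDK; the subscription ID, resource group, factory, and pipeline names (pl_daily_load) are all placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"                # placeholder
RG, FACTORY, PIPELINE = "rg-data", "adf-analytics", "pl_daily_load"  # placeholders

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION)

# Kick off one run of the daily load pipeline.
run = adf.pipelines.create_run(RG, FACTORY, PIPELINE)

# Poll the run's status (Queued / InProgress / Succeeded / Failed).
status = adf.pipeline_runs.get(RG, FACTORY, run.run_id).status
print(f"Run {run.run_id}: {status}")
```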
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters:
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve the issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Talend Big Data
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention.
Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 3 weeks ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Wissen Technology is Hiring for ETL Testing / QA Engineer

About Wissen Technology:
Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges.

Role Overview:
We are seeking an experienced ETL Testing / QA Engineer with a strong background in data warehousing, ETL processes, and banking domain expertise. The candidate will be responsible for validating data integration and transformation processes, ensuring data quality, and collaborating with cross-regional teams to resolve data and code issues efficiently.

Experience: 7+ Years
Location: Pune

Key Responsibilities:
- Perform ETL testing and validation of data workflows within data warehousing environments.
- Develop and execute detailed test plans and test cases to validate ETL processes, transformations, and data quality.
- Conduct SQL querying and debugging to validate data correctness and integrity (see the reconciliation sketch at the end of this posting).
- Collaborate with development, data engineering, and business teams to identify and troubleshoot data issues.
- Validate data-readiness and quality by identifying, analyzing, and addressing data inconsistencies or defects.
- Work closely with stakeholders across APAC, EMEA, and NA regions to identify root causes of data/code issues and implement permanent solutions.
- Use defect tracking tools like JIRA and Xray for managing test cases and defects.
- Prepare test documentation and reports to communicate testing progress and results effectively.

Required Skills:
- 7+ years of hands-on experience in ETL Testing, with strong expertise in SQL and debugging.
- Solid understanding of Data Warehousing concepts and ETL tools such as Talend Cloud Data Integration, Pentaho/Kettle, or similar.
- Experience with the Snowflake data platform and concepts around data lakes and cloud data warehousing.
- Working knowledge of Azure cloud and testing in Azure environments is a plus.
- Familiarity with Qlik Replicate and Compose tools (Change Data Capture) is considered an advantage.
- Strong exposure to the banking/financial services domain (mandatory).
- Experience in defect and test management tools like JIRA and Xray.
- Excellent analytical, problem-solving, and debugging skills.
- Exposure to third-party financial data providers such as Bloomberg, Reuters, MSCI, or Rating Agencies is a plus.
- Prior experience with State Street or Charles River Development (CRD) systems is advantageous.
- Good working knowledge of productivity tools such as PowerPoint, Excel, and SQL scripting.

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products. We offer an array of services including Core Business Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud Adoption, Mobility, Digital Adoption, Agile & DevOps, Quality Assurance & Test Automation. Over the years, Wissen Group has successfully delivered $1 billion worth of projects for more than 20 of the Fortune 500 companies.
Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them with the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider. Great Place to Work® Certification is recognized the world over by employees and employers alike and is considered the 'Gold Standard'. Wissen Technology has created a Great Place to Work by excelling in all dimensions: High-Trust, High-Performance Culture, Credibility, Respect, Fairness, Pride and Camaraderie.

Website: www.wissen.com
LinkedIn: https://www.linkedin.com/company/wissen-technology
Wissen Leadership: https://www.wissen.com/company/leadership-team/
Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
Wissen Thought Leadership: https://www.wissen.com/articles/
Employee Speak: https://www.ambitionbox.com/overview/wissen-technology-overview
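As referenced in the responsibilities above, here is a minimal sketch of the source-to-target reconciliation this kind of ETL testing involves. It uses two in-memory SQLite databases as stand-ins for the real source system and Snowflake target, which is an assumption for illustration only.

```python
import sqlite3


def reconcile_counts(src, tgt, src_sql, tgt_sql):
    """Fail loudly when a load drops or duplicates rows."""
    src_count = src.execute(src_sql).fetchone()[0]
    tgt_count = tgt.execute(tgt_sql).fetchone()[0]
    assert src_count == tgt_count, f"Mismatch: source={src_count}, target={tgt_count}"
    return src_count


# Stand-in databases; a real test would hold connections to source and warehouse.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE trades (id INTEGER)")
tgt.execute("CREATE TABLE trades (id INTEGER)")
src.executemany("INSERT INTO trades VALUES (?)", [(1,), (2,), (3,)])
tgt.executemany("INSERT INTO trades VALUES (?)", [(1,), (2,), (3,)])

print(reconcile_counts(src, tgt,
                       "SELECT COUNT(*) FROM trades",
                       "SELECT COUNT(*) FROM trades"))  # prints 3
```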
Posted 3 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas!

Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.

This position is based in Bengaluru and will require some on-site work.

Purpose And Scope
As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You'll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend or Databricks) will be essential for testing data pipelines.

Essential Job Responsibilities

Development Ownership:
- Support testing for Data Warehouse and MI projects.
- Collaborate with senior team members.
- Administer multi-server environments.

Test Strategy And Planning:
- Understand project requirements and data pipelines.
- Create comprehensive test strategies and plans.
- Participate in data validation and user acceptance testing (UAT).

Data Validation And Quality Assurance:
- Execute manual and automated tests on data pipelines, ETL processes, and models (a pytest-style sketch appears at the end of this posting).
- Verify data accuracy, completeness, and consistency.
- Ensure compliance with industry standards.

Regression Testing:
- Validate changes to data pipelines and analytics tools.
- Monitor performance metrics.

Test Case Design And Execution:
- Create detailed test cases based on requirements.
- Collaborate with development teams to resolve issues.
- Maintain documentation.

Data Security And Privacy:
- Validate access controls and encryption mechanisms.
- Ensure compliance with privacy regulations.

Collaboration And Communication:
- Work with cross-functional teams.
- Communicate test progress and results.

Continuous Improvement And Technical Support:
- Optimize data platform architecture.
- Provide technical support to internal users.
- Stay updated on trends in full-stack development and cloud platforms.

Qualifications Required
- Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
- 3-5+ years of proven experience as a Tester, Developer or Data Analyst within a pharmaceutical or similarly regulated environment.
- 3-5+ years of experience in BI development and ETL development using QLIK and Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts) or equivalent technologies.
- Experience with QLIK Sense and QLIKView, the Tableau application, and creating data models.
- Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts).
- Knowledge of SQL, ETL frameworks and data integration techniques.
- Other complex and highly regulated industry experience will be considered, across diverse areas like Commercial, Manufacturing and Medical.
Data Analysis and Automation Skills:
- Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
- Exposure to at least 1-2 full large complex project life cycles.
- Experience with test management software (e.g., qTest, Zephyr, ALM).

Technical Proficiency:
- Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
- Manual testing (test case design, execution, defect reporting).
- Awareness of automated testing tools (e.g., Selenium, JUnit).
- Experience with data warehouses and understanding of BI/DWH systems.

Agile Champion:
- Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.

Preferred:
- Experience working in the pharma industry; understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
- Certifications in BI tools or testing methodologies.
- Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
- Cross-Cultural Experience: work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
- Innovation and Creativity: ability to think innovatively and propose creative solutions to complex technical challenges.
- Global Perspective: demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.

Category: FoundationX

Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
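As promised above, a small pytest-style sketch of an automated data-quality check; the inline DataFrame stands in for a report extract, and the column names are invented for illustration.

```python
import pandas as pd


def load_report_extract() -> pd.DataFrame:
    # Stand-in for pulling the tested dataset from the warehouse or a BI source.
    return pd.DataFrame({
        "patient_id": [101, 102, 103],
        "visit_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-07"]),
    })


def test_patient_ids_are_unique_and_complete():
    df = load_report_extract()
    assert df["patient_id"].notna().all(), "NULL patient_id found"
    assert df["patient_id"].is_unique, "duplicate patient_id found"


def test_visit_dates_not_in_future():
    df = load_report_extract()
    assert (df["visit_date"] <= pd.Timestamp.now()).all()
```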
Posted 3 weeks ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position: Sr. Cloud Data Engineer (AWS - Big Data)
Location: Nagpur/Pune/Chennai/Bangalore

Purpose of the Position:
This position requires candidates who are enthusiastic about specialized skills in AWS services and Big Data. As a member of the team, you will help our clients progress on their AWS cloud journey by building the models that support it.

Key Result Areas and Activities:
- Share and Build Expertise: Develop and share expertise in the cloud solutioning domain, and actively mine the experience and expertise in the organization for sharing across teams and clients in the firm. Support the cloud COE initiatives.
- Nurture and Grow Talent: Provide support for recruitment, coaching and mentoring, and building practice capacity in the firm in Cloud.
- AWS Integration: Integrate various AWS services to create seamless workflows and data processing solutions.
- Data Pipeline Development: Design, build, and maintain scalable data pipelines using AWS services to support data processing and analytics.

Essential Skills:
- Knowledge of the following AWS services: S3, EC2, EMR, Serverless, Athena, AWS Glue, Lambda, Step Functions
- Cloud databases: AWS Aurora, SingleStore, Redshift, Snowflake
- Big Data: Hadoop, Hive, Spark, YARN
- Programming languages: Scala, Python, shell scripts, PySpark
- Operating systems: any flavor of Linux, Windows
- Strong SQL skills
- Orchestration tools: Apache Airflow
- Hands-on experience developing ETL workflows comprising complex transformations like SCD, deduplication, aggregations, etc. (a deduplication sketch follows this posting)

Desirable Skills:
- Experience and technical knowledge in Databricks
- Strong experience with event stream processing technologies such as Kafka, KDS
- Knowledge of ETL tools such as Informatica, Talend
- Experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR) will be a plus

Qualifications:
- Overall 7-9 years of IT experience, with 3+ years on AWS-related projects
- Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
- Demonstrated continued learning through one or more technical certifications or related methods

Qualities:
- Holds strong technical knowledge and experience
- Capable of deep diving and researching various technical fields
- Self-motivated and focused on delivering outcomes for a fast-growing team and firm
- Able to communicate persuasively through speaking, writing, and client presentations
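A hedged PySpark sketch of the deduplication pattern mentioned above: keep only the latest record per business key. The S3 paths and column names (customer_id, updated_at) are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dedup_customers").getOrCreate()

# Placeholder lake paths; a Glue/EMR job would read its real raw zone here.
raw = spark.read.parquet("s3://example-lake/raw/customers/")

# Rank rows per customer by recency, then keep only the newest version.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    raw.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

latest.write.mode("overwrite").parquet("s3://example-lake/curated/customers/")
```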
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
- Very strong in the Scala programming language
- Strong in Big Data technologies including Spark, Scala and Kafka
- Good understanding of organizational strategy, architecture patterns (Microservices, Event Driven) and technology choices, coaching the team in execution in alignment with these guidelines
- Mandatory certification (any one): Cloudera CCA Spark and Hadoop Developer (CCA175), Databricks Certified Developer Apache Spark 2.x, Hortonworks HDP Certified Apache Spark Developer

Key Roles and Responsibilities
Skill: Big Data, Spark SQL, Scala. Mandatory certification: any one of those listed above.

Job Description
- Experience in the Scala programming language
- Experience in Big Data technologies including Spark, Scala and Kafka (a streaming sketch follows this posting)
- You have a good understanding of organizational strategy, architecture patterns (Microservices, Event Driven) and technology choices, and can coach the team in execution in alignment with these guidelines
- You can apply organizational technology patterns effectively in projects and make recommendations on alternate options
- You have hands-on experience working with large volumes of data, including different patterns of data ingestion, processing (batch and real-time), movement, storage and access, both internal and external to the BU, with the ability to make independent decisions within the scope of a project
- You have a good understanding of data structures and algorithms
- You can test, debug and fix issues within established SLAs
- You can design software that is easily testable and observable
- You understand how a team's goals fit a business need
- You can identify business problems at the project level and provide solutions
- You understand data access patterns, streaming technology, data validation, data performance and cost optimization
- Strong SQL skills; ETL experience (Talend preferred, any other ETL tool considered)
- Experience with Linux OS at the user level
- Python or R programming skills are good to have but not mandatory

Skills:
- Primary: Big Data Spark/Scala developer, Kafka real-time streaming
- Secondary: Hive, Impala, NoSQL, Starburst
- Optional: Java, Spring
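The posting is Scala-first, but the Structured Streaming read below looks nearly identical in Scala; this is a hedged PySpark sketch with placeholder broker, topic, and message fields, and it assumes the spark-sql-kafka package is on the classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

schema = StructType([                      # assumed message layout
    StructField("order_id", StringType()),
    StructField("status", StringType()),
])

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "orders")                     # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("msg"))
         .select("msg.*")
)

query = stream.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```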
Posted 3 weeks ago
2.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.

Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum 2 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Quality Engineer (Tester), you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will perform continuous testing for security, API, and regression suites; create automation strategy and automated scripts; and support data and environment configuration. You will participate in code reviews, and monitor and report defects to support continuous improvement activities for the end-to-end testing process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and execute test cases, scripts, plans, and procedures.
- Collaborate with cross-functional teams to ensure quality throughout the software development lifecycle.
- Identify and report bugs and errors to ensure quality deliverables.
- Implement automation testing strategies to improve efficiency.
- Stay updated on industry best practices and technologies to enhance testing processes.

Professional & Technical Skills:
- Must have: proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and methodologies.
- Hands-on experience in testing data integration and transformation processes.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Data Warehouse ETL Testing.
- This position is based at our Indore office.
- 15 years of full-time education is required.
Posted 3 weeks ago
2.0 - 4.0 years
0 Lacs
Mylapore, Tamil Nadu, India
On-site
Position Summary
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field.
Work Experience: 2-4 years

Job Description
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment and production lines for the world's largest industrial sectors including aerospace, automotive, steel, aluminium, glass, cement, logistics and energy. Headquartered in Paris, Fives is located in about 25 countries with more than 9000 employees. Fives is seeking a Data Analyst for their office located in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for the Group IT requirements. We are looking for an analyst specialized in BI development.

Required Skills
Applicants should have skills/experience in the following areas:
- 2-4 years of experience in Power BI development
- Good understanding of data visualization concepts
- Proficiency in writing DAX expressions and Power Query
- Knowledge of SQL and database-related technologies
- Source control such as Git
- Proficient in building REST APIs to interact with data sources (a minimal sketch follows this posting)
- Familiarity with ETL/ELT concepts; knowledge of tools such as Talend is a plus
- Good knowledge of programming, algorithms and data structures
- Ability to use Agile collaboration tools such as Jira
- Good communication skills, both verbal and written
- Willingness to learn new technologies and tools

Position Type: Full-Time/Regular
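A hedged sketch of the "REST API as a BI data source" idea above, using FastAPI; the endpoint path and the inline sales numbers are invented for the example.

```python
from fastapi import FastAPI

app = FastAPI(title="BI data source (example)")

# In a real service this would query SQL Server/Postgres; here the data is inline.
SALES = [
    {"region": "North", "revenue": 120_000},
    {"region": "South", "revenue": 95_000},
]


@app.get("/metrics/sales")
def sales_by_region() -> list[dict]:
    """Endpoint a Power BI dataset could poll via the Web connector."""
    return SALES

# Run with: uvicorn app:app --reload  (assuming this file is named app.py)
```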
Posted 3 weeks ago
3.0 - 8.0 years
16 - 18 Lacs
Hyderabad
Work from Office
We are hiring a Data Management Specialist Level 2 for a US-based IT company, located in Hyderabad.

Job Title: Data Management Specialist Level 2
Location: Hyderabad
Experience: 3+ Years
CTC: 16 LPA - 18 LPA
Working shift: Day shift

We are seeking a Level 2 Data Management Specialist to join our data team and support the development, maintenance, and optimization of data pipelines and cloud-based data platforms. The ideal candidate will have hands-on experience with Snowflake, along with a solid foundation in SQL, data integration, and cloud data technologies.

As a mid-level contributor, this position will collaborate closely with senior data engineers and business analysts to deliver reliable, high-quality data solutions for reporting, analytics, and operational needs. You will help develop scalable data workflows, resolve data quality issues, and ensure compliance with data governance practices.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Snowflake and SQL-based transformation logic (a Snowflake connector sketch follows this posting)
- Assist in developing and optimizing data models to support reporting and business intelligence efforts
- Write efficient SQL queries for data extraction, transformation, and analysis
- Collaborate with cross-functional teams to gather data requirements and implement dependable data solutions
- Support data quality checks and validation procedures to ensure data integrity and consistency
- Contribute to data integration tasks across various sources, including relational databases and cloud storage
- Document technical workflows, data definitions, and transformation logic for reference and compliance
- Monitor the performance of data processes and help troubleshoot workflow issues

Required Skills & Qualifications:
- 2-4 years of experience in data engineering or data management roles
- Proficiency in Snowflake for data development or analytics
- Strong SQL skills and a solid grasp of relational database concepts
- Familiarity with ETL/ELT tools such as Informatica, Talend, or dbt
- Basic understanding of cloud platforms like AWS, Azure, or GCP
- Knowledge of data modeling techniques (e.g., star and snowflake schemas)
- Excellent attention to detail, strong analytical thinking, and problem-solving skills
- Effective team player with the ability to clearly communicate technical concepts

Preferred Skills:
- Exposure to data governance or data quality frameworks
- Experience working in the banking or financial services industry
- Basic scripting skills in Python or Shell
- Familiarity with Agile/Scrum methodologies
- Experience using Git or other version control tools

For further assistance contact/WhatsApp: 9354909521 / 9354909512, or write to pankhuri@gist.org.in
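A minimal sketch of SQL-based transformation logic run through the Snowflake Python connector; the credentials, warehouse, and the MERGE target/staging tables are placeholders, not details from the posting.

```python
import snowflake.connector

# Placeholder credentials; real jobs would read these from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="***",
    warehouse="WH_ETL", database="ANALYTICS", schema="CORE",
)

try:
    with conn.cursor() as cur:
        # Upsert staged customers into the dimension table.
        cur.execute("""
            MERGE INTO dim_customer d
            USING stg_customer s ON d.customer_id = s.customer_id
            WHEN MATCHED THEN UPDATE SET d.email = s.email
            WHEN NOT MATCHED THEN INSERT (customer_id, email)
                 VALUES (s.customer_id, s.email)
        """)
finally:
    conn.close()
```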
Posted 3 weeks ago
2.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

As a Guidewire developer at PwC, you will specialise in developing and customising applications using the Guidewire platform. Guidewire is a software suite that provides insurance companies with tools for policy administration, claims management, and billing. You will be responsible for designing, coding, and testing software solutions that meet the specific needs of insurance organisations.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

A career in our Managed Services team will provide you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Application Evolution Services (formerly Application Managed Services) team will provide you with the opportunity to help organizations harness the power of their enterprise applications by optimizing the technology while driving transformation and innovation to increase business performance. We assist our clients in capitalizing on technology improvements, implementing new capabilities, and achieving operational efficiencies by managing and maintaining their application ecosystems. We help our clients maximize the value of their investment by managing the support and continuous transformation of their solutions in the areas of sales, finance, supply chain, engineering, manufacturing and human capital.

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework.
It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

Responsibilities
As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Encourage everyone to have a voice and invite opinion from all, including quieter members of the team.
- Deal effectively with ambiguous and unstructured problems and situations.
- Initiate open and candid coaching conversations at all levels.
- Move easily between big-picture thinking and managing relevant detail.
- Anticipate stakeholder needs, and develop and discuss potential solutions, even before the stakeholder realizes they are required.
- Contribute functional knowledge in your area of expertise.
- Contribute to an environment where people and technology thrive together to accomplish more than they could apart.
- Navigate the complexities of cross-border and/or diverse teams and engagements.
- Initiate and lead open conversations with teams, clients and stakeholders to build trust.
- Uphold the firm's code of ethics and business conduct.

Within our global Managed Services platform, we provide Application Evolution Services (formerly Application Managed Services), where we focus on the evolution of our clients' applications and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions.

Basic Qualifications
Minimum Degree Required: Bachelor's degree or equivalent

Preferred Qualifications
Preferred Knowledge/Skills:
Demonstrates expert abilities and extensive experience on application managed service projects and solutioning for Datahub integration with the Guidewire suite of applications, on premises and SaaS, with proven success executing and leading all aspects of complex engagements within the Datahub application, achieving on-time and on-budget delivery, as well as the following:
- 2+ years of experience as a data analyst for Datahub and its integration and reporting tools
- Strong understanding of data warehouse concepts and data mapping with integrations
- Good knowledge of SQL queries, analytical services, and reporting services
- Experience with one or more SDLC methodologies
- Expertise related to metadata management, data modeling, data model rationalization, and database products
- Understands the context of the project within the larger portfolio
- Demonstrated strong attention to detail
- Possesses strong analytical skills
- Demonstrated strong sense of ownership and commitment to program goals
- Strong verbal and written communication skills
- Identifies and captures operational database requirements and proposed enhancements in support of requested application development or business functionality
- Develops and translates business requirements into detailed data designs
- Maps between systems
- Assists development teams and QA teams as an application data analyst and supports production implementations
- Identifies entities, attributes, and referential relationships for data models using a robust enterprise information engineering approach
- Participates in data analysis, archiving, database design, and development activities for migration of existing data as needed
- Develops ETL interfaces from source databases and systems to the data warehouse
- Works closely with application development teams to ensure quality interaction with the database

Job Functions
- Be responsible for providing functional guidance or solutions.
- Develop and guide team members in enhancing their functional understanding and increasing productivity.
- Ensure process compliance in the assigned module, and participate in functional discussions or reviews.
- Prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations.

Technologies
- Guidewire Datahub, integration with the Guidewire suite of applications, and conversion ETL tools
- SQL competence and a grasp of database structure are required
- Understanding of data modeling concepts
- Knowledge of at least one ETL tool (Informatica, Snowflake, SSIS, Talend, etc.)

At PwC, our work model includes three ways of working: virtual, in-person, and flex (a hybrid of in-person and virtual). Visit the following link to learn more: https://pwc.to/ways-we-work.

PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: https://pwc.to/H-1B-Lottery-Policy.

All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law. PwC is proud to be an affirmative action and equal opportunity employer.

For positions based in San Francisco, consideration of qualified candidates with arrest and conviction records will be in a manner consistent with the San Francisco Fair Chance Ordinance. For positions in Colorado, visit the following link for information related to Colorado's Equal Pay for Equal Work Act: https://pwc.to/coloradoadvisoryseniormanager.
Posted 3 weeks ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
The ideal candidate should be a highly skilled Production Support Engineer with at least 3 years of relevant experience and a strong focus on ETL and Data Warehouse systems, along with a good understanding of DevOps practices and ITIL concepts.

You will:
- Monitor the daily data pipeline runs and ensure timely data loads by proactively identifying and troubleshooting issues (a CloudWatch metric sketch follows this posting).
- Perform RCA to identify the underlying causes of issues in the data warehouse and ETL pipelines; document findings and implement corrective actions to prevent recurrence.
- Collaborate with various teams, including data engineers, DevOps engineers, architects, and business analysts, to resolve issues and implement improvements.
- Communicate effectively with stakeholders to provide updates on issue resolution and system performance.
- Maintain detailed documentation of data warehouse configurations, ETL processes, operational procedures, and issue resolutions.
- Participate in an on-call rotation, operating effectively in a global 24x7 environment.
- Ensure data integrity and accuracy, and take action to resolve data discrepancies.
- Generate regular reports on system performance, issues, and resolutions.

Your Skills:
- Strong experience with Oracle databases and AWS cloud services.
- Proficiency in SQL and PL/SQL.
- Familiarity with monitoring tools such as Dynatrace, CloudWatch, etc.
- Familiarity with other AWS services and tasks such as account creation, VPC, CloudFront, IAM, ALB, EC2, RDS, Route 53, Auto Scaling, Lambda, etc.
- Experience with ETL tools and processes (e.g., Informatica, Talend, AWS Glue).
- Familiarity with scripting languages (e.g., Python, shell scripting).
- Familiarity with DevOps tools and practices (e.g., GitHub, Jenkins, Docker, Kubernetes).
- Strong analytical and problem-solving abilities.
- Experience in performing root cause analysis and implementing corrective actions.
- Ability to work independently as well as in a collaborative team environment.
- Excellent written and verbal communication skills.
- Bachelor's degree in computer science, information technology, or a related field.
- Minimum of 3 years of experience in a support engineer role, preferably in data warehousing and ETL environments.
- Certification in AWS, Oracle, or relevant DevOps tools is a plus.

Your benefits:
We offer a hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working, including up to 25 days per year working from abroad. We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program and multiple employee discounts (details vary by location). From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery and empowerment are fostered. Flexible working, health and wellbeing offers (including healthcare and parental leave benefits) support to balance family and career and help our people return from career breaks with experience that nothing else can teach.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry.
We oversee the full digitalization spectrum, from one of the industry's largest IT infrastructure projects that includes data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, or nationality.

Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams and challenges, is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow and to shape a better future for our customers and the world around us. We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love or what you believe in. We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.

Join us. Let's care for tomorrow.
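As referenced in the monitoring duties above, here is a hedged sketch of publishing a custom pipeline health metric to CloudWatch with boto3; the namespace, metric, and pipeline names are invented for the example.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

rows_loaded = 182_340  # stand-in for the count the nightly load actually produced

# Publish a custom metric; an alarm on it can page on-call when a load runs short.
cloudwatch.put_metric_data(
    Namespace="DataWarehouse/Pipelines",  # hypothetical namespace
    MetricData=[{
        "MetricName": "RowsLoaded",
        "Dimensions": [{"Name": "Pipeline", "Value": "daily_dw_load"}],
        "Value": float(rows_loaded),
        "Unit": "Count",
    }],
)
```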
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Foreword:
At iGreenTree, we're passionate about empowering energy and utility providers with innovative IT solutions. With deep domain knowledge and a dedication to innovation, we help our clients stay ahead of the curve in a rapidly changing industry. Whether you need IT consulting, application development, system integration, or digital transformation services, our team of experts has the expertise to deliver the right solution for your business. Partner with iGreenTree to unleash the power of technology and achieve sustainable growth in today's dynamic landscape.

Who We Are Looking For:
The ideal candidate must demonstrate in-depth knowledge and understanding of RDBMS concepts and be experienced in writing complex queries and data integration processes in SQL/T-SQL and NoSQL. This individual will be responsible for helping with the design, development, and implementation of new and existing applications.

Roles and Responsibilities:
- Review existing database designs and data management procedures, and provide recommendations for improvement.
- Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities.
- Develop technical documentation as needed.
- Architect, develop, validate, and communicate Business Intelligence (BI) solutions like dashboards, reports, KPIs, instrumentation, and alert tools.
- Define data architecture requirements for cross-product integration within and across cloud-based platforms.
- Analyze, architect, develop, validate, and support integrating data into SaaS platforms (like ERP, CRM, etc.) from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMS.
- Perform thorough analysis of complex data and recommend actionable strategies.
- Effectively translate data modeling and BI requirements into the design process.
- Big Data platform design, i.e. tool selection, data integration, and data preparation for predictive modeling.

Required Skills:
- Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models).
- 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, DataStage, etc.
- 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes.
- Exposure to a NoSQL technology, preferably MongoDB (a small extract sketch follows this posting).
- Experience in processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
- Understanding of data warehousing concepts and decision support systems.
- Ability to deal with sensitive and confidential material and adhere to worldwide data security requirements.
- Experience writing documentation for design and feature requirements.
- Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.
- Excellent communication and collaboration skills.
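A hedged sketch of pulling documents out of MongoDB and flattening them for a relational staging load, using pymongo and pandas; the URI, database, collection, and field names are placeholders.

```python
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI

# Project away the ObjectId so the frame maps cleanly onto a staging table.
docs = client["crm"]["accounts"].find({"status": "active"}, {"_id": 0})

# Flatten nested documents into columns like "address.city".
df = pd.json_normalize(list(docs))
print(df.head())
```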
Posted 3 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Summary
Responsible for designing, building, delivering and maintaining software applications and services, covering the software lifecycle including activities such as requirement analysis, documentation/procedures and implementation.

Job Description

Roles and Responsibilities
In this role, you will:
- Collaborate with system engineers, frontend developers and software developers to implement solutions that are aligned with and extend shared platforms and solutions
- Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, software and product security, scalability, documentation practices, refactoring and testing techniques
- Write code that meets standards and delivers desired functionality using the technology selected for the project
- Build features such as web services and queries on existing tables
- Understand performance parameters and assess application performance
- Work on core data structures and algorithms and implement them using the language of choice

Education Qualification
Bachelor's Degree in Computer Science or "STEM" majors (Science, Technology, Engineering and Math) with basic experience.

Desired Characteristics

Technical Expertise
- Experience: 3+ years
- Frontend: Angular & React, .NET (mandatory)
- Backend: Talend ETL tool (mandatory)
- Search engine: Solr
- Databases: Microsoft SQL Server, Postgres
- Build & deployment tools: Jenkins, CruiseControl, Octopus
- Aware of methods and practices such as Lean/Agile/XP, etc.; prior work experience in an agile environment, or introductory training on Lean/Agile.
- Aware of and able to apply continuous integration (CI).
- General understanding of the impacts of technology choice on the software development life cycle.

Business Acumen
- Has the ability to break down problems and estimate time for development tasks.
- Understands the technology landscape; up to date on current technology trends and new technology; brings new ideas to the team.
- Displays understanding of the project's value proposition for the customer. Shows commitment to deliver the best value proposition for the targeted customer.
- Learns the organization's vision statement and decision-making framework. Able to understand how team and personal goals/objectives contribute to the organization's vision.

Personal/Leadership Attributes
- Voices opinions and presents clear rationale. Uses data or factual evidence to influence.
- Completes assigned tasks on time and with high quality. Takes independent responsibility for assigned deliverables.
- Seeks to understand problems thoroughly before implementing solutions. Asks questions to clarify requirements when ambiguities are present.
- Identifies opportunities for innovation and offers new ideas. Takes the initiative to experiment with new software frameworks.
- Adapts to new environments and changing requirements. Pivots quickly as needed. When coached, responds to needs and seeks information from other sources.
- Writes code that meets standards and delivers desired functionality using the technology selected for the project.
Inclusion and Diversity
GE HealthCare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status, or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus; and drive ownership – always with unyielding integrity.
Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you'd expect from an organization with global strength and scale, and you'll be surrounded by career opportunities in a culture that fosters care, collaboration and support.
Additional Information
Relocation Assistance Provided: No
Posted 3 weeks ago
3.0 - 5.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Role Summary
We are looking for a skilled Talend Developer to join our Digital Delivery team in Hyderabad. The ideal candidate will have 3-5 years of experience in building robust ETL pipelines, integrating diverse data sources, and ensuring high data quality using the Talend ecosystem. You will play a key role in delivering scalable and secure data integration solutions for enterprise clients.
Key Responsibilities
Design, develop, and deploy ETL workflows using Talend Open Studio or Talend Data Integration.
Extract, transform, and load data from multiple sources including databases, APIs, and flat files.
Implement data transformation logic, cleansing rules, and validation checks as per business requirements.
Optimize Talend jobs for performance, reliability, and scalability.
Integrate Talend solutions with cloud data warehouses and third-party systems (e.g., Snowflake, Azure, Salesforce).
Troubleshoot and resolve issues in production and test environments.
Document ETL designs, data mappings, and integration specifications.
Required Skills & Experience
3-5 years of hands-on experience in ETL development, with at least 2 years using Talend.
Proficiency in SQL, relational databases, and data modeling concepts.
Experience with REST/SOAP APIs and JSON/XML/CSV file formats.
Strong understanding of data quality, error handling, and job scheduling.
Hands-on knowledge of Git, basic scripting (Shell/Python), and CI/CD for ETL pipelines.
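A Talend job would normally express this flow graphically rather than in code, but as a rough sketch of the cleansing-and-validation pattern the responsibilities above describe (trim, normalize, reject invalid rows, deduplicate), here is a minimal Python equivalent using pandas. All file and column names are invented for the example:

import pandas as pd

df = pd.read_csv("source_extract.csv")  # hypothetical input extract

# Cleansing: trim whitespace and normalize casing on a hypothetical email column.
df["email"] = df["email"].str.strip().str.lower()

# Validation: route rows to accepted/rejected outputs, as a reject flow would in Talend.
valid = df["email"].str.contains("@", na=False)
df[valid].drop_duplicates(subset=["email"]).to_csv("target_load.csv", index=False)
df[~valid].to_csv("rejects.csv", index=False)  # keep rejects for review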
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.
Key Responsibilities:
Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
Create and optimize complex SQL queries for data extraction, transformation, and loading.
Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
Implement data quality checks and ensure compliance with data governance standards.
Troubleshoot and resolve data discrepancies and performance issues.
Document ETL processes, workflows, and technical specifications for future reference.
Requirements
Bachelor's degree in Computer Science, Information Technology, or a related field.
4+ years of experience in ETL development, data engineering, or data warehousing.
Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
Proficiency in SQL and strong understanding of database concepts (relational and non-relational).
Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
Strong problem-solving skills and ability to troubleshoot data-related issues.
Knowledge of scripting languages like Python or Shell scripting is a plus.
Good communication skills to collaborate with cross-functional teams.
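As a hedged illustration of one step in the AWS-based pipelines this role mentions, here is a minimal Python sketch that stages a local extract into S3 with boto3. The bucket name and key prefix are invented, and a real pipeline would typically follow this with a Redshift COPY or a Glue job:

import boto3

s3 = boto3.client("s3")  # picks up ambient AWS credentials

# Stage a local extract into a hypothetical data-lake prefix.
s3.upload_file(
    Filename="daily_extract.csv",
    Bucket="example-data-lake",         # invented bucket name
    Key="raw/sales/daily_extract.csv",  # invented key
)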
Posted 3 weeks ago
6.0 - 10.0 years
25 - 40 Lacs
Hyderabad
Work from Office
Design, build, and maintain complex ELT/ETL jobs that deliver business value.
Extract, transform, and load data from various sources including databases, APIs, and flat files using IICS or Python/SQL.
Translate high-level business requirements into technical specs.
Conduct unit testing, integration testing, and system testing of data integration solutions to ensure accuracy and quality.
Ingest data from disparate sources into the data lake and data warehouse.
Cleanse and enrich data and apply adequate data quality controls.
Provide technical expertise and guidance to team members on Informatica IICS/IDMC and data engineering best practices to guide future development.
Develop reusable tools to help streamline the delivery of new projects.
Collaborate closely with other developers and provide mentorship.
Evaluate and recommend tools, technologies, processes, and reference architectures.
Work in an Agile development environment, attending daily stand-up meetings and delivering incremental improvements.
Participate in code reviews, ensure all solutions are aligned to architectural and requirement specifications, and provide feedback on code quality, design, and performance.
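To make the "adequate data quality controls" item concrete, here is a minimal Python sketch of batch-level checks of the kind such a pipeline might run before loading. The file and column names (order_id, amount) are invented for the example:

import pandas as pd

df = pd.read_csv("ingested_batch.csv")  # hypothetical staged batch

# Simple batch-level data-quality controls.
checks = {
    "non_empty": len(df) > 0,
    "no_null_keys": df["order_id"].notna().all(),       # invented key column
    "amounts_non_negative": (df["amount"] >= 0).all(),  # invented measure column
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data-quality checks failed: {failed}")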
Posted 3 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana
On-site
Location: Gurgaon (100% onsite)
Detailed Job Description:
We are looking for an Analytics Developer with deep expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. The ideal candidate will focus on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports that drive strategic decision-making. This role involves close collaboration with both technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives.
KEY RESPONSIBILITIES:
Leverage Databricks to develop and optimize scalable data pipelines for real-time and batch data processing.
Design and implement Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models.
Manage and optimize Databricks clusters for performance, cost efficiency, and scalability.
Use Databricks SQL for advanced query development, data aggregation, and transformation.
Incorporate Python and/or Scala within Databricks workflows to automate and enhance data engineering processes.
Develop solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration.
Create interactive and visually compelling Power BI dashboards and reports to enable self-service analytics.
Leverage DAX (Data Analysis Expressions) for building calculated columns, measures, and complex aggregations.
Design effective data models in Power BI using star schema and snowflake schema principles for optimal performance.
Configure and manage Power BI workspaces, gateways, and permissions for secure data access.
Implement row-level security (RLS) and data masking strategies in Power BI to ensure compliance with governance policies.
Build real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources.
Provide end-user training and support for Power BI adoption across the organization.
Develop and maintain ETL/ELT workflows, ensuring high data quality and reliability.
Implement data governance frameworks to maintain data lineage, security, and compliance with organizational policies.
Optimize data flow across multiple environments, including data lakes, warehouses, and real-time processing systems.
Collaborate with data governance teams to enforce standards for metadata management and audit trails.
Work closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems.
Troubleshoot and resolve technical challenges related to data integration, analytics performance, and reporting accuracy.
Stay updated on the latest advancements in Databricks, Power BI, and data analytics technologies.
Drive innovation by integrating AI/ML capabilities into analytics solutions using Databricks.
Contribute to the enhancement of organizational analytics maturity through scalable and reusable architectures.
REQUIRED SKILLS:
Self-Management – You need to possess the drive and ability to deliver on projects without constant supervision.
Technical – This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
Logic – You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
Language – Customer has a global footprint, with offices and clients around the globe.
The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
Communication – Your daily job will regularly require communication with Customer team members. The ability to clearly communicate, on a technical level, is essential to your job. This includes both verbal and written communication.
ESSENTIAL SKILLS AND QUALIFICATIONS:
Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
Certifications (preferred):
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Professional
Microsoft Certified: Power BI Data Analyst Associate
8+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including:
Proficiency in Databricks Notebooks for development and testing.
Advanced skills in Databricks SQL, Python, and/or Scala for data engineering.
Expertise in cluster management, auto-scaling, and cost optimization.
4+ years of expertise with Power BI, including:
Advanced DAX for building measures and calculated fields.
Proficiency in Power Query for data transformation.
Deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
Job Type: Full-time
Pay: Up to ₹3,000,000.00 per year
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s):
What is your total work experience?
How much experience do you have in Databricks?
How much experience do you have in Power BI?
What is your Notice Period?
Work Location: In person
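As a loose illustration of the Databricks side of this role, here is a minimal PySpark aggregation of the kind a Notebook-based ETL step might contain; the table and column names follow a generic medallion naming scheme and are invented for the example:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Aggregate a hypothetical silver-layer table into a gold-layer daily summary.
orders = spark.table("silver.orders")
daily = (
    orders.groupBy("order_date")
          .agg(F.sum("amount").alias("total_amount"),
               F.countDistinct("customer_id").alias("distinct_customers"))
)
daily.write.mode("overwrite").saveAsTable("gold.daily_sales")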
Posted 3 weeks ago
0.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Position Summary
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field.
Work Experience: 2-4 years
Job Description
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment, and production lines for the world's largest industrial sectors, including aerospace, automotive, steel, aluminium, glass, cement, logistics, and energy. Headquartered in Paris, Fives is located in about 25 countries with more than 9,000 employees. Fives is seeking a Data Analyst for their office located in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for Group IT requirements. We are looking for an analyst specialized in BI development.
Required Skills
Applicants should have skills/experience in the following areas:
2-4 years of experience in Power BI development.
Good understanding of data visualization concepts.
Proficiency in writing DAX expressions and Power Query.
Knowledge of SQL and database-related technologies.
Source control such as Git.
Proficiency in building REST APIs to interact with data sources.
Familiarity with ETL/ELT concepts and tools such as Talend is a plus.
Good knowledge of programming, algorithms, and data structures.
Ability to use Agile collaboration tools such as Jira.
Good communication skills, both verbal and written.
Willingness to learn new technologies and tools.
Position Type: Full-Time/Regular
Posted 3 weeks ago
4.0 - 12.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Title: ETL Automation Tester (ETL, SQL, Python)
Position: Senior Test Engineer
Experience: 4-12 years
Category: Software Testing/Engineering
Shift: 1 PM - 10 PM
Main location: India, Karnataka, Bangalore
Position ID: J0325-0882
Employment Type: Full Time
We are seeking a highly skilled ETL Tester with strong expertise in ETL processes, SQL, and a basic understanding of DevOps practices. This role is critical to ensuring the quality and efficiency of our data integration and automation projects.
Your future duties and responsibilities
ETL Testing: Validate data transformations, data quality, and data integrity throughout the ETL pipelines.
SQL Expertise: Write complex SQL queries to verify data correctness and perform thorough back-end testing of databases.
Python: Good working experience in scripting languages.
Test Planning & Execution: Create detailed test plans, test cases, and test reports. Perform both functional and non-functional testing.
Defect Tracking: Identify, log, and track defects in collaboration with development teams to ensure timely resolution.
Collaboration: Work closely with developers, data engineers, and product owners to understand requirements and improve overall product quality.
Continuous Improvement: Stay updated with the latest testing tools, methodologies, and best practices to enhance the automation testing framework.
Required qualifications to be successful in this role
Experience: 4+ years of experience in automation testing with a focus on ETL, data warehousing, or big data environments.
Technical Skills:
Strong proficiency in SQL for complex queries and database testing.
Hands-on experience with ETL tools (e.g., Informatica, Talend, etc.).
Familiarity with DevOps practices and tools such as Jenkins, Git, Docker, etc.
Analytical Skills: Strong analytical and problem-solving skills, with attention to detail in identifying issues and ensuring data accuracy.
Communication: Excellent verbal and written communication skills to work effectively within a team and with stakeholders.
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications are a plus.
Good to have:
Experience with cloud platforms like AWS, Azure, or Google Cloud.
Knowledge of big data technologies such as Hadoop, Spark, or similar.
Experience with CI/CD pipelines and integrating automated tests.
Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because…
You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.
Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team, one of the largest IT and business consulting services firms in the world.
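For illustration, here is a minimal sketch of the source-versus-target reconciliation at the heart of ETL testing, written in Python with the standard library's sqlite3 module standing in for the real databases; the table names are invented:

import sqlite3

conn = sqlite3.connect("warehouse.db")  # stand-in for the real source/target databases

def row_count(table: str) -> int:
    # Table names cannot be bound as query parameters, hence the f-string here;
    # acceptable for a test harness with a fixed list of tables.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

source, target = row_count("staging_orders"), row_count("dw_orders")
assert source == target, f"Row-count mismatch: source={source}, target={target}"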
Posted 3 weeks ago
10.0 years
0 Lacs
India
On-site
Embark on an exciting journey into the realm of data analytics with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. As a Snowflake Data Architect, you will be at the heart of the organization, supporting our clients to take control of their data and get value out of it by defining a reference architecture for our customers. This means that you will work closely with business leaders and information management teams to define and implement a roadmap on data management, business intelligence, or analytics solutions. If you have a passion for data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering! 🌍🔥
Relevant Experience: 10+ years in Data Practice
Must-have Skills: Snowflake, Data Architecture, Engineering, Governance, and Cloud services
Responsibilities
Assess existing data components, perform POCs, and consult with stakeholders.
Lead the migration to Snowflake.
In a strong client-facing role, propose, advocate, and lead the implementation of end-to-end solutions to an enterprise's data-specific business problems, taking care of data collection, extraction, integration, cleansing, enrichment, and data visualization.
Design large data platforms to enable Data Engineers, Analysts, and Scientists.
Strong exposure to different data architectures, data lakes, and data warehouses, including migrations, rearchitecting, and platform modernization.
Define tools and technologies to develop automated data pipelines, write ETL processes, develop dashboards and reports, and create insights.
Continually reassess the current state for alignment with architecture goals, best practices, and business needs.
DB modeling, deciding on the best data storage, creating data flow diagrams, and maintaining related documentation.
Take care of performance, reliability, reusability, resilience, scalability, security, privacy, and data governance while designing a data architecture.
Apply or recommend best practices in architecture, coding, API integration, and CI/CD pipelines.
Coordinate with data scientists, analysts, and other stakeholders for data-related needs.
Help the Data Science & Analytics Practice grow by mentoring junior Practice members, leading initiatives, and leading Data Practice offerings.
Provide thought leadership by representing the Practice/Organization on internal and external platforms.
Qualification:
Translate business requirements into data requests, reports, and dashboards.
Strong database and modeling concepts with exposure to SQL and NoSQL databases.
Strong data architecture patterns and principles; ability to design secure and scalable data lakes, data warehouses, data hubs, and other event-driven architectures.
Expertise in designing and writing ETL processes.
Strong experience with Snowflake and its components.
Knowledge of Master Data Management and related tools.
Strong exposure to data security and privacy regulations (GDPR, HIPAA) and best practices.
Skilled in ensuring data accuracy, consistency, and quality.
Experience with AWS services: S3, Redshift, Lambda, DynamoDB, EMR, Glue, Lake Formation, Athena, QuickSight, RDS, Kinesis, Managed Kafka, Elasticsearch and ElastiCache, API Gateway, CloudWatch.
Ability to implement data validation processes and establish data quality standards.
Experience in Linux and scripting.
Proficiency in data visualization tools like Tableau, Power BI, or similar to create meaningful insights.
Additional Experience Desired:
Experience working with data ingestion tools such as Fivetran, Stitch, or Matillion.
AWS IoT solutions.
Apache NiFi, Talend, Informatica.
Knowledge of GCP data services.
Exposure to AI/ML technologies.
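As a hedged sketch of how a simple validation query might run against Snowflake from Python using the snowflake-connector-python package, with every credential and object name a placeholder:

import snowflake.connector  # requires the snowflake-connector-python package

conn = snowflake.connector.connect(
    user="example_user",        # placeholder credentials
    password="example_password",
    account="example_account",
    warehouse="ANALYTICS_WH",   # invented warehouse/database/schema
    database="DEMO_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Freshness check on a hypothetical fact table.
    cur.execute("SELECT MAX(load_ts) FROM fact_sales")
    print("Latest load timestamp:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()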
Posted 3 weeks ago
4.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Hi Professionals,
Role: Talend Developer
Location: Coimbatore
Experience: 4+ years
Skills: Talend, ETL, any DB
Notice period: Immediate to 15 days
Posted 3 weeks ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
Cities such as Bengaluru, Hyderabad, Pune, Chennai, and Gurugram have a high concentration of IT companies and organizations that frequently hire for Talend roles.
The average salary range for Talend professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!