
1515 Talend Jobs - Page 14

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

2 - 5 Lacs

Noida

Hybrid

Required Skills:
- Working knowledge of big data, cloud-based, or columnar databases (such as Snowflake, AWS Redshift, BigQuery)
- Experience in scripting languages such as Python, R, Spark, or Perl
- Very good knowledge of ETL concepts; experience with ETL tools such as SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling, including developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude; ability to learn new technologies quickly
- Works with other team members in a collaborative manner; passionate about learning and working on versatile technologies

Notice Period: Immediate to 15 days
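For candidates gauging the "very good knowledge of SQL querying" bar above, here is a minimal sketch of a warehouse-style aggregation in Python, using the built-in sqlite3 module as a stand-in for a columnar warehouse such as Snowflake, Redshift, or BigQuery; the orders table and its columns are hypothetical.

```python
import sqlite3

# Stand-in for a warehouse connection (Snowflake/Redshift/BigQuery would use
# their own connectors); the `orders` table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'north', 120.0), (2, 'south', 80.0),
                              (3, 'north', 200.0);
""")

# A typical ETL-style aggregation: order count and revenue per region.
for row in conn.execute("""
        SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM orders
        GROUP BY region
        ORDER BY revenue DESC
    """):
    print(row)
```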

Posted 1 week ago

Apply

2.0 years

0 Lacs

Mahesana, Gujarat, India

Remote

bEdge Tech Services (www.bedgetechinc.com) is urgently seeking a passionate and experienced Data Engineer to join our dynamic team in Ahmedabad, Gujarat! Are you ready to shape the future of tech talent? We're building a dedicated team to develop training materials, conduct live sessions, and mentor US-based clients and students. This is a unique opportunity to blend your data engineering expertise with your passion for teaching and knowledge sharing.

This is a full-time, work-from-office position based in Ahmedabad. No remote or hybrid options are available.

Location: Ahmedabad, Gujarat, India (Work From Office ONLY)
Experience: 2 - 4 years
Salary: ₹35,000 - ₹40,000 per month + Performance Incentives

About the Role: As a key member of our US Client/Student Development team, you'll be instrumental in empowering the next generation of data engineering professionals. Your primary focus will be on:
- Content Creation: Designing and developing comprehensive, engaging training materials, modules, and exercises covering data pipeline design, ETL, and data warehousing.
- Live Session Delivery: Conducting interactive live online sessions, workshops, and webinars that demonstrate complex data engineering concepts and practical implementations.
- Mentorship: Providing guidance, support, and constructive feedback to students and clients on their data engineering projects, helping them design robust data solutions and troubleshoot issues.
- Curriculum Development: Collaborating with the team to continuously refine and update data engineering course curricula based on industry trends, new technologies, and student feedback.

Key Responsibilities:
- Develop high-quality training modules on data pipeline design, ETL/ELT processes, data warehousing concepts (dimensional modeling, Kimball/Inmon), and data lake architectures.
- Prepare and deliver engaging live sessions on setting up, managing, and optimizing data infrastructure on cloud platforms (AWS, Azure, GCP).
- Guide and mentor students in building scalable, reliable data ingestion, processing, and storage solutions using various tools and technologies.
- Explain best practices for data quality, data governance, data security, and performance optimization in data engineering.
- Create practical assignments, hands-on labs, and capstone projects that simulate real-world data engineering challenges.
- Stay current with the latest advancements in big data technologies, cloud data services, and data engineering best practices.

Required Skills & Experience:
- Experience: 2 to 4 years of hands-on industry experience as a Data Engineer or in a similar role focused on data infrastructure.
- Communication: Excellent English communication skills (both written and verbal) are compulsory; the ability to articulate complex technical concepts clearly and concisely to diverse audiences is paramount.
- Passion for Teaching: A strong desire and aptitude for training, mentoring, and guiding aspiring data engineering professionals.
- Analytical Skills: Strong problem-solving abilities, logical thinking, and a structured approach to data infrastructure design.
- Work Ethic: Highly motivated, proactive, and able to work independently as well as collaboratively in a fast-paced environment.
- Location Commitment: Must be willing to work from our Ahmedabad office full-time.

Required Technical Skills:
- Strong programming skills in Python (or Java/Scala) for data processing and scripting.
- Expertise in SQL and experience with relational database systems (e.g., PostgreSQL, MySQL, SQL Server) and/or NoSQL databases (e.g., MongoDB, Cassandra).
- Proven experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Fivetran, Data Factory).
- Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP) and its data services (e.g., S3, Redshift, EMR, Glue, Data Lake, Data Factory, BigQuery, Dataproc).
- Familiarity with data warehousing concepts and data modeling techniques (star schema, snowflake schema).
- Experience with big data technologies (e.g., Apache Spark, Hadoop) is a significant advantage.
- Understanding of data governance, data security, and data lineage principles.

What We Offer:
- A competitive salary and attractive performance-based incentives.
- The unique opportunity to directly impact the careers of aspiring tech professionals.
- A collaborative, innovative, and supportive work environment.
- Continuous learning and professional growth opportunities in a niche domain.
- Membership in a rapidly growing team focused on global client engagement.
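Since the posting names Apache Airflow among the ETL/ELT tools a trainer would teach, a minimal orchestration sketch may help illustrate the kind of pipeline work the curriculum covers. This assumes Airflow 2.x; the DAG name and task bodies are hypothetical placeholders, not part of the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw records from a source system."""


def transform():
    """Placeholder: clean and reshape the extracted records."""


def load():
    """Placeholder: write the transformed records to the warehouse."""


# A linear extract -> transform -> load chain, scheduled daily.
with DAG(
    dag_id="training_demo_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```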

Posted 1 week ago

Apply

7.0 - 9.0 years

11 - 16 Lacs

Gurugram

Work from Office

Role Description: As a Technical Lead - Data Warehousing Development at Incedo, you will be responsible for designing and developing data warehousing solutions. You should have experience with ETL tools such as Informatica, Talend, or DataStage and be proficient in SQL.

Roles & Responsibilities:
- Design and develop data warehousing solutions using tools like Hadoop, Spark, or Snowflake
- Write efficient and optimized ETL scripts
- Collaborate with cross-functional teams to develop and implement data warehousing features and enhancements
- Debug and troubleshoot complex data warehousing issues
- Ensure data security, availability, and scalability of production systems

Technical Skills Requirements:
- Proficiency in ETL (Extract, Transform, Load) processes and tools such as Informatica, Talend, or DataStage
- Experience with data modeling and schema design for data warehousing applications
- Knowledge of data warehouse technologies such as Amazon Redshift, Snowflake, or Oracle Exadata
- Familiarity with business intelligence (BI) tools such as Tableau, Power BI, or QlikView
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Should be open to new ideas and be willing to learn and develop new skills
- Should be able to work well under pressure and manage multiple tasks and priorities

Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
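Given the posting's emphasis on Spark and Snowflake for warehouse builds, here is a minimal, hedged PySpark sketch of a warehouse-style transform. The input path, column names, and output location are hypothetical; a real job would typically write to a warehouse table (e.g., via the Snowflake Spark connector) rather than parquet.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal PySpark sketch of a warehouse-style transform; the input path and
# column names are hypothetical.
spark = SparkSession.builder.appName("dw-load-sketch").getOrCreate()

sales = spark.read.option("header", True).csv("s3://bucket/raw/sales.csv")

# Cast, group, and aggregate: daily totals and transaction counts per store.
daily_totals = (
    sales.withColumn("amount", F.col("amount").cast("double"))
         .groupBy("sale_date", "store_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("txn_count"))
)

# Parquet keeps the sketch self-contained; a production job would load a
# warehouse table instead.
daily_totals.write.mode("overwrite").parquet("s3://bucket/curated/daily_sales")
```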

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

Role Description: As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView

Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it

Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
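As an illustration of the trend analysis that sits behind dashboards like these, here is a small hypothetical sketch in Python with pandas; the data and column names are invented, and in practice the frame would come from a SQL query against the client's warehouse.

```python
import pandas as pd

# Hypothetical extract of a reporting table; in practice this would come from
# a SQL query against the client's warehouse.
df = pd.DataFrame({
    "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
    "product": ["A", "B", "A", "B"],
    "revenue": [100.0, 150.0, 120.0, 90.0],
})

# Pivot into the wide shape a Tableau/Power BI dashboard would consume,
# plus a month-over-month growth view per product.
pivot = df.pivot_table(index="month", columns="product",
                       values="revenue", aggfunc="sum")
growth = pivot.pct_change()

print(pivot)
print(growth)
```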

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description: As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView

Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

CDP ETL & Database Engineer

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, bring an analytical mindset, and have a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams, joining a team of developers across the US, India, and Costa Rica.

Responsibilities:
- ETL Development: Build pipelines to feed downstream data processes; analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.
- Implementations & Onboarding: Work with the team to onboard new clients onto the ZMP/CDP+ platform; solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and document processes and workflows.
- Incremental Change Requests: Analyze change requests and determine the best approach to implementing and executing them, which requires a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
- Change Data Management: Adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Prior to introducing change, ensure that processes run correctly in a development environment, and perform peer-to-peer code reviews and solution reviews before production code deployment.
- Collaboration & Process Improvement: Participate in knowledge-share sessions with peers to discuss solutions, best practices, overall approach, and process; look for opportunities to streamline processes with an eye towards building a repeatable model that reduces implementation duration.

Job Requirements:
- Well versed in relational data modeling, ETL and FTP concepts, advanced analytics using SQL functions, and cloud technologies (AWS, Snowflake)
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes
- Able to work independently while also contributing in a team setting; confidently communicates status, raises exceptions, and voices concerns to their direct manager
- Participates in internal client project status meetings with the Solution/Delivery management teams and, when required, collaborates with the Business Solutions Analyst (BSA) to solidify requirements
- Able to work in a fast-paced, agile environment with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills; able to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow management and time allocation, and with the Scrum framework (backlog, planning, sprints, story points, retrospectives, etc.)

Required Skills:
- ETL: ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Database: hands-on experience with Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages (knowledge of any of the following): PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS: knowledge of S3, EMR (concepts), EC2 (concepts), and Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs
- Working knowledge of code repositories such as Git, WinCVS, and SVN
- Workflow management tools such as Apache Airflow, Kafka, and Automic/Appworx
- Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 2-4 years' experience
- Excellent verbal and written communication skills
- Self-starter, highly motivated, with an analytical mindset
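Since the role calls out JSON data structures and key-value pairs, here is a small illustrative Python sketch of flattening a nested JSON record into load-ready key/value pairs; the record and field names are hypothetical.

```python
import json

# Hypothetical CDP-style customer record; the field names are illustrative.
raw = '{"customer_id": "C123", "traits": {"email": "a@b.com", "tier": "gold"}}'


def flatten(record: dict, parent: str = "") -> dict:
    """Flatten nested JSON into the key/value pairs an ETL load step expects."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat


print(flatten(json.loads(raw)))
# {'customer_id': 'C123', 'traits.email': 'a@b.com', 'traits.tier': 'gold'}
```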

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

Role Description: As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView

Technical Skills:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it

Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
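A minimal pandas sketch of the filtering, joining, and aggregation techniques the posting names; the source frames are hypothetical stand-ins for real extracts read from files or database queries.

```python
import pandas as pd

# Hypothetical source extracts; a real ETL job would read these from files
# or database queries.
orders = pd.DataFrame({"order_id": [1, 2, 3], "cust_id": [10, 11, 10],
                       "amount": [250.0, 40.0, 75.0]})
customers = pd.DataFrame({"cust_id": [10, 11], "segment": ["retail", "b2b"]})

filtered = orders[orders["amount"] > 50]                      # filtering
joined = filtered.merge(customers, on="cust_id", how="left")  # joining
summary = joined.groupby("segment")["amount"].sum()           # aggregation

print(summary)
```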

Posted 1 week ago

Apply

7.0 - 9.0 years

7 - 11 Lacs

Gurugram

Work from Office

Role Description: As a Technical Lead - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView

Technical Skills:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Should be open to new ideas and be willing to learn and develop new skills
- Should be able to work well under pressure and manage multiple tasks and priorities

Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

5.0 - 6.0 years

4 - 7 Lacs

Bengaluru

Remote

Senior Database Administrator

Req ID: 55552
Location: Bangalore, IN

Sapiens is on the lookout for a Senior Database Administrator to become a key player in our Bangalore team. If you're a seasoned DBA pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.

Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.

This position will be part of Sapiens' Digital (Data Suite) division. For more information, see: https://sapiens.com/solutions/digitalsuite-customer-experience-and-engagement-software-for-insurers/

Designation: Senior Database Administrator

Must-Have Skills: 5-6 years of experience with MS SQL Server DBA, Oracle DBA, MongoDB DBA, Azure DevOps, Azure Kubernetes Services, SAP BO/BODS, Apache Tomcat, Apache Superset, and Power BI tools.

General Job Description: A seasoned, experienced professional with a full understanding of the area of specialization who resolves a wide range of issues in creative ways. This is the fully qualified, career-oriented, journey-level position.

Prerequisite Knowledge & Experience:
- Bachelor's degree in Engineering (B.E.) or an equivalent qualification
- Ability to adapt quickly to new tools and technologies to support evolving infrastructure requirements
- Experience supporting MS SQL, Oracle, MongoDB, or other NoSQL database environments
- Ensure implementation of database standards and best practices
- Test database solutions to ensure they meet functional and technical specifications
- Monitor database performance and conduct performance tuning as needed
- Collaborate with project management teams to implement and maintain configuration management for database environments
- Foundational knowledge of MongoDB Atlas
- Knowledge of DevOps activities, with experience in DevOps tool administration and troubleshooting
- Knowledge of Azure Kubernetes Services (AKS)
- Administration skills for both Windows and Linux operating systems
- Understanding of SAP BusinessObjects tool administration and troubleshooting
- Knowledge of SAP Data Services tool administration and troubleshooting
- Basic familiarity with Apache Tomcat server administration
- Knowledge of SAP IQ database administration, including tablespace management, database replication, and schema refresh tasks for application teams
- Knowledge of Apache Superset administration on Linux platforms, including report issue troubleshooting
- Understanding of Tableau administration, including Tableau Bridge configuration and support
- Knowledge of Databricks workspace creation and access provisioning for project team members
- Administrative knowledge of Microsoft Power BI Desktop and Power BI Gateway
- Willingness to take ownership and provide support beyond DBA responsibilities, including tools like Talend Studio and DBeaver within ETL environments

Required Product/Project Knowledge:
- Ability to work in an agile development environment
- Hands-on experience in document preparation
- Proven experience in fine-tuning applications and infrastructure and identifying potential bottlenecks

Required Skills:
- Ability to work on tasks (POCs, stories, installations) without much help; technical ability includes troubleshooting skills
- Ability to guide juniors in the completion of POCs, stories, and installations

Common Tasks:
- Database Administration & Support: Manage and support various database systems including MS SQL, Oracle, MongoDB, and SAP IQ. Monitor database performance and conduct regular performance tuning. Implement and enforce database standards, security, and best practices. Handle backup, recovery, and replication for disaster recovery scenarios.
- DevOps and Infrastructure Support: Assist with basic DevOps tasks including tool administration and troubleshooting. Support and maintain environments running on Azure Kubernetes Services (AKS). Perform basic system administration tasks on Windows and Linux servers.
- Application & Tool Support: Provide administrative support for SAP BusinessObjects, SAP Data Services, and Apache Tomcat. Maintain and troubleshoot data visualization and reporting tools like Tableau, Apache Superset, Power BI, and Databricks.
- Environment Setup & Access Control: Create and configure workspaces (e.g., Databricks) and manage user access provisioning. Support configuration management and coordinate with project teams for database deployments.
- Documentation & Knowledge Sharing: Create and maintain technical documentation for configuration management and knowledge bases. Contribute to troubleshooting guides and SOPs for Tier 1 and Tier 2 support teams.
- Cross-functional Tool Support: Provide L1/L2 support for tools like Talend Studio and DBeaver in ETL and data integration tasks. Troubleshoot and support MongoDB Atlas and related NoSQL environments.

Required Soft Skills:
- Providing technical leadership, with the ability to become a technical activity leader
- Collaboration and teamwork skills; a flexible team player
- Self-motivated, with strong initiative and excellent communication skills
- Proactive, with a good understanding of the requirements

Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please report the incident to sharedservices@sapiens.com.
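As a flavor of the routine monitoring work described above, here is a small hypothetical health-check sketch against MongoDB using pymongo; the connection string is a placeholder, and a production check would feed an alerting system rather than print.

```python
from pymongo import MongoClient
from pymongo.errors import PyMongoError

# Hypothetical connection string; a quick DBA connectivity and load check.
client = MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)

try:
    client.admin.command("ping")                 # round-trip connectivity check
    status = client.admin.command("serverStatus")
    print("up, current connections:", status["connections"]["current"])
except PyMongoError as exc:
    print("database unreachable:", exc)
```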

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Responsibilities:
- ETL Development: Build pipelines to feed downstream data processes; analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.
- Implementations & Onboarding: Work with the team to onboard new clients onto the ZMP/CDP+ platform; solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and document processes and workflows.
- Incremental Change Requests: Analyze change requests and determine the best approach to implementing and executing them, which requires a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
- Change Data Management: Adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Prior to introducing change, ensure that processes run correctly in a development environment, and perform peer-to-peer code reviews and solution reviews before production code deployment.
- Collaboration & Process Improvement: Participate in knowledge-share sessions with peers to discuss solutions, best practices, overall approach, and process; look for opportunities to streamline processes with an eye towards building a repeatable model that reduces implementation duration.

Job Requirements:
- Well versed in relational data modeling, ETL and FTP concepts, advanced analytics using SQL functions, and cloud technologies (AWS, Snowflake)
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes
- Able to work independently while also contributing in a team setting; confidently communicates status, raises exceptions, and voices concerns to their direct manager
- Participates in internal client project status meetings with the Solution/Delivery management teams and, when required, collaborates with the Business Solutions Analyst (BSA) to solidify requirements
- Able to work in a fast-paced, agile environment with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills; able to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow management and time allocation, and with the Scrum framework (backlog, planning, sprints, story points, retrospectives, etc.)

Required Skills:
- ETL: ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Database: hands-on experience with Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages (knowledge of any of the following): PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS: knowledge of S3, EMR (concepts), EC2 (concepts), and Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs
- Working knowledge of code repositories such as Git, WinCVS, and SVN
- Workflow management tools such as Apache Airflow, Kafka, and Automic/Appworx
- Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 2-4 years' experience
- Excellent verbal and written communication skills
- Self-starter, highly motivated, with an analytical mindset

Posted 1 week ago

Apply

6.0 years

1 - 4 Lacs

Jaipur

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Lead Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

In this role, you will:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and big data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing

In this role, you will have:
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
- 6+ years of experience delivering consulting services to medium and large enterprises, with implementations including data warehousing or big data consulting for mid-to-large-sized organizations
- Strong analytical skills, with a thorough understanding of how to interpret customer business needs and translate them into a data architecture
- Strong experience with Snowflake and data warehouse architecture (preferred but not required); SnowPro Core certification is highly desired
- Hands-on experience with Python (pandas, dataframes, functions)
- Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience with at least one ETL/ELT tool (DBT, Mulesoft, Fivetran, Airflow, Airbyte, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
- Nice to have: experience with Docker, DBT, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
- Strong project management, problem-solving, and troubleshooting skills, with the ability to exercise mature judgment
- An enthusiastic, professional, and confident team-player attitude, with a strong focus on customer success and the ability to present effectively even under adverse conditions
- Strong presentation and communication skills

Next Steps: Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
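One of the listed responsibilities is implementing data quality measures; below is a minimal, hedged sketch of such a gate in Python with pandas. The dataframe and rules are hypothetical; in a real pipeline, declarative tools (e.g., DBT tests) would express the same checks.

```python
import pandas as pd

# Hypothetical extract with intentional problems: a duplicate id and a null email.
df = pd.DataFrame({
    "id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "c@x.com", "d@x.com"],
})

# Each check maps a rule name to a boolean result.
checks = {
    "no_null_emails": df["email"].notna().all(),
    "unique_ids": df["id"].is_unique,
    "row_count_positive": len(df) > 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Failing fast keeps bad data out of downstream models.
    raise ValueError(f"data quality checks failed: {failed}")
```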

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Location: Mumbai

About Us: StayVista is India's largest villa hospitality brand and has redefined group getaways. Our handpicked luxury villas are present in every famous holiday destination across the country. We curate unique experiences paired with top-notch hospitality, creating unforgettable stays. Here, you will be a part of our passionate team, dedicated to crafting exceptional getaways and curating one-of-a-kind homes. We are a close-knit tribe, united by a shared love for travel and on a mission to become the most loved hospitality brand in India.

Why Work With Us? At StayVista, you're part of a community where your ideas and growth matter. We're a fast-growing team that values continuous improvement. With our skill upgrade programs, you'll keep learning and evolving, just like we do. And hey, when you're ready for a break, our villa discounts make it easy to enjoy the luxury you help create.

Your Role: As a Manager - Business Intelligence, you will lead data-driven decision-making by transforming complex datasets into strategic insights. You will optimize data pipelines, automate workflows, and integrate AI-powered solutions to enhance efficiency. Your expertise in database management, statistical analysis, and visualization will support business growth, while collaboration with leadership and cross-functional teams will drive impactful analytics strategies.

About You:
- 8+ years of experience in Business Intelligence, Revenue Management, or Data Analytics, with a strong ability to turn data into actionable insights
- Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field
- Skilled in designing, developing, and implementing end-to-end BI solutions to improve decision-making
- Proficient in ETL processes using SQL, Python, and R, ensuring accurate and efficient data handling
- Experienced in Google Looker Studio, Apache Superset, Power BI, and Tableau for creating clear, real-time dashboards and reports
- Develop, document, and support ETL mappings, database structures, and BI reports
- Develop ETL using tools such as Pentaho or Talend, or as per project requirements
- Participate in the UAT process and ensure quick resolution of any UAT or data issues
- Manage different environments and take responsibility for proper deployment of reports/ETLs in all client environments
- Interact with business and product teams to understand and finalize functional requirements
- Responsible for timely deliverables and quality
- Skilled at analyzing industry trends and competitor data to develop effective pricing and revenue strategies
- Demonstrated understanding of data warehouse concepts, ETL concepts, ETL loading strategy, data archiving, data reconciliation, ETL error handling, error logging mechanisms, standards, and best practices
- Cross-functional collaboration: partner with Product, Marketing, Finance, and Operations to translate business requirements into analytical solutions

Key Metrics (what you will drive and achieve):
- Data-driven decision-making and business impact
- Revenue growth and cost optimization
- Cross-functional collaboration and leadership impact
- BI and analytics efficiency, and AI automation integration

Our Core Values: Are you a CURATER?
- Curious: Here, your curiosity fuels innovation.
- User-Centric: You'll anticipate the needs of all our stakeholders and exceed expectations.
- Resourceful: You'll creatively optimise our resources with solutions that elevate experiences in unexpected ways.
- Aspire: Keep learning, keep growing—because we're all about continuous improvement.
- Trust: Trust is our foundation. You'll work in a transparent, reliable, and fair environment.
- Enjoy: We believe in having fun while building something extraordinary.

Business Acumen: You know our services, business drivers, and industry trends inside out. You anticipate challenges in your area, weigh the impact of decisions, and track competitors to stay ahead, viewing risk as a chance to excel.

Change Management: You embrace change and actively look for opportunities to improve efficiency. You navigate ambiguity well, promote innovation within the team, and take ownership of implementing fresh ideas.

Leadership: You provide direction, delegate effectively, and empower your team to take ownership. You foster passion and pride in achieving goals, holding yourself accountable for the team's successes and failures.

Customer Centricity: You know your customers' business and proactively find solutions to resolve their challenges. By building rapport and anticipating issues, you ensure smooth, win-win interactions while keeping stakeholders in the loop.

Teamwork: You actively seek input from others, work across departments, and leverage team diversity to drive success. By fostering an open environment, you encourage constructive criticism and share knowledge to achieve team goals.

Result Orientation: You set clear goals for yourself and your team, overcoming obstacles with a positive, solution-focused mindset. You take ownership of outcomes and make informed decisions based on cost-benefit analysis.

Planning and Organizing: You analyze information systematically, prioritize tasks, and delegate effectively. You optimize processes to drive efficiency and ensure compliance with organizational standards.

Communication: You communicate with confidence and professionalism, balancing talking and listening to foster open discussions. You identify key players and use the right channels to ensure clarity and gain support.

StayVista is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decisions based on race, colour, religion, caste, creed, nationality, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or any other characteristic protected under applicable laws.
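To illustrate the ETL error handling and error logging mechanisms the role references, here is a small hypothetical Python sketch; the validation rule and row shape are invented for illustration, and a real load step would write to reconciliation tables rather than stdout.

```python
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")


def load_rows(rows):
    """Hypothetical load step with row-level error logging and a
    reconciliation summary of loaded versus rejected rows."""
    loaded, rejected = 0, 0
    for row in rows:
        try:
            if row.get("amount") is None:  # stand-in validation rule
                raise ValueError("missing amount")
            loaded += 1
        except ValueError as exc:
            rejected += 1
            log.error("rejected row %s: %s", row.get("id"), exc)
    log.info("reconciliation: %d loaded, %d rejected", loaded, rejected)


load_rows([{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}])
```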

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Gurugram

Work from Office

Role Description: As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView

Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g., specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Job Summary: A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Minimum Degree Required: Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science; Management Information Systems
Minimum Year(s) of Experience: Minimum of 2 years of experience
Certification(s) Preferred: US certification(s)

Preferred Knowledge/Skills: As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Key Responsibilities:
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes.
- Ensure compliance with industry standards and best practices in data testing.

Qualifications:
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Qualifications:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
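A minimal sketch of the automated ETL testing this role describes, using pytest with SQLite as a self-contained stand-in for real source and target systems; the table names and data are hypothetical.

```python
import sqlite3

import pytest


@pytest.fixture
def conn():
    """In-memory stand-in for source and target systems under test."""
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE source (id INTEGER, amount REAL);
        CREATE TABLE target (id INTEGER, amount REAL);
        INSERT INTO source VALUES (1, 10.0), (2, 20.0);
        INSERT INTO target VALUES (1, 10.0), (2, 20.0);
    """)
    return c


def test_row_counts_match(conn):
    # A basic completeness check: the load dropped or duplicated nothing.
    src = conn.execute("SELECT COUNT(*) FROM source").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
    assert src == tgt


def test_amount_totals_reconcile(conn):
    # A basic accuracy check: measure totals reconcile across systems.
    src = conn.execute("SELECT SUM(amount) FROM source").fetchone()[0]
    tgt = conn.execute("SELECT SUM(amount) FROM target").fetchone()[0]
    assert src == pytest.approx(tgt)
```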

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on evaluating a system or software application to identify defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect on, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct, and meet independence requirements.

Job Summary
A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Minimum Degree Required: Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science, Management Information Systems
Minimum Year(s) of Experience: Minimum of 2 years of experience

Required Knowledge/Skills
As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Key Responsibilities
Collaborate with data engineers to understand ETL workflows and requirements.
Perform data validation and testing to ensure data accuracy and integrity.
Create and maintain test plans, test cases, and test data.
Identify, document, and track defects, and work with development teams to resolve issues.
Participate in design and code reviews to provide feedback on testability and quality.
Develop and maintain automated test scripts using Python for ETL processes.
Ensure compliance with industry standards and best practices in data testing.

Qualifications
Solid understanding of SQL and database concepts.
Proven experience in ETL testing and automation.
Strong proficiency in Python programming.
Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
Knowledge of data warehousing and data modeling concepts.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
Experience with version control systems like Git.

Preferred Qualifications
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
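
By way of illustration, a minimal sketch of the automated ETL validation this role involves, written with pytest. Table names are hypothetical, and sqlite3 stands in for the real source and target connections:

    import sqlite3
    import pytest

    # Hypothetical connections; in practice these would point at the real source and target.
    SRC = sqlite3.connect("source.db")
    TGT = sqlite3.connect("warehouse.db")

    def scalar(conn, sql):
        """Run a query and return its single scalar result."""
        return conn.execute(sql).fetchone()[0]

    def test_row_counts_match():
        # Completeness check: every source row should have landed in the target.
        assert scalar(SRC, "SELECT COUNT(*) FROM orders") == \
               scalar(TGT, "SELECT COUNT(*) FROM fact_orders")

    def test_amount_totals_match():
        # Accuracy check: aggregate measures should be preserved by the transform.
        src_total = scalar(SRC, "SELECT SUM(amount) FROM orders")
        tgt_total = scalar(TGT, "SELECT SUM(amount) FROM fact_orders")
        assert src_total == pytest.approx(tgt_total)

    def test_no_orphan_keys():
        # Integrity check: no fact row should reference a missing dimension key.
        orphans = scalar(TGT, """
            SELECT COUNT(*) FROM fact_orders f
            LEFT JOIN dim_customer d ON f.customer_id = d.customer_id
            WHERE d.customer_id IS NULL
        """)
        assert orphans == 0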

Posted 1 week ago

Apply

6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Lead Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

In This Role, You Will
Create and maintain optimal data pipeline architecture
Assemble large, complex data sets that meet functional and non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
Develop ELT processes to ensure timely delivery of required data for customers
Implement data quality measures to ensure accuracy, consistency, and integrity of data
Design, implement, and maintain data models that can support the organization's data storage and analysis needs
Deliver technical and functional specifications to support data governance and knowledge sharing

In This Role, You Will Have
Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
6+ years of experience delivering consulting services to medium and large enterprises, including Data Warehousing or Big Data consulting for mid-to-large-sized organizations
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate them into a data architecture
Strong experience with Snowflake and Data Warehouse architecture preferred but not required; SnowPro Core certification is highly desired
Hands-on experience with Python (Pandas, DataFrames, functions)
Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
Strong experience with Apache Airflow and API integrations
Solid experience with at least one ETL/ELT tool (DBT, Mulesoft, FiveTran, Airflow, Airbyte, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
Nice to have: experience with Docker, DBT, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
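
As a rough sketch of the Apache Airflow experience the posting asks for, here is a minimal DAG in the recent Airflow 2.x style. The DAG name and task bodies are hypothetical placeholders; a real pipeline would call the actual source API and warehouse loader:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical task callables; real ones would pull from the source system
    # and load into the warehouse (e.g., via DBT or a Snowflake connector).
    def extract(**context):
        print("pulling raw records from the source system")

    def load(**context):
        print("loading transformed records into the warehouse")

    with DAG(
        dag_id="daily_customer_pipeline",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run load only after extract succeeds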

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About The Role
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties And Responsibilities
Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements.
Utilize and optimize a wide array of AWS data services.
Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
Ensure data quality, integrity, and security across all data pipelines and storage solutions.
Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications
10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have.
Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles.
Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
Database skills: strong understanding of relational databases and NoSQL databases.
Version control: experience with version control systems (e.g., Git).
Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. (ref:hirist.tech)
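
To give a flavor of the real-time ingestion work described above, here is a minimal consumer sketch using the kafka-python client. The topic, broker address, and consumer group are hypothetical, and the data-lake write is only stubbed out:

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    # Hypothetical topic and broker address; real values would come from config.
    consumer = KafkaConsumer(
        "applicant-events",
        bootstrap_servers="broker:9092",
        group_id="etl-ingest",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    batch = []
    for message in consumer:
        event = message.value
        # Light transformation before landing the record in the lake.
        event["ingested_partition"] = message.partition
        batch.append(event)
        if len(batch) >= 500:
            # In a real pipeline this would write a batch file to S3
            # (e.g., via boto3) for downstream Glue/EMR processing.
            print(f"flushing {len(batch)} records to the data lake")
            batch.clear()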

Posted 1 week ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site

Talend - Designing and developing the technical architecture, data pipelines, and performance scaling, using Talend integration tools to ensure data quality in a big data environment.
Very strong on PL/SQL - queries, procedures, JOINs.
Snowflake SQL - writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to Extract, Load, and Transform data.
Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support are preferred.
Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
Complex problem-solving capability and a continuous-improvement approach.
Talend / Snowflake certification is desirable.
Excellent SQL coding skills.
Excellent communication and documentation skills.
Familiarity with the Agile delivery process.
Must be analytical, creative, and self-motivated.
Work effectively within a global team environment.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
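
For illustration, a minimal sketch of scripting a Snowflake load with the snowflake-connector-python package: a COPY INTO from a stage (the step SnowPipe automates continuously), followed by a simple validation query. The account, stage, and table names are hypothetical, and credentials would normally come from a secrets manager:

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical credentials and objects; replace with real configuration.
    conn = snowflake.connector.connect(
        account="xy12345.ap-south-1",
        user="ETL_SVC",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    cur = conn.cursor()
    try:
        # Stage-to-table load of files landed in an external stage.
        cur.execute("""
            COPY INTO STAGING.ORDERS
            FROM @RAW_STAGE/orders/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        # Simple validation query after the load.
        cur.execute("SELECT COUNT(*) FROM STAGING.ORDERS")
        print("rows loaded:", cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()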

Posted 1 week ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Sr Developer with 8 to 10 years of experience, with special emphasis on PySpark, Python, and SQL along with ETL tools (Talend / Ab Initio / Informatica / similar). Good exposure to ETL tools is needed to understand existing flows, rewrite them into Python and PySpark, and execute the test plans. 5+ years of sound knowledge of PySpark to implement ETL logic.
Strong understanding of frontend technologies such as HTML, CSS, React, and JavaScript.
Proficiency in data modeling and design, including PL/SQL development.
Creating test plans to understand the current ETL flow and rewriting it in PySpark.
Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues.
Expertise in practices like Agile, peer reviews, and continuous integration.
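
A minimal sketch of the PySpark rewrite work this role centers on, reproducing a typical filter-join-aggregate ETL flow in code. The source paths and column names are hypothetical stand-ins for whatever the original Talend or Ab Initio job reads:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl_rewrite_sketch").getOrCreate()

    # Extract: hypothetical source paths; a rewritten flow would typically
    # read from the same files or tables the old job used.
    orders = spark.read.option("header", True).csv("s3://raw/orders/")
    customers = spark.read.option("header", True).csv("s3://raw/customers/")

    # Transform: reproduce the legacy graph's filter -> join -> aggregate logic.
    completed = orders.filter(F.col("status") == "completed")
    enriched = completed.join(customers, on="customer_id", how="left")
    summary = (
        enriched.groupBy("region")
        .agg(F.sum(F.col("amount").cast("double")).alias("total_revenue"))
    )

    # Load: write the result partitioned for downstream consumers.
    summary.write.mode("overwrite").parquet("s3://curated/revenue_by_region/")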

Posted 1 week ago

Apply

10.0 years

20 - 25 Lacs

India

Remote

Job Title: Senior Data Engineer – Dataiku
Experience: 6–10 Years
Location: [Remote]
Employment Type: [Contract]
Shift Timing: [General]
Immediate Joiners Preferred

Role Overview
We are seeking a highly skilled and motivated Senior Data Engineer with hands-on experience in Dataiku, advanced data modeling, ETL/ELT processes, and Python/SQL development. The ideal candidate will have a strong data engineering foundation, exposure to cloud platforms, and a working understanding of Generative AI concepts. This role is pivotal in designing and building robust, scalable, and production-grade data solutions.

Key Responsibilities
Leverage Dataiku to build end-to-end data pipelines, prepare and transform data, and deliver insightful visualizations and analytics.
Design and implement scalable data models using best practices such as dimensional modeling (Kimball/Inmon methodologies).
Develop and maintain ETL/ELT workflows using Dataiku, and optionally tools like Apache Airflow, Talend, or SSIS.
Integrate and process large datasets from diverse sources into cloud-based environments (AWS / Azure).
Write robust and optimized Python scripts and complex SQL queries to automate data workflows and support analytics.
Collaborate with cross-functional teams to understand data requirements and translate them into efficient data architecture.
Explore and prototype Gen AI and LLM Mesh frameworks to enhance data engineering capabilities.
Follow best practices for data quality, governance, and documentation.

Required Skills & Experience
Proficiency in Dataiku for pipeline creation, data transformation, and analytics workflows.
Strong expertise in data modeling techniques (e.g., star schema, snowflake schema, normalized/denormalized models).
Hands-on experience with ETL/ELT processes and tools (Dataiku, Airflow, Talend, SSIS, etc.).
Solid programming knowledge in Python and strong SQL skills.
Experience working with cloud platforms such as AWS or Azure (e.g., S3, EC2, Data Lake, Synapse).
Familiarity with LLM Mesh or similar Gen AI frameworks.
Understanding of Generative AI use cases in data engineering.
Strong problem-solving and debugging capabilities.
Excellent communication and stakeholder collaboration skills.

Nice to Have
Experience with big data technologies like Apache Spark, Hadoop, or Snowflake.
Understanding of data governance and data security principles.
Familiarity with MLOps tools and workflows.
Contributions to open-source data engineering or AI/ML projects.

Skills: data, data engineering, dataiku, python, etl/elt, generative ai, cloud platforms (aws, azure), data modeling, stakeholder collaboration, etl, aws, cloud, problem-solving, sql, modeling, analytics
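
As a rough sketch of the Dataiku scripting the role involves, here is a minimal Python recipe using the dataiku package available inside DSS. The dataset names and columns are hypothetical placeholders defined in a project's Flow:

    # Runs inside a Dataiku DSS Python recipe; the dataiku package is
    # provided by that environment rather than installed from PyPI.
    import dataiku
    import pandas as pd

    # Hypothetical input dataset defined in the Flow.
    raw = dataiku.Dataset("raw_orders")
    df = raw.get_dataframe()

    # Typical prepare-style transformations done in code instead of a visual recipe.
    df = df[df["status"] == "completed"]
    df["order_month"] = pd.to_datetime(df["order_date"]).dt.to_period("M").astype(str)
    monthly = df.groupby("order_month", as_index=False)["amount"].sum()

    # Write the result to the recipe's output dataset, inferring the schema.
    out = dataiku.Dataset("orders_monthly")
    out.write_with_schema(monthly)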

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a Senior ETL & Data Migration QA Engineer with 4 to 6 years of experience, based in Hyderabad, India, working 5 days a week in the office. Your role involves contributing to a high-impact data migration project by leading Quality Assurance efforts. Your primary responsibilities include designing and implementing test strategies, developing test cases, executing data validation tests, and ensuring the accuracy of data transformations across global teams.

Your key responsibilities include:
- Designing robust test strategies and plans for data migration and ETL processes.
- Executing detailed test cases to validate data accuracy, completeness, and consistency.
- Performing SQL-based data validation and transformation testing.
- Utilizing ETL tools like Talend, Informatica PowerCenter, or DataStage for validating data pipelines.
- Validating semi-structured data formats such as JSON and XML.
- Leading QA efforts for cloud data migration projects, specifically to Snowflake.
- Coordinating testing activities across on-shore and off-shore teams.
- Documenting test results and defects, collaborating with development teams for resolution.
- Contributing to the development of automated testing frameworks for ETL processes.
- Promoting QA best practices and driving continuous improvement initiatives.

To be successful in this role, you need:
- 3+ years of experience in QA focusing on ETL testing, data validation, and data migration.
- Proficiency in SQL for complex queries and data validation.
- Hands-on experience with ETL tools like Talend, Informatica PowerCenter, or DataStage.
- Experience with cloud data platforms, especially Snowflake.
- Strong understanding of semi-structured data formats like JSON and XML.
- Excellent analytical and problem-solving skills.
- Experience working in distributed teams and leading QA efforts.

Preferred skills include:
- Experience with automated testing tools for ETL processes.
- Knowledge of data governance and data quality standards.
- Familiarity with AWS or other cloud ecosystems.
- ISTQB or equivalent certification in software testing.
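
For illustration, a minimal sketch of validating semi-structured (JSON) migration output against a contract, using the jsonschema library. The schema, field names, and export file are hypothetical; a real suite would version such schemas alongside the mapping documents:

    import json
    from jsonschema import validate, ValidationError  # pip install jsonschema

    # Hypothetical contract for one migrated record.
    ORDER_SCHEMA = {
        "type": "object",
        "required": ["order_id", "customer_id", "amount"],
        "properties": {
            "order_id": {"type": "string"},
            "customer_id": {"type": "string"},
            "amount": {"type": "number", "minimum": 0},
        },
    }

    def validate_export(path):
        """Check every JSON line in a migration export against the schema."""
        failures = []
        with open(path, encoding="utf-8") as handle:
            for lineno, line in enumerate(handle, start=1):
                try:
                    validate(instance=json.loads(line), schema=ORDER_SCHEMA)
                except (ValidationError, json.JSONDecodeError) as exc:
                    failures.append("line %d: %s" % (lineno, exc))
        return failures

    if __name__ == "__main__":
        for failure in validate_export("orders_export.jsonl"):
            print(failure)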

Posted 1 week ago

Apply

4.0 - 5.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Data Integration Specialist
Experience: 4 - 5 Years
Location: India
Employment Type: Full-time

About The Role
We are looking for a highly skilled and motivated Data Integration Specialist with 4 to 5 years of hands-on experience to join our growing team in India. In this role, you will be responsible for designing, developing, implementing, and maintaining robust data pipelines and integration solutions that connect disparate systems and enable seamless data flow across the enterprise. You'll play a crucial part in ensuring data availability, quality, and consistency for various analytical and operational needs.

Key Responsibilities
ETL/ELT Development: Design, develop, and optimize ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes using industry-standard tools and technologies.
Data Pipeline Construction: Build and maintain scalable and efficient data pipelines from various source systems (databases, APIs, flat files, streaming data, cloud sources) to target data warehouses, data lakes, or analytical platforms.
Tool Proficiency: Hands-on experience with at least one major ETL tool such as Talend, Informatica PowerCenter, SSIS, Apache NiFi, IBM DataStage, or similar platforms.
Database Expertise: Proficient in writing and optimizing complex SQL queries across various relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL) and NoSQL databases.
Cloud Data Services: Experience with cloud-based data integration services on platforms like AWS (Glue, Lambda, S3, Redshift), Azure (Data Factory, Synapse Analytics), or GCP (Dataflow, BigQuery) is highly desirable.
Scripting: Develop and maintain scripts (e.g., Python, shell scripting) for automation, data manipulation, and orchestration of data processes.
Data Modeling: Understand and apply data modeling concepts (e.g., dimensional modeling, Kimball/Inmon methodologies) for data warehousing solutions.
Data Quality & Governance: Implement data quality checks and validation rules, and participate in establishing data governance best practices to ensure data accuracy and reliability.
Performance Tuning: Monitor, troubleshoot, and optimize data integration jobs and pipelines for performance, scalability, and reliability.
Collaboration & Documentation: Work closely with data architects, data analysts, business intelligence developers, and business stakeholders to gather requirements, design solutions, and deliver data assets. Create detailed technical documentation for data flows, mappings, and transformations.
Problem Solving: Identify and resolve complex data-related issues, ensuring data integrity and consistency.

Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related quantitative field.
Experience: 4 to 5 years of dedicated experience in data integration, ETL development, or data warehousing.
Core Skills: Strong proficiency in SQL and at least one leading ETL tool (as listed above).
Programming: Hands-on experience with Python or shell scripting for data manipulation and automation.
Databases: Solid understanding of relational database concepts and experience with various database systems.
Analytical Thinking: Excellent analytical, problem-solving, and debugging skills with attention to detail.
Communication: Strong verbal and written communication skills to articulate technical concepts to both technical and non-technical audiences.
Collaboration: Ability to work effectively in a team environment and collaborate with cross-functional teams.

Preferred/Bonus Skills
Experience with real-time data integration or streaming technologies (e.g., Kafka, Kinesis).
Knowledge of Big Data technologies (e.g., Hadoop, Spark).
Familiarity with CI/CD pipelines for data integration projects.
Exposure to data visualization tools (e.g., Tableau, Power BI).
Experience in specific industry domains (e.g., Finance, Healthcare, Retail). (ref:hirist.tech)
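
To make the pipeline-construction work concrete, here is a minimal sketch of a watermark-based incremental load, a common pattern in data integration. The tables (etl_watermarks, stg_orders) are hypothetical, and sqlite3 stands in for the real source system and target warehouse:

    import sqlite3
    from datetime import datetime, timezone

    # Hypothetical databases; real pipelines would connect to the actual
    # source system and target warehouse instead of local SQLite files.
    src = sqlite3.connect("source.db")
    tgt = sqlite3.connect("warehouse.db")

    def get_watermark(conn, table):
        """Read the last successfully loaded timestamp for a table."""
        row = conn.execute(
            "SELECT last_loaded_at FROM etl_watermarks WHERE table_name = ?",
            (table,),
        ).fetchone()
        return row[0] if row else "1970-01-01T00:00:00"

    watermark = get_watermark(tgt, "orders")

    # Extract only rows changed since the previous run (incremental load).
    rows = src.execute(
        "SELECT order_id, customer_id, amount, updated_at "
        "FROM orders WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()

    # Load the delta and advance the watermark in one transaction.
    with tgt:
        tgt.executemany(
            "INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?, ?)", rows
        )
        tgt.execute(
            "UPDATE etl_watermarks SET last_loaded_at = ? WHERE table_name = ?",
            (datetime.now(timezone.utc).isoformat(), "orders"),
        )
    print(f"loaded {len(rows)} changed rows since {watermark}")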

Posted 2 weeks ago

Apply