
804 Talend Jobs - Page 2

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site


Primary skills: Technology -> Life Sciences -> LIMS

- Experience in developing instrument drivers using SDMS/Talend/Java is good to have.
- At least 5 years of experience in the software development life cycle.
- At least 5 years of experience in project life cycle activities on development and maintenance projects.
- At least 5 years of experience in design and architecture review.
- Good understanding of the sample management domain and exposure to life sciences projects.
- Ability to work in a team in a diverse, multi-stakeholder environment.
- Analytical skills and very good communication skills.
- At least 4-8 years of experience in LIMS (LV/LW) implementation, configuration and customization using Java and JavaScript, and integration with lab applications; should have implemented at least 2-3 projects in a development role using the LabVantage platform and Jasper/iReport/Java reporting tools.
- Interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation and translation into functional and/or design documentation for the LabVantage LIMS solution, application architecture definition and design, development, validation and release.
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Posted 2 days ago

Apply

0.0 - 2.0 years

0 Lacs

Pune

On-site

The Applications Development Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs, in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code for design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
- 0-2 years of relevant experience
- Experience in programming/debugging used in business applications
- Working knowledge of industry practice and standards
- Comprehensive knowledge of a specific business area for application development
- Working knowledge of program languages
- Consistently demonstrates clear and concise written and verbal communication

Education: Bachelor’s degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Essential:
- Experience primarily with SAP Business Objects and Talend ETL
- Experience with at least one other business intelligence tool such as Tableau or Cognos, ETL tools such as Ab Initio or Spark, and Unix shell scripting
- Experience in RDBMS, preferably Oracle, with SQL query-writing skills
- Good understanding of data-warehousing concepts such as schemas and facts/dimensions
- Ability to understand and modify complex universe queries, and to design and use the functionality of the Web Intelligence tool
- Familiarity with identification and resolution of data quality issues
- Strong and effective interpersonal and communication skills and the ability to interact professionally with business users
- Great team player with a passion for collaborating with colleagues
- Knowledge of an application server (WebLogic, WAS, Tomcat, etc.)

Adjacent Skills:
- Apache Spark with Java
- Good understanding of big data and the Hadoop ecosystem
- Good understanding of Hive and Impala
- Testing frameworks (test-driven development)
- Good communication skills
- Knowledge of Maven; Python scripting skills
- Good problem-solving skills

Beneficial: EMS, Kafka, domain knowledge

- Job Family Group: Technology
- Job Family: Applications Development
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
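
For illustration: the essential skills above center on data-warehousing concepts (schemas, facts/dimensions) and SQL validation. Below is a minimal, self-contained sketch of both ideas, using Python's stdlib sqlite3 so it runs anywhere; the table and column names are illustrative assumptions, not Citi's.

import sqlite3

# Build a tiny star schema: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (10, 1, 99.5), (11, 2, 42.0), (12, 3, 7.0);
""")

# Typical reporting query: aggregate the fact table by a dimension attribute.
for row in cur.execute("""
        SELECT p.name, SUM(f.amount)
        FROM fact_sales f JOIN dim_product p USING (product_key)
        GROUP BY p.name"""):
    print(row)

# Data-quality check: fact rows whose product_key has no dimension match.
orphans = cur.execute("""
    SELECT COUNT(*) FROM fact_sales f
    LEFT JOIN dim_product p USING (product_key)
    WHERE p.product_key IS NULL""").fetchone()[0]
print("orphan fact rows:", orphans)  # prints 1 (sale_id 12)

The same orphan-key query pattern works unchanged on Oracle or any other RDBMS the posting mentions.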

Posted 2 days ago

Apply

0 years

2 - 7 Lacs

Bengaluru

On-site

We are looking for a QA Tester to ensure the accuracy and functionality of ETL jobs migrated from Talend to Python and of SQL queries converted to Snowflake. The role involves validating data integrity, performance, and the seamless transition of ETL processes, working closely with developers skilled in Python and SQL.

Responsibilities:
- Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork.
- Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers.
- Collaborate with back-end web developers and programmers to improve usability.
- Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
- Perform all testing activities for initiatives across one or more assigned projects, utilizing processes, methods, metrics and software that ensure quality, reliability, and systems safety and security, including Hoverfly component and contract testing and embedded Cassandra component testing across multiple repositories.
- Formulate test strategy, including decomposing the business and technical requirements into test case scenarios, defining test data requirements, managing test case creation, devising contingency plans, and other preparation activities.
- Develop the test case execution plan, execute test cases, manage issues, and report status metrics.
- Work with a global team, directing and reviewing the test planning and execution work efforts of an offshore team.
- Communicate effectively with business units, IT development, project management and other support staff on testing timelines, deliverables, status and other information.
- Assist in project quality reviews for your assigned applications.
- Assess risk to the project based on execution and validation, and make appropriate recommendations.
- Interpret quality audits, drive improvements and change, and facilitate test methodology discussions across the business unit.
- Provide project implementation support on an as-needed basis and assist with application training of new resources.
- Create and manage project plans and activity timelines; investigate, monitor, report and drive solutions to issues.
- Act as a liaison between the line-of-business testing resources and the development team.
- Identify and create risk mitigation activities, and develop and implement process improvements.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
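
For illustration: the core task above is source-to-target validation after an ETL migration. A minimal sketch of the idea, comparing row counts and a column checksum between the legacy output and the migrated output; both sides use in-memory SQLite here so the example is runnable, but any DB-API connection (e.g., a Snowflake one) could be substituted. Table and column names are assumptions.

import sqlite3

def table_profile(conn, table, numeric_col):
    """Return (row_count, column_total) as a cheap fingerprint of a table."""
    return conn.execute(
        f"SELECT COUNT(*), TOTAL({numeric_col}) FROM {table}").fetchone()

source = sqlite3.connect(":memory:")   # stands in for the legacy Talend output
target = sqlite3.connect(":memory:")   # stands in for the migrated Snowflake output
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 5.5)])

assert table_profile(source, "orders", "amount") == \
       table_profile(target, "orders", "amount"), "source/target mismatch"
print("row counts and checksums match")

In practice the fingerprint would cover more columns (hashes, min/max, null counts), but the count-plus-sum comparison catches most load errors cheaply.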

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Test Engineer
Location: Hyderabad (Onsite)
Experience Required: 5 Years

Job Description: We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines. The ideal candidate should have a solid background in SAS programming, data validation, and test automation within enterprise data environments.

Key Responsibilities:
- Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance.
- Write and execute test cases/scripts using Base SAS, Macros, and SQL.
- Perform SQL query validation and data reconciliation using industry-standard practices.
- Validate ETL pipelines developed using tools like Talend, IBM Data Replicator, and Qlik Replicate.
- Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms.
- Utilize test automation frameworks using Selenium, Python, or shell scripting to increase test coverage and reduce manual effort.
- Identify, document, and track bugs through resolution, ensuring high-quality deliverables.

Required Skills:
- Strong experience in SAS programming (Base SAS, Macro).
- Expertise in writing and validating SQL queries.
- Working knowledge of data testing frameworks and reconciliation tools.
- Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, and Qlik Replicate.
- Proficiency in test automation using Selenium, Python, or shell scripts.
- Solid understanding of data pipelines and data integration testing practices.
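
For illustration: "write and execute test cases/scripts using SQL" inside a Python automation framework typically looks like SQL assertions wrapped in pytest. A hedged sketch under assumed table names and rules (not the employer's actual checks):

import sqlite3
import pytest

@pytest.fixture()
def conn():
    # Stand-in database; a real suite would connect to the system under test.
    c = sqlite3.connect(":memory:")
    c.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, age INTEGER)")
    c.executemany("INSERT INTO patients VALUES (?, ?)", [(1, 34), (2, 57)])
    yield c
    c.close()

def test_no_null_ids(conn):
    nulls = conn.execute(
        "SELECT COUNT(*) FROM patients WHERE id IS NULL").fetchone()[0]
    assert nulls == 0

def test_age_in_valid_range(conn):
    bad = conn.execute(
        "SELECT COUNT(*) FROM patients WHERE age NOT BETWEEN 0 AND 120"
    ).fetchone()[0]
    assert bad == 0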

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Testing Engineer
Experience: 8+ years
Location: Hyderabad and Gurgaon (Hybrid)
Notice Period: Immediate to 15 days

Job Description:
- Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across different layers of the data warehouse.
- Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules.
- Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues.
- Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts.
- Conduct testing in cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability.
- Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt).
- Experience with scripting languages to automate data testing.
- Familiarity with data visualization tools like Tableau, Power BI, or Looker.
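
For illustration: an "automated data validation script" of the kind described is often just a runner that executes a list of SQL checks, each expected to return zero bad rows. A minimal sketch; the checks and table names are generic examples, not this employer's rules:

import sqlite3

CHECKS = [
    ("no duplicate keys",
     "SELECT COUNT(*) FROM (SELECT id FROM stg_orders "
     "GROUP BY id HAVING COUNT(*) > 1)"),
    ("no negative amounts",
     "SELECT COUNT(*) FROM stg_orders WHERE amount < 0"),
]

conn = sqlite3.connect(":memory:")  # stand-in for the warehouse connection
conn.execute("CREATE TABLE stg_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                 [(1, 9.0), (2, 4.0), (2, -1.0)])

failed = 0
for name, sql in CHECKS:
    bad = conn.execute(sql).fetchone()[0]
    print(f"{name}: " + ("PASS" if bad == 0 else f"FAIL ({bad} bad rows)"))
    failed += bad != 0
raise SystemExit(1 if failed else 0)  # non-zero exit lets CI flag the run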

Posted 2 days ago

Apply

3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do:
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We’re Looking For:
- 3+ years of experience in data engineering or analytics engineering
- Hands-on experience with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience in modern data engineering, data warehousing, and data lake technologies on cloud platforms like Azure, AWS, GCP, or Databricks; Azure experience is preferred over other cloud platforms
- 5 years of proven experience with SQL, schema design and dimensional data modelling
- Solid knowledge of data warehouse best practices, development standards and methodologies
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
- Independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment
- Excellent communication and teamwork abilities

Nice-to-Have Skills:
- Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB
- SAP ECC/S/4 and HANA knowledge
- Intermediate knowledge of Power BI
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, disability, or any other federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
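
For illustration: a minimal PySpark sketch of the kind of pipeline this posting describes, reading raw files, cleaning them, and writing a partitioned curated table. The paths and column names are illustrative assumptions:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Ingest: raw CSVs from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/landing/orders/*.csv")

# Transform: enforce types, drop duplicates and bad rows.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_date", F.to_date("order_date"))
         .dropDuplicates(["order_id"])
         .filter(F.col("amount").isNotNull()))

# Load: lakehouse-style curated zone, partitioned for pruning.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("/curated/orders"))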

Posted 3 days ago

Apply

10.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote


Our Company:
We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement; your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

Preferred job location: Bengaluru, Hyderabad, Pune, New Delhi or Remote

The team:
Hitachi Digital is a leader in digital transformation, leveraging advanced AI and data technologies to drive innovation and efficiency across various operational companies (OpCos) and departments. We are seeking a highly experienced Lead Data Integration Architect to join our dynamic team and contribute to the development of robust data integration solutions.

The role:
- Lead the design, development, and implementation of data integration solutions using SnapLogic, MuleSoft, or Pentaho.
- Develop and optimize data integration workflows and pipelines.
- Collaborate with cross-functional teams to integrate data solutions into existing systems and workflows.
- Implement and integrate VectorAI and Agent Workspace for Google Gemini into data solutions.
- Conduct research and stay updated on the latest advancements in data integration technologies.
- Troubleshoot and resolve complex issues related to data integration systems and applications.
- Document development processes, methodologies, and best practices.
- Mentor junior developers and participate in code reviews, providing constructive feedback to team members.
- Provide strategic direction and leadership in data integration and technology adoption.

What you’ll bring:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in data integration, preferably in the Banking or Finance industry.
- Extensive experience with SnapLogic, MuleSoft, or Pentaho (at least one is a must).
- Experience with Talend and Alation is a plus.
- Strong programming skills in languages such as Python, Java, or SQL.
- Technical proficiency in data integration tools and platforms.
- Knowledge of cloud platforms, particularly Google Cloud Platform (GCP).
- Experience with VectorAI and Agent Workspace for Google Gemini.
- Comprehensive knowledge of financial products, regulatory reporting, credit risk, and counterparty risk.
- Prior strategy consulting experience with a focus on change management and program delivery preferred.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
- Proven leadership skills and experience in guiding development projects from conception to deployment.

Preferred Qualifications:
- Familiarity with data engineering tools and techniques.
- Previous experience in a similar role within a tech-driven company.

About Us:
We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with our drive to shape the future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.

Championing diversity, equity, and inclusion:
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You:
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 3 days ago

Apply

7.0 - 15.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Requirements:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Minimum 7 to 15 years of experience in data architecture or related roles.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
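
For illustration: the "Performance Tuning" duty above (indexing and query optimization) in its simplest form means adding an index on a frequent filter column and confirming the planner uses it. A self-contained sketch using SQLite; table and column names are assumptions:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(i, i % 100, "2024-01-01") for i in range(10_000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
# Before indexing: the plan shows a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

conn.execute("CREATE INDEX ix_events_user ON events (user_id)")
# After indexing: the plan shows a search using ix_events_user.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

The same workflow (inspect plan, add index or partition, re-inspect) carries over to the PostgreSQL/Oracle/SQL Server systems the posting lists, with each engine's own EXPLAIN syntax.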

Posted 3 days ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site


Overview: Join a global organization with 82,000+ employees around the world as an ETL Databricks Developer based in IQVIA Bangalore. You will be part of IQVIA’s world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, as is Databricks certification.

Key Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes.
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
- Bachelor’s degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
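
For illustration: "ensure data quality and integrity throughout the data lifecycle" in a Databricks/PySpark job often starts with profiling nulls per column before loading downstream. A hedged sketch with made-up sample data:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

df = spark.createDataFrame(
    [(1, "A"), (2, None), (3, "C")], ["id", "segment"])

# One aggregate pass: count nulls in every column.
null_counts = df.select([
    F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns])
null_counts.show()   # id: 0, segment: 1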

Posted 3 days ago

Apply


5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Job Title: ETL Testing – Python & SQL
Candidate Specification: 5+ years; open to the 1 PM to 10 PM shift. ETL (Python): all 5 days WFO; ETL (SQL): hybrid.
Location: Chennai

Job Description:
- Experience in ETL testing or data warehouse testing.
- Strong in SQL Server, MySQL, or Snowflake.
- Strong in scripting languages such as Python.
- Strong understanding of data warehousing concepts, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling.
- Proficient in writing SQL queries for data validation and reconciliation.
- Experience with testing tools such as HP ALM, JIRA, TestRail, or similar.
- Excellent problem-solving skills and attention to detail.

Role: ETL Testing
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent
Key Skills: ETL, Python, SQL

Other Information:
Job Code: GO/JC/185/2025
Recruiter Name: Sheena Rakesh

Posted 3 days ago

Apply

4.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

How You Will Contribute:
In this role as a Kinaxis Analyst, you will work closely with IT and business users globally to manage the end-to-end supply chain planning process using Kinaxis RapidResponse. You will have the ability to contribute and make a difference in a challenging and exciting hi-tech environment.
- Collaborate with users to understand business requirements and configure solutions in RapidResponse
- Design, build, test and deploy RapidResponse resources including workbooks, alerts, forms, scripts, automation chains, widgets, dashboards, etc.
- Demonstrate understanding of control tables, the data model and mappings that can be leveraged in solving business problems
- As a core member of the Kinaxis COE team, apply best practices in RapidResponse configuration to meet business needs
- Assist end users with interpreting the planning results
- Facilitate, plan and support new functionality releases, including periodic Kinaxis service updates

The Must Haves:
- Strong understanding of MRP and RapidResponse analytics
- Deep understanding of RapidResponse data integration using Talend or similar ETL tools
- Excellent analytical and troubleshooting skills
- Exceptional interpersonal skills and ability to communicate effectively, both verbally and in writing
- Ability to manage multiple priorities and perform well in a fast-paced environment
- Strong work ethic and high attention to detail
- Bachelor’s degree in science, technology, engineering, mathematics or a related field
- Minimum 4-6 years of firsthand experience with Kinaxis RapidResponse or a similar planning tool

Assets:
- Kinaxis certification (level 2 and above) preferred
- APICS certification a plus

At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 3 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


The Role:
The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities:
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience:
- First Class Degree in Engineering/Technology (4-year graduate course)
- 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
- An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have):
- ETL: Hands-on experience of building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Expertise around data warehousing concepts and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
- Data Governance: A strong grasp of principles and practice, including data quality, security, privacy and compliance

Technical Skills (Valuable):
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: Experience of using a job scheduler, e.g., Autosys; exposure to business intelligence tools, e.g., Tableau, Power BI

Certification on any one or more of the above topics would be an advantage.

- Job Family Group: Technology
- Job Family: Digital Software Engineering
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
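
For illustration: the "File Formats" item above pairs naturally with partitioned storage. A minimal PySpark sketch, writing a partitioned Parquet dataset and reading it back with a partition filter so only matching files are scanned; the paths and columns are illustrative assumptions:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("formats-demo").getOrCreate()

df = spark.createDataFrame(
    [("2024-01-01", "EU", 10.0), ("2024-01-02", "US", 7.5)],
    ["trade_date", "region", "notional"])

# Write columnar, partitioned by a common filter column.
df.write.mode("overwrite").partitionBy("region").parquet("/tmp/trades")

# Partition pruning: Spark reads only the region=EU files for this query.
eu = spark.read.parquet("/tmp/trades").filter(F.col("region") == "EU")
eu.show()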

Posted 3 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Role:
The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities:
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience:
- First Class Degree in Engineering/Technology (4-year graduate course)
- 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
- An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have):
- ETL: Hands-on experience of building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Expertise around data warehousing concepts and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
- Data Governance: A strong grasp of principles and practice, including data quality, security, privacy and compliance

Technical Skills (Valuable):
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: Experience of using a job scheduler, e.g., Autosys; exposure to business intelligence tools, e.g., Tableau, Power BI

Certification on any one or more of the above topics would be an advantage.

- Job Family Group: Technology
- Job Family: Digital Software Engineering
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site


Job Description:
- Snowflake, Talend
- Strong experience in SQL development and query optimization
- Understanding of database indexing, partitioning, and performance tuning
- Data warehousing concepts including star schema and dimension modeling; dimension tables for analytics and reporting; data partitioning and performance tuning techniques in DWH environments
- TeamCity, Jenkins, Git; experience with ETL scripts and DB objects; job schedulers (execution, monitoring); branching, merging, and automated deployment for ETL processes

Secondary: Excellent cloud exposure
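
For illustration: the job-scheduler item above (execution and monitoring of ETL jobs) reduces, at its simplest, to running ordered steps, logging timing, and stopping on the first failure. A minimal sketch; the step scripts are placeholders, not real files:

import subprocess, sys, time

STEPS = [
    ("extract", [sys.executable, "extract.py"]),    # placeholder scripts
    ("transform", [sys.executable, "transform.py"]),
    ("load", [sys.executable, "load.py"]),
]

for name, cmd in STEPS:
    started = time.time()
    result = subprocess.run(cmd)
    print(f"{name}: rc={result.returncode} ({time.time() - started:.1f}s)")
    if result.returncode != 0:
        sys.exit(f"step '{name}' failed; aborting downstream steps")

Real schedulers (TeamCity, Jenkins, Autosys) add retries, alerting, and dependency graphs on top of this same execute-and-check loop.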

Posted 3 days ago

Apply

8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics – Manager - Guidewire Data Manager

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity:
We are seeking an experienced Guidewire Manager with a strong background in the insurance domain and extensive knowledge of traditional ETL tools. The ideal candidate will have a robust understanding of data warehousing architecture and hands-on experience with various ETL tools, including Informatica PowerCenter, SSIS, SAP BODS, and Talend.

Your Key Responsibilities:
- Lead and manage Guidewire implementation projects, ensuring alignment with business objectives and technical requirements.
- Oversee the design, development, and maintenance of data warehousing solutions.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Develop and implement ETL processes using tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend.
- Ensure data quality, integrity, and security across all data warehousing and ETL processes.
- Provide technical guidance and mentorship to team members.
- Stay updated with industry trends and best practices in data warehousing and ETL.

Skills and Attributes for Success:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8-11 years of experience in data warehousing and ETL processes.
- Strong background in the insurance domain.
- Hands-on experience with ETL tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend.
- Excellent understanding of data warehousing architecture and best practices.
- Proven leadership and project management skills.
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.

To qualify for the role, you must have:
- Experience with Guidewire implementation projects.
- Knowledge of additional ETL tools and technologies.
- Certification in relevant ETL tools or data warehousing technologies.

Why EY? At EY, we offer a dynamic and inclusive work environment where you can grow your career and make a meaningful impact. Join us to work on challenging projects, collaborate with talented professionals, and contribute to innovative solutions in the insurance domain.

Ideally, you’ll also have:
- Good exposure to any ETL tools.
- Knowledge of life insurance.
- Understanding of business intelligence, data warehousing and data modelling.
- Led a team of at least 4 members.
- Experience in the insurance domain.
- Prior client-facing skills; self-motivated and collaborative.

What We Look For:
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY Consulting practices globally, with leading businesses across a range of industries.

What Working at EY Offers:
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred Technical And Professional Experience
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
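As a hedged illustration of the Snowflake workload-optimization skills listed above, the sketch below uses the snowflake-connector-python package to define a clustering key and then inspect clustering quality with Snowflake's SYSTEM$CLUSTERING_INFORMATION function. The account credentials and the fact_orders table are placeholders invented for the example, not details from the posting.

import snowflake.connector

# Connection parameters are placeholders; substitute real credentials.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="ETL_SERVICE",
    password="***",
    warehouse="ANALYTICS_WH",
    database="DW",
    schema="CORE",
)
cur = conn.cursor()

# Define a clustering key so Snowflake co-locates micro-partitions by the
# columns most queries filter on (table and columns are hypothetical).
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date, region)")

# Check how well the table is clustered on those columns; the JSON result
# reports average depth and overlap of micro-partitions.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date, region)')"
)
print(cur.fetchone()[0])

cur.close()
conn.close()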

Posted 3 days ago

Apply

4.0 years

0 Lacs

Gurgaon, Haryana, India

Remote


About This Role
Aladdin Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next generation technologies and solutions. Our engineers design and build large scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high quality analytical and derived data to our consumers. We are looking for data engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.

Responsibilities
Data Pipeline Engineers are expected to be involved from the inception of projects: understand requirements, then architect, develop, deploy, and maintain data pipelines (ETL/ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), which involves partnering with program and product managers to expand the product offering based on business demands. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

An ideal candidate would have
4+ years' experience as a data engineer. Experience in SQL, Sybase, Linux is a must. Experience coding in two of these languages for server-side/data processing is required: Java, Python, C++. 2+ years' experience using a modern data stack (Spark, Snowflake, BigQuery, etc.) on cloud platforms (Azure, GCP, AWS). Experience building ETL/ELT pipelines for complex data engineering projects (using Airflow, dbt, Great Expectations would be a plus). Experience with database modeling and normalization techniques. Experience with object-oriented design patterns. Experience with DevOps tools like Git, Maven, Jenkins, GitLab CI, Azure DevOps. Experience with Agile development concepts and related tools. Ability to troubleshoot and fix performance issues across the codebase and database queries. Excellent written and verbal communication skills. Ability to operate in a fast-paced environment. Strong interpersonal skills with a can-do attitude under challenging circumstances. BA/BS or equivalent practical experience.

Skills That Would Be a Plus
Perl, ETL tools (Informatica, Talend, dbt etc.). Experience with Snowflake or other cloud data warehousing products. Exposure to workflow management tools such as Airflow. Exposure to messaging platforms such as Kafka. Exposure to NoSQL platforms such as Cassandra, MongoDB. Building and delivering REST APIs.

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
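The ETL/ELT orchestration experience described above can be pictured with a minimal Apache Airflow DAG. This is a generic sketch, not BlackRock's pipeline: the DAG name, task bodies, and daily schedule are assumptions made for illustration.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (stubbed out for the sketch).
    print("extracting raw positions")


def transform():
    # Apply business rules and conform the data (stubbed out).
    print("transforming positions")


def load():
    # Write the conformed data to the warehouse (stubbed out).
    print("loading positions")


with DAG(
    dag_id="positions_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ argument; older releases use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # enforce strict extract -> transform -> load ordering

In a real squad, the stubbed callables would be replaced by calls into shared libraries, and tools like dbt or Great Expectations (named in the posting) would slot in as individual tasks.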

Posted 3 days ago

Apply

4.0 years

0 Lacs

India

Remote


We're Hiring: Integration Consultant (Kinaxis Maestro)
Work from Anywhere | Remote Opportunity
Full-Time | 2–4 Years Experience

Are you a Kinaxis Maestro pro with a passion for building seamless integrations that drive supply chain transformation? Join Simbus Tech, a trusted Kinaxis partner, and work with global clients to enable smarter, faster supply chain decisions.

What You'll Do
Design and implement robust integration solutions for Kinaxis RapidResponse using Maestro. Develop and maintain data pipelines using APIs, ETL tools, and middleware. Collaborate with functional and technical teams to align integration strategies with business needs. Troubleshoot integration issues and ensure data consistency across systems. Contribute to continuous improvement and documentation of integration best practices.

What We're Looking For
Bachelor's degree in Computer Science, Information Technology, or a related field. 2–4 years of experience as a Kinaxis Integration Consultant or similar role. Strong hands-on expertise in Kinaxis Maestro and integration frameworks. Proficiency in APIs, ETL processes, and tools like Talend, Mulesoft, or Snowflake. Relevant certifications in Kinaxis Maestro or integration technologies (preferred). Strong problem-solving skills, communication, and a collaborative mindset.

Why Join Simbus Tech?
100% Remote | Work from Anywhere. Exposure to cutting-edge supply chain technologies. Collaborate with top-tier global clients. Growth-driven, people-first culture.
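A hedged sketch of the API-based integration pattern this role describes: pull records from a source system's REST endpoint and push them to a planning system. The URLs, token, and field names below are all invented for illustration; they are not the documented Kinaxis Maestro API, which a real implementation would follow.

import requests

# Hypothetical endpoints and token; a real integration would use the vendor's
# documented API and a proper secrets store.
SOURCE_URL = "https://erp.example.com/api/v1/open-orders"
TARGET_URL = "https://planning.example.com/api/v1/demand"
TOKEN = "***"

headers = {"Authorization": f"Bearer {TOKEN}"}

# Extract: fetch open orders from the source system.
resp = requests.get(SOURCE_URL, headers=headers, timeout=30)
resp.raise_for_status()
orders = resp.json()

# Transform: map source fields onto the target system's expected payload.
payload = [
    {"sku": o["item_code"], "qty": o["quantity"], "due": o["promise_date"]}
    for o in orders
]

# Load: post the mapped records, failing loudly on any HTTP error so a
# scheduler can retry or alert.
resp = requests.post(TARGET_URL, json=payload, headers=headers, timeout=30)
resp.raise_for_status()
print(f"pushed {len(payload)} demand records")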

Posted 3 days ago

Apply

0.0 - 2.0 years

0 Lacs

India

On-site


We're Hiring: Data Engineer

About The Job
Duration: 12 Months. Location: PAN India. Timings: Full Time (as per company timings). Notice Period: within 15 days or immediate joiner. Experience: 0-2 years.

Responsibilities
Design, develop and maintain reliable automated data solutions based on the identification, collection and evaluation of business requirements, including but not limited to data models, database objects, stored procedures and views. Develop new and enhance existing data processing components (data ingest, data transformation, data store, data management, data quality). Support and troubleshoot the data environment (including periodic on-call duty). Document technical artifacts for developed solutions. Good interpersonal skills; comfort and competence in dealing with different teams within the organization. Requires an ability to interface with multiple constituent groups and build sustainable relationships. Versatile, creative temperament, with an ability to think outside the box while defining sound and practical solutions. Ability to master new skills. Proactive approach to problem solving with effective influencing skills. Familiar with Agile practices and methodologies.

Education And Experience Requirements
Four-year degree in Information Systems, Finance/Mathematics, Computer Science or similar. 0-2 years of experience in Data Engineering.

Required Knowledge, Skills, or Abilities
Advanced SQL queries, scripts, stored procedures, materialized views, and views. Focus on ELT to load data into the database and perform transformations in-database. Ability to use analytical SQL functions. Snowflake experience a plus. Cloud data warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modelling, analysis, programming. Experience with DevOps models utilizing a CI/CD tool. Work in a hands-on cloud environment on the Azure cloud platform (ADLS, Blob). Talend, Apache Airflow, Azure Data Factory, and BI tools like Tableau preferred. Ability to analyse data models.

We are looking for a Data Engineer for the Enterprise Data Organization to build and manage data pipelines (data ingest, data transformation, data distribution, quality rules, data storage, etc.) for an Azure cloud-based data platform. The candidate must possess strong technical, analytical, programming and critical thinking skills.
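To illustrate the ELT-plus-analytical-SQL emphasis above, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for a cloud warehouse such as Snowflake or Azure DW (SQLite 3.25+ is assumed, since window functions require it). The raw load happens first; the transformation, a window-function rolling average, runs in-database afterwards. Table and column names are invented.

import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection

# E + L: load raw rows first, untransformed (the "EL" of ELT).
conn.execute("CREATE TABLE raw_sales (day TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_sales VALUES (?, ?, ?)",
    [
        ("2024-01-01", "south", 100.0),
        ("2024-01-02", "south", 140.0),
        ("2024-01-03", "south", 120.0),
    ],
)

# T: transform in-database using an analytical (window) function,
# here a 3-day rolling average per region.
conn.execute(
    """
    CREATE TABLE sales_rolling AS
    SELECT day, region, amount,
           AVG(amount) OVER (
               PARTITION BY region ORDER BY day
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
           ) AS rolling_avg_3d
    FROM raw_sales
    """
)

for row in conn.execute("SELECT * FROM sales_rolling"):
    print(row)

The same pattern scales directly to the warehouses named in the posting: land the raw data, then let the engine do the heavy transformation work.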

Posted 3 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Company Overview
Viraaj HR Solutions is a dynamic HR consulting firm dedicated to optimizing human resource management and development. Our mission is to bridge the gap between talent and opportunity, driving growth and success for both our clients and candidates. We foster a culture of collaboration, innovation, and integrity, consistently striving to deliver exceptional service in the evolving landscape of human resources.

Role Responsibilities
Design, develop, and implement ETL processes using Talend. Collaborate with data analysts and stakeholders to understand data requirements. Perform data cleansing and transformation tasks. Optimize and automate existing data integration workflows. Monitor ETL jobs and troubleshoot issues as they arise. Conduct performance tuning of Talend jobs for efficiency. Document ETL processes and maintain technical documentation. Work closely with cross-functional teams to support data needs. Ensure data integrity and accuracy throughout the ETL process. Stay updated with Talend best practices and upcoming features. Assist in the migration of data from legacy systems to new platforms. Participate in code reviews to ensure code quality and adherence to standards. Engage in user training and support as necessary. Provide post-implementation support for deployed solutions. Evaluate and implement new data tools and technologies.

Qualifications
3+ years of experience as a Talend Developer. Strong understanding of ETL principles and practices. Proficiency in Talend Open Studio. Hands-on experience with SQL and database management. Familiarity with data warehousing concepts. Experience using Java for Talend scripting. Knowledge of APIs and web services. Effective problem-solving skills. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Attention to detail and accuracy in data handling. Experience with job scheduling tools. Ability to manage multiple priorities and deadlines. Knowledge of data modeling concepts. Experience in documentation and process mapping.

Skills: data cleansing, data warehousing, job scheduling tools, problem solving, team collaboration, SQL, documentation, Talend Open Studio, Talend, data transformation, data modeling, performance tuning, web services, API development, Java, APIs, data integration, ETL processes

Posted 3 days ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary
We are looking for a talented and motivated Data Engineer with 3 to 6 years of experience to join our data engineering team. The ideal candidate will have strong SQL skills, hands-on experience with Snowflake, ETL tools like Talend, DBT for transformation workflows, and a solid foundation in AWS cloud services.

Key Responsibilities
Design, build, and maintain efficient and reliable data pipelines using SQL, Talend, and DBT. Develop and optimize complex SQL queries for data extraction and transformation. Manage and administer Snowflake data warehouse environments. Collaborate with analytics, product, and engineering teams to understand data requirements. Implement scalable data solutions on AWS (e.g., S3, Lambda, Glue, Redshift, EC2). Monitor and troubleshoot data workflows and ensure data quality and accuracy. Support deployment and version control of data models and transformations. Write clear documentation and contribute to best practices.

Required Skills And Qualifications
3 to 6 years of experience in data engineering or related fields. Strong expertise in SQL and performance tuning of queries. Hands-on experience with Snowflake (data modeling, security, performance tuning). Proficiency with Talend for ETL development. Experience with DBT (Data Build Tool) for transformation workflows. Good knowledge of AWS services, especially data-centric services (S3, Lambda, Glue, etc.). Familiarity with Git-based version control and CI/CD practices. Strong analytical and problem-solving skills.
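A hedged sketch of the AWS-to-Snowflake loading pattern this role covers: stage a file in S3 with boto3, then bulk-load it with a COPY INTO from an external stage. The bucket, stage, table names, and connection details are placeholders, and the stage itself (@landing_stage) is assumed to have been created once in Snowflake with the S3 credentials; treat this as an outline rather than a drop-in script.

import boto3
import snowflake.connector

# Stage the extract in S3 (bucket and key are hypothetical).
s3 = boto3.client("s3")
s3.upload_file("daily_orders.csv", "my-data-bucket", "landing/daily_orders.csv")

# Bulk-load from an external stage that points at the bucket.
conn = snowflake.connector.connect(
    account="xy12345", user="LOADER", password="***",
    warehouse="LOAD_WH", database="DW", schema="STAGING",
)
conn.cursor().execute(
    """
    COPY INTO stg_orders
    FROM @landing_stage/daily_orders.csv
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """
)
conn.close()

From here, DBT models (run against the STAGING schema) would take over the in-warehouse transformation work the posting describes.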

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Are you a data-driven professional with a knack for business intelligence and sales analytics? Do you excel at transforming complex datasets into actionable insights that drive business success? If yes, Marconix is looking for a Business Analyst (Sales Analyst) to join our team!

Location: Hyderabad. Salary: Up to ₹3 LPA (CTC). Work Mode: Work from Office.

Why Join Us?
Work with a fast-growing and innovative sales solutions company. Hands-on experience in business intelligence and sales analytics. Opportunity to work with top-tier clients and industry leaders.

Sales Data Management & Reporting
Transform raw sales data into valuable business insights using BI tools (Tableau, Power BI, etc.). Develop and deploy robust reporting dashboards for tracking performance metrics. Manage ETL processes (extract, transform, load) to streamline data flow. Analyze large datasets to identify market trends and business growth opportunities.

Business Intelligence & Analytics
Develop data-driven strategies to optimize sales performance. Build predictive models to forecast sales trends and customer behavior. Conduct deep-dive analysis of business performance and suggest data-backed improvements. Work closely with stakeholders to understand their requirements and provide customized analytical solutions.

Client & Team Management
Act as the primary liaison between business and technical teams. Gather and analyze business requirements to enhance operational efficiency. Provide strategic recommendations to clients and internal teams based on data insights.

What We Expect from You
Educational Background: B.Tech / BE / BCA / BSc in Computer Science, Engineering, or a related field.
Experience: 2+ years of relevant experience as a Business Analyst focusing on sales reporting & analytics.

Must-Have Skills
Strong expertise in BI tools (Tableau, Power BI, Oracle BI). Hands-on experience in ETL processes (Informatica, Talend, Teradata, Jasper, etc.). Solid understanding of data modeling, data analytics, and business reporting. Excellent client management & stakeholder communication skills. Strong analytical and problem-solving mindset.

Bonus Skills (Preferred but Not Mandatory)
Experience in sales process automation & CRM analytics. Exposure to AI & Machine Learning in sales analytics.
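As a small, hedged illustration of the forecasting work mentioned above, the pandas sketch below computes a naive next-month forecast over toy monthly sales figures. The numbers are fabricated for the example, and a real engagement would use proper time-series methods and out-of-sample validation.

import pandas as pd

# Toy monthly sales figures (fabricated for the example).
sales = pd.Series(
    [120, 135, 128, 150, 162, 158],
    index=pd.period_range("2024-01", periods=6, freq="M"),
    name="units_sold",
)

# A simple 3-month moving average as a naive next-month forecast.
forecast = sales.tail(3).mean()
print(f"3-month moving average forecast: {forecast:.1f} units")

# Month-over-month growth rates: the kind of trend a sales dashboard would chart.
print(sales.pct_change().dropna().round(3))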

Posted 3 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Company Overview
Viraaj HR Solutions is a dynamic HR consulting firm dedicated to optimizing human resource management and development. Our mission is to bridge the gap between talent and opportunity, driving growth and success for both our clients and candidates. We foster a culture of collaboration, innovation, and integrity, consistently striving to deliver exceptional service in the evolving landscape of human resources.

Role Responsibilities
Design, develop, and implement ETL processes using Talend. Collaborate with data analysts and stakeholders to understand data requirements. Perform data cleansing and transformation tasks. Optimize and automate existing data integration workflows. Monitor ETL jobs and troubleshoot issues as they arise. Conduct performance tuning of Talend jobs for efficiency. Document ETL processes and maintain technical documentation. Work closely with cross-functional teams to support data needs. Ensure data integrity and accuracy throughout the ETL process. Stay updated with Talend best practices and upcoming features. Assist in the migration of data from legacy systems to new platforms. Participate in code reviews to ensure code quality and adherence to standards. Engage in user training and support as necessary. Provide post-implementation support for deployed solutions. Evaluate and implement new data tools and technologies.

Qualifications
3+ years of experience as a Talend Developer. Strong understanding of ETL principles and practices. Proficiency in Talend Open Studio. Hands-on experience with SQL and database management. Familiarity with data warehousing concepts. Experience using Java for Talend scripting. Knowledge of APIs and web services. Effective problem-solving skills. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Attention to detail and accuracy in data handling. Experience with job scheduling tools. Ability to manage multiple priorities and deadlines. Knowledge of data modeling concepts. Experience in documentation and process mapping.

Skills: data cleansing, data warehousing, job scheduling tools, problem solving, team collaboration, SQL, documentation, Talend Open Studio, Talend, data transformation, data modeling, performance tuning, web services, API development, Java, APIs, data integration, ETL processes

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies