804 Talend Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Title: ETL Testing – Python & SQL
Candidate Specification: 5+ years; open to the 1 PM–10 PM shift. ETL (Python) – all 5 days work from office; ETL (SQL) – hybrid. Location: Chennai.
Job Description: Experience in ETL testing or data warehouse testing. Strong in SQL Server, MySQL, or Snowflake. Strong in scripting with Python. Strong understanding of data warehousing concepts, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling. Proficient in writing SQL queries for data validation and reconciliation. Experience with testing tools such as HP ALM, JIRA, TestRail, or similar. Excellent problem-solving skills and attention to detail.
Skills Required – Role: ETL Testing | Industry Type: IT/Computers – Software | Functional Area: ITES/BPO/Customer Service | Required Education: Bachelor's Degree | Employment Type: Full Time, Permanent
Key Skills: ETL, Python, SQL
Other Information – Job Code: GO/JC/185/2025 | Recruiter Name: Sheena Rakesh
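The listing above centres on ETL testing with Python and SQL, specifically data validation and reconciliation between source and target systems. Below is a minimal sketch of such a check; the connection URLs, table name, and key column are hypothetical placeholders, not details from the posting.

```python
# Minimal source-to-target reconciliation sketch for ETL testing.
# Connection URLs, table names, and the key column are hypothetical placeholders.
from sqlalchemy import create_engine, text

SOURCE_URL = "mysql+pymysql://user:pass@source-host/sales"      # placeholder
TARGET_URL = "snowflake://user:pass@account/analytics/public"   # placeholder

def scalar(engine, sql: str):
    """Run a query and return the single value it produces."""
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()

def reconcile(table: str, key: str = "order_id") -> None:
    src = create_engine(SOURCE_URL)
    tgt = create_engine(TARGET_URL)

    # 1. Row-count check: every source row should have landed in the target.
    src_count = scalar(src, f"SELECT COUNT(*) FROM {table}")
    tgt_count = scalar(tgt, f"SELECT COUNT(*) FROM {table}")
    assert src_count == tgt_count, f"Row counts differ: {src_count} vs {tgt_count}"

    # 2. Duplicate check on the business key in the target.
    dupes = scalar(
        tgt,
        f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1) d",
    )
    assert dupes == 0, f"{dupes} duplicate keys found in target {table}"

    print(f"{table}: counts match ({src_count}) and no duplicate keys")

if __name__ == "__main__":
    reconcile("orders")
```

In practice such checks are usually wrapped in a test runner and extended with column-level aggregations (sums, min/max dates) per the project's business rules.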

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Title: Data Engineer
Candidate Specification: 5+ years, immediate to 30 days notice (all 5 days work from office, 9 hours).
Job Description: Experience with any modern ETL tool (PySpark, EMR, Glue, or others). Experience in AWS; programming knowledge in Python, Java, Snowflake. Experience in dbt and StreamSets (or similar tools like Informatica, Talend), with past migration work. Agile experience is required, with VersionOne or Jira tool expertise. Provides hands-on technical solutions to business challenges and translates them into process/technical solutions. Good knowledge of CI/CD and DevOps principles. Experience in data technologies – Hadoop, PySpark, or Scala (any one).
Skills Required – Role: Data Engineer | Industry Type: IT/Computers – Software | Functional Area: IT-Software | Required Education: B.Tech | Employment Type: Full Time, Permanent
Key Skills: PySpark, EMR, Glue, ETL tools, AWS, CI/CD, DevOps
Other Information – Job Code: GO/JC/102/2025 | Recruiter Name: Sheena Rakesh
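The role above asks for hands-on PySpark/Glue-style ETL. A minimal read-transform-write sketch is shown below; the S3 paths and column names are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch (extract, transform, load).
# The S3 paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV landed in an S3 bucket.
raw = (spark.read
       .option("header", True)
       .csv("s3://example-raw-bucket/orders/"))

# Transform: type casting, basic cleansing, and a derived partition column.
clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount") > 0)
         .withColumn("order_date", F.to_date("order_ts")))

# Load: write partitioned Parquet to the curated zone.
(clean.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-curated-bucket/orders/"))

spark.stop()
```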

Posted 3 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: Data Testing Engineer
Experience: 8+ years
Location: Hyderabad and Gurgaon (Hybrid)
Notice Period: Immediate to 15 days
Job Description:
● Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across the different layers of the data warehouse.
● Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules.
● Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues.
● Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts.
● Conduct testing on cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability.
● Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt).
● Experience with scripting languages to automate data testing.
● Familiarity with data visualization tools like Tableau, Power BI, or Looker.
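The listing above calls for automated data-validation scripts. A small pytest-style sketch follows; the warehouse URL, table, and column names are hypothetical placeholders.

```python
# Sketch of automated data-quality tests, runnable with pytest.
# The warehouse connection and table/column names are hypothetical placeholders.
import pandas as pd
import pytest
from sqlalchemy import create_engine

WAREHOUSE_URL = "postgresql://user:pass@warehouse-host/dwh"  # placeholder

@pytest.fixture(scope="module")
def customers() -> pd.DataFrame:
    engine = create_engine(WAREHOUSE_URL)
    return pd.read_sql("SELECT customer_id, email, created_at FROM dim_customer", engine)

def test_primary_key_is_unique(customers):
    assert customers["customer_id"].is_unique

def test_no_null_emails(customers):
    assert customers["email"].notna().all()

def test_created_at_not_in_future(customers):
    assert (pd.to_datetime(customers["created_at"]) <= pd.Timestamp.now()).all()
```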

Posted 3 days ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

About Us: At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team—comprising solution architects, data scientists, engineers, product managers, and designers—collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead—letting you focus on your core while we accelerate your growth.
Responsibilities & Qualifications
Data Architecture Design: Develop and implement scalable and efficient data architectures for batch and real-time data processing. Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.
ETL/ELT Pipelines: Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources. Ensure pipelines are highly performant, secure, and resilient enough to handle large volumes of structured and semi-structured data.
Data Quality and Governance: Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets. Implement data cataloging and lineage tracking for enterprise-wide data transparency.
Collaboration with Teams: Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting. Partner with software engineering teams to integrate data pipelines into applications and services.
Cloud Data Solutions: Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake. Optimize cloud infrastructure costs while maintaining high performance.
Data Automation and Workflow Orchestration: Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs. Develop monitoring systems to proactively detect and resolve pipeline failures.
Innovation and Leadership: Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency. Mentor junior engineers, fostering a culture of excellence and innovation.
Required Skills:
Experience: 7+ years of overall experience in data engineering roles, with at least 2+ years in a leadership capacity. Proven expertise in designing and deploying large-scale data systems and pipelines.
Technical Skills: Proficiency in Python, Java, or Scala for data engineering tasks. Strong SQL skills for querying and optimizing large datasets. Experience with data processing frameworks like Apache Spark, Beam, or Flink. Hands-on experience with ETL tools like Apache NiFi, dbt, or Talend. Experience in pub/sub and stream processing using Kafka, Kinesis, or the like.
Cloud Platforms: Expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services.
Data Modeling: Strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas).
Collaboration: Proven ability to work with cross-functional teams and translate business requirements into technical solutions.
Preferred Skills: Familiarity with data visualization tools like Tableau or Power BI to support reporting teams. Knowledge of MLOps pipelines and collaboration with data scientists.
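The listing above mentions workflow orchestration with Apache Airflow. A minimal DAG sketch follows; the DAG id, schedule, and task callables are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch for orchestrating a daily ETL job.
# The DAG id, schedule, and the two task callables are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**_):
    print("pull from source API and land raw files")  # placeholder logic

def transform(**_):
    print("run transformations and publish curated tables")  # placeholder logic

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    land = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    publish = PythonOperator(task_id="transform", python_callable=transform)
    land >> publish  # run the transform only after raw data has landed
```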

Posted 3 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Infosys BPM Limited is hiring for an Analyst role at the Pune location.
Analyst – Supplier Master Data
Location: Pune
Shift: Rotational
Experience: JL4A – 6+ yrs
Role/Responsibilities: Due diligence (kdPrevent). Vendor creation and item creation, including validation of vendor particulars. Compliance/maintenance of vendor particulars; risk assessment covering Technical Risk, Business Continuity, Limited Sourcing Risk, Dependency Risk, Financial Risk, Sustainability Risk, and Cyber Security Risk. Bank detail verification and update, compliance check (with kdPrevent), financial check (BVD database). Supplier code of conduct and anti-bribery questionnaire. Run the kdPrevent report, manage results, and store on the ERP. Receive vendor registration supporting documents (including code of conduct and anti-bribery questionnaire responses), review, confirm, and store on the shared Google drive at entity level. Create the vendor on the ERP using the information supplied and validated. MDM strategy and data cleansing strategy development; experience in classification, master data management (material, vendor and customer master enrichment/cleansing), de-duplication, etc. SAP functional knowledge – understanding of the overall SAP system structure. Multi-tasking master data expert. Developing/participating in new solutions, tools, methodologies, and strategies in a growing MDM practice. Perform master data audits and validation to ensure conformance to business rules, standards, and metrics that meet business requirements. Data design documentation preparation, e.g., data model, data standards, CRUD matrix, IT system integration matrix. Should be able to drive a project implementation from due diligence to final sign-off from the client, and maintain the SLAs on an agreed basis. Perform pre-analysis activities such as classifying invoice and PO records to a specified hierarchy; conduct data reviews to ensure high quality and accuracy of the reports on which analysis is performed. Project collaboration: work effectively both independently and as a member of cross-functional teams.
Skillset: Good understanding and work experience of master data and its impact on downstream processes. Min. 8 years of professional experience in Master Data Management (key data objects: Customer, Vendor, Material, Product). Fluent verbal and written communication skills and presentation skills. Excellent data analysis and interpretation skills – proven skills in Excel modeling and analysis. Strong storytelling skills to deliver recommendations from data analysis – proven skills in PowerPoint, in a business-case presentation context. Knowledge and experience of key features of Master Data Management platforms (SAP ECC, SAP MDG, Talend, Stibo, Collibra, Informatica, Winshuttle, etc.). AI/ML-based project participation/experience will be an asset. Self-motivated and takes ownership. Project and team management skills. Skills in ERP, SQL, and data visualization. Interpersonal skills and thought leadership. Effective communication and maintaining professional relations with the client and Infosys.
If interested, please share your updated resume with the details below to merlin.varghese@infosys.com:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
Willing to Work from Office:
Flexible with night shifts:
Flexible to relocate to Pune (if any):
Regards,
Infosys BPM Team

Posted 4 days ago

Apply

3.0 years

0 Lacs

India

Remote

Source: LinkedIn

Title: Azure Data Engineer
Location: Remote
Employment type: Full time with BayOne
We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.
What You'll Do: Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory. Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable. Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.
Tech Stack: Azure | Databricks | PySpark | SQL
What We're Looking For: 3+ years of experience in data engineering or analytics engineering. Hands-on experience with cloud data platforms and large-scale data processing. Strong problem-solving mindset and a passion for clean, efficient data design.
Job Description: Minimum 3 years of experience in modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc.; Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design, and dimensional data modelling. Solid knowledge of data warehouse best practices, development standards, and methodologies. Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc. Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL. An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment. Excellent communication and teamwork abilities.
Nice-to-Have Skills: Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge. SAP ECC/S/4 and HANA knowledge. Intermediate knowledge of Power BI. Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.
BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or on the basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
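The posting above centres on Databricks and lakehouse pipelines. Below is a sketch of an incremental upsert into a Delta table; the storage paths, table name, and merge key are hypothetical placeholders, and `spark` is the session a Databricks notebook provides.

```python
# Sketch of an incremental upsert into a Delta lakehouse table on Databricks.
# Paths, table names, and the merge key are hypothetical placeholders;
# `spark` is the SparkSession Databricks provides in a notebook.
from delta.tables import DeltaTable

# New or changed records landed by an upstream ingestion step.
updates = (spark.read.format("json")
           .load("abfss://raw@examplelake.dfs.core.windows.net/customers/"))

target = DeltaTable.forName(spark, "silver.customers")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```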

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Company Overview: Team Geek Solutions is a forward-thinking technology services provider dedicated to delivering innovative solutions that drive business success. Situated at the intersection of technology and creativity, we believe in fostering a collaborative environment that encourages growth and development. Our mission is to empower organizations through data-driven strategies and state-of-the-art technology solutions. We value teamwork, integrity, and excellence, ensuring our culture nurtures talent and inspires collective achievement.
Job Title: Talend ETL Developer
Job Location: Mumbai, Pune
Role Responsibilities: Design, develop, and maintain ETL processes using Talend. Implement data integration solutions to consolidate data from various systems. Collaborate with business stakeholders to understand data requirements. Develop and optimize SQL queries to extract and manipulate data. Perform data profiling and analysis to ensure data accuracy and quality. Monitor and troubleshoot ETL jobs to ensure smooth data flow. Maintain documentation for ETL processes and data model designs. Work with team members to design and enhance data warehouses. Develop data transformation logic to meet business needs. Ensure compliance with data governance and security policies. Participate in code reviews and contribute to team knowledge sharing. Support data migration initiatives during system upgrades. Utilize Agile methodology for project management and delivery. Manage workflow scheduling and execution of ETL tasks. Provide technical support and training to team members as needed.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as an ETL Developer, with Talend experience mandatory. Strong understanding of ETL frameworks and data integration principles. Proficient in writing and troubleshooting SQL queries. Experience in data modeling and database design. Familiarity with data quality assessment methodologies. Ability to analyze complex data sets and provide actionable insights. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Ability to work collaboratively in a team-oriented environment. Knowledge of data warehousing concepts and best practices. Experience with Agile development methodologies is a plus. Willingness to learn new technologies and methodologies. Detail-oriented with a commitment to delivering high-quality solutions. Ability to manage multiple tasks and deadlines effectively. Experience with performance tuning and optimization of ETL jobs.
Skills: data warehousing, troubleshooting, ETL processes, workflow management, ETL, data modeling, SQL, performance tuning, data profiling and analysis, data governance, data integration, SQL proficiency, Agile methodology, Talend
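The role above lists data profiling before ETL jobs are built. A quick, generic profiling sketch in pandas follows; the input file and column names are hypothetical placeholders and not part of the posting.

```python
# Quick data-profiling sketch of the kind an ETL developer might run before
# designing a Talend job. The input file and columns are hypothetical.
import pandas as pd

df = pd.read_csv("customer_extract.csv")  # placeholder extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "nulls": df.isna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
print(profile.sort_values("null_pct", ascending=False))

# Flag likely duplicate business keys before loading.
dupes = df[df.duplicated(subset=["customer_id"], keep=False)]
print(f"{len(dupes)} rows share a customer_id")
```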

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Company Overview: Team Geek Solutions is a forward-thinking technology services provider dedicated to delivering innovative solutions that drive business success. Situated at the intersection of technology and creativity, we believe in fostering a collaborative environment that encourages growth and development. Our mission is to empower organizations through data-driven strategies and state-of-the-art technology solutions. We value teamwork, integrity, and excellence, ensuring our culture nurtures talent and inspires collective achievement.
Job Title: Talend ETL Developer
Job Location: Mumbai, Pune
Role Responsibilities: Design, develop, and maintain ETL processes using Talend. Implement data integration solutions to consolidate data from various systems. Collaborate with business stakeholders to understand data requirements. Develop and optimize SQL queries to extract and manipulate data. Perform data profiling and analysis to ensure data accuracy and quality. Monitor and troubleshoot ETL jobs to ensure smooth data flow. Maintain documentation for ETL processes and data model designs. Work with team members to design and enhance data warehouses. Develop data transformation logic to meet business needs. Ensure compliance with data governance and security policies. Participate in code reviews and contribute to team knowledge sharing. Support data migration initiatives during system upgrades. Utilize Agile methodology for project management and delivery. Manage workflow scheduling and execution of ETL tasks. Provide technical support and training to team members as needed.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as an ETL Developer, with Talend experience mandatory. Strong understanding of ETL frameworks and data integration principles. Proficient in writing and troubleshooting SQL queries. Experience in data modeling and database design. Familiarity with data quality assessment methodologies. Ability to analyze complex data sets and provide actionable insights. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Ability to work collaboratively in a team-oriented environment. Knowledge of data warehousing concepts and best practices. Experience with Agile development methodologies is a plus. Willingness to learn new technologies and methodologies. Detail-oriented with a commitment to delivering high-quality solutions. Ability to manage multiple tasks and deadlines effectively. Experience with performance tuning and optimization of ETL jobs.
Skills: data warehousing, troubleshooting, ETL processes, workflow management, ETL, data modeling, SQL, performance tuning, data profiling and analysis, data governance, data integration, SQL proficiency, Agile methodology, Talend

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems; and be involved in the end-to-end data management process.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines for efficient data processing.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement ETL processes to extract, transform, and load data.
- Collaborate with cross-functional teams to optimize data solutions.
- Conduct data analysis to identify trends and insights.
Professional & Technical Skills:
- Must-have skills: proficiency in Talend ETL.
- Strong understanding of data integration and ETL processes.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying languages.
- Hands-on experience with data warehousing concepts.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education.

Posted 4 days ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Talend ETL. Good-to-have skills: NA. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement ETL processes using the Talend ETL tool.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Optimize and troubleshoot ETL processes for performance and efficiency.
- Create and maintain technical documentation for ETL processes.
- Assist in testing and debugging ETL processes to ensure data accuracy.
Professional & Technical Skills:
- Must-have skills: proficiency in Talend ETL.
- Strong understanding of data integration concepts.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying.
- Familiarity with data warehousing concepts.
Additional Information:
- The candidate should have a minimum of 2 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education.

Posted 4 days ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: an engineering graduate, preferably in Computer Science, with 15 years of full-time education.
Summary: Overall 7+ years of industry experience, including 4 years of experience as a developer using big data technologies like Databricks/Spark and Hadoop ecosystems. Hands-on experience with Unified Data Analytics with Databricks, the Databricks Workspace user interface, managing Databricks notebooks, Delta Lake with Python, and Delta Lake with Spark SQL. Good understanding of Spark architecture with Databricks and Structured Streaming, and of setting up a cloud platform with Databricks and Databricks Workspace. Working knowledge of distributed processing, data warehouse concepts, NoSQL, high-volume data processing, RDBMS, testing, data management principles, data mining, and data modelling. As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using the Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data pipelines and data platform components.
- Ensure data quality and integrity by implementing data validation and testing procedures.
Professional & Technical Skills:
- Must-have skills: experience with the Databricks Unified Data Analytics Platform; strong understanding of data modeling and database design principles.
- Good-to-have skills: experience with Apache Spark and Hadoop; experience with cloud-based data platforms such as AWS or Azure.
- Proficiency in programming languages such as Python or Java.
- Experience with data integration and ETL tools such as Apache NiFi or Talend.
Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai, Bengaluru, Hyderabad, and Pune offices.
Qualification: an engineering graduate, preferably in Computer Science, with 15 years of full-time education.
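The posting above names Structured Streaming and Delta Lake on Databricks. A minimal ingestion sketch follows; the landing and checkpoint paths and the target table are hypothetical placeholders, and `spark` is the notebook-provided session.

```python
# Sketch of a Databricks Structured Streaming job that ingests JSON files
# into a Delta table. Paths and the checkpoint location are hypothetical;
# `spark` is the SparkSession a Databricks notebook provides.
stream = (spark.readStream
          .format("cloudFiles")                 # Databricks Auto Loader
          .option("cloudFiles.format", "json")
          .load("s3://example-landing/events/"))

(stream.writeStream
 .format("delta")
 .option("checkpointLocation", "s3://example-checkpoints/events/")
 .outputMode("append")
 .trigger(availableNow=True)   # process the backlog, then stop
 .toTable("bronze.events"))
```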

Posted 4 days ago

Apply

5.0 - 10.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and the regression suite. Creates the automation strategy and automated scripts and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must-have skills: Data Warehouse ETL Testing. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As a Quality Engineer (Tester), you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and the regression suite. You will create the automation strategy and automated scripts and support data and environment configuration. Additionally, you will participate in code reviews and monitor and report defects to support continuous improvement activities for the end-to-end testing process.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Conduct thorough testing of data warehouse ETL processes.
- Develop and execute test cases, test plans, and test scripts.
- Identify and document defects, issues, and risks.
- Collaborate with cross-functional teams to ensure quality standards are met.
Professional & Technical Skills:
- Must-have skills: proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and methodologies.
- Experience in testing data integration, data migration, and data transformation processes.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education.

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: PySpark, Apache Spark, Talend ETL. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams, make team decisions, and provide solutions to problems. Engaging with multiple teams, you will contribute to key decisions and provide solutions for your immediate team and across multiple teams. In this role, you will have the opportunity to showcase your creativity and technical expertise in designing and building applications.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with cross-functional teams to gather and define application requirements.
- Develop and implement software solutions using the Databricks Unified Data Analytics Platform.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug applications to identify and resolve issues.
- Optimize application performance and ensure scalability.
- Document technical specifications and user manuals for applications.
- Stay updated with emerging technologies and industry trends.
- Train and mentor junior developers to enhance their technical skills.
Professional & Technical Skills:
- Must-have skills: proficiency in the Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with PySpark, Apache Spark, Talend ETL.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education.
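The posting above pairs data munging and normalization with clustering algorithms. A small scikit-learn sketch follows; the dataset file, feature columns, and cluster count are hypothetical placeholders.

```python
# Sketch of a clean-normalize-cluster workflow in scikit-learn.
# The input file, feature columns, and cluster count are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

df = pd.read_csv("customer_features.csv")             # placeholder dataset
features = ["recency_days", "frequency", "monetary"]  # placeholder columns

# Data munging: drop incomplete rows and clip extreme outliers per column.
X = df[features].dropna()
X = X.clip(upper=X.quantile(0.99), axis=1)

# Normalization so no single feature dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# Clustering into a handful of segments.
model = KMeans(n_clusters=4, n_init="auto", random_state=42)
df.loc[X.index, "segment"] = model.fit_predict(X_scaled)
print(df["segment"].value_counts())
```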

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.
Your Role and Responsibilities: As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements. Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
Preferred Education: Master's degree.
Required Technical and Professional Expertise: Tableau Desktop Specialist; strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
Preferred Technical and Professional Experience: Must have experience working with different databases and blending and creating relationships in Tableau. Must have extensive knowledge of creating custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Saras Analytics: We are an ecommerce-focused, end-to-end data analytics firm assisting enterprises and brands in data-driven decision making to maximize business value. Our suite of work spans extraction, transformation, visualization, and analysis of data, delivered via industry-leading products, solutions, and services. Our flagship product is Daton, an ETL tool. We have now ventured into building exciting, easy-to-use data visualization solutions on top of Daton. And lastly, we have a world-class data team which understands the story the numbers are telling and articulates the same to CXOs, thereby creating value.
Where we are today: We are a bootstrapped, profitable, and fast-growing (2x y-o-y) startup with old-school value systems. We play in a very exciting space, the intersection of data analytics and ecommerce, both of which are game changers. Today, the global economy faces headwinds forcing companies to downsize, outsource, and offshore, creating strong tailwinds for us. We are an employee-first company, valuing and encouraging talent, and we live by those values at all stages of our work without compromising the value we create for our customers. We strive to make Saras a career, and not just a job, for talented folks who have chosen to work with us.
The Role: We are seeking an accomplished Lead Data Engineer with strong programming skills, cloud expertise, and in-depth knowledge of BigQuery/Snowflake data warehousing technologies. As a key leader in our data engineering team, you will play a critical role in designing, implementing, and optimizing data pipelines, leveraging your expertise in programming, cloud platforms, and modern data warehousing solutions.
Responsibilities:
Data Pipeline Architecture: Lead the design and architecture of scalable and efficient data pipelines, ensuring optimal performance and reliability.
Programming and Scripting: Utilize strong programming skills, particularly in languages like Python, to develop robust and maintainable data engineering solutions.
Cloud Platform Expertise: Apply extensive experience with cloud platforms (e.g., AWS, Azure, Google Cloud) to design, deploy, and optimize data engineering solutions in a cloud environment.
BigQuery/Snowflake Knowledge: Demonstrate deep understanding and hands-on experience with BigQuery/Snowflake for efficient data storage, processing, and analysis.
ETL Processes: Lead the development of Extract, Transform, Load (ETL) processes, ensuring seamless integration of data from various sources into the data warehouse.
Data Modeling and Optimization: Design and implement effective data models to support ETL processes and ensure data integrity and efficiency.
Collaboration and Leadership: Collaborate with cross-functional teams, providing technical leadership and guidance to junior data engineers. Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions.
Quality Assurance: Implement comprehensive data quality checks and validation processes to ensure the accuracy and completeness of data.
Documentation: Create and maintain detailed documentation for data engineering processes, data models, and cloud configurations.
Technical Skills:
Programming Languages: Expertise in programming languages, with a strong emphasis on Python.
Cloud Platforms: Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud.
Big Data Technologies: Proficiency in big data technologies and frameworks for distributed computing.
Data Warehousing: In-depth knowledge of modern data warehousing solutions, with specific expertise in BigQuery/Snowflake.
ETL Tools: Experience with ETL tools like Apache NiFi, Talend, or similar.
SQL: Strong proficiency in writing and optimizing SQL queries for data extraction, transformation, and loading.
Collaboration Tools: Experience using collaboration and project management tools for effective communication and project tracking.
Soft Skills: Strong leadership and mentoring capabilities. Excellent communication and presentation skills. Strategic thinking and problem-solving abilities. Ability to work collaboratively in a cross-functional team environment.
Educational Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 8+ years of experience in data engineering roles with a focus on programming, cloud platforms, and data warehousing.
If you are an experienced Lead Data Engineer with a strong programming background, cloud expertise, and specific knowledge of BigQuery/Snowflake, we encourage you to apply. Please submit your resume and a cover letter highlighting your technical skills, leadership experience, and contributions to data engineering projects.
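The role above combines Python programming with BigQuery/Snowflake warehousing. Below is a minimal load-and-validate sketch against BigQuery; the project, dataset, and bucket names are hypothetical placeholders.

```python
# Minimal Python-driven load-and-validate step against BigQuery.
# Project, dataset, and bucket names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition="WRITE_TRUNCATE",
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/curated/orders/*.parquet",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the load finishes

# Simple post-load validation query.
row = next(iter(client.query(
    "SELECT COUNT(*) AS n FROM `example-project.analytics.orders`"
).result()))
print(f"Loaded {row.n} rows")
```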

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Talend: Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
AWS / Snowflake: Design, develop, and maintain data models using SQL and Snowflake/AWS Redshift-specific features. Collaborate with stakeholders to understand the requirements of the data warehouse. Implement data security, privacy, and compliance measures. Perform data analysis, troubleshoot data issues, and provide technical support to end users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Stay current with new AWS/Snowflake services and features and recommend improvements to the existing architecture. Design and implement scalable, secure, and cost-effective cloud solutions using AWS/Snowflake services. Collaborate with cross-functional teams to understand requirements and provide technical guidance.

Posted 4 days ago

Apply

1.0 years

0 Lacs

India

On-site

About Kinaxis: Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it's really our people who give us the passion to always seek ways to do things better. As such, we're serious about your career growth and professional development, because people matter at Kinaxis. In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown to become a global organization with over 2000 employees around the world, and support 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers. Our journey in India began in 2020 and we have been growing steadily since then! Building a high-trust and high-performance culture is important to us and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration.
About the team: Location: Chennai, India (Hybrid). This team is responsible for supporting data integration activities throughout the deployment of Kinaxis solutions. The job incumbent has a foundational level of technical and domain knowledge and can navigate Kinaxis' technology solutions and processes for data management and integration. The team understands Kinaxis customers' most pressing supply chain product offerings so that our customers can start to experience the immediate value of Kinaxis solutions.
About the role – what you will do: Participate in deep-dive business requirements discovery sessions and develop integration requirements specifications, with guidance from senior consultants. Demonstrate knowledge and proficiency in both the Kinaxis Maestro (RapidResponse) data model and REST-based API integration capabilities, and support identifying and implementing the solutions best suited to individual data flows, under the guidance of senior consultants. Assist in integration-related activities, including validation and testing of the solutions.
Technologies we use: Bachelor's degree in Industrial Engineering, Supply Chain, Operations Research, Computer Science, Computer Engineering, Statistics, Information Technology, or a related field. Passion for working in a collaborative team environment and the ability to demonstrate strong interpersonal, communication, and presentation skills. 1-3 years of experience in implementing or deploying software applications in the supply chain management space, or experience in data integration activities for enterprise-level systems. Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment. Self-starter who shows initiative in their work and learning and can excel in a fast-paced work environment. Excellent problem-solving and critical thinking skills, able to synthesize a high volume of complex information to determine the best course of action. Works well in a team environment and can work effectively with people at all levels in an organization. Ability to communicate complex ideas effectively in English, both verbally and in writing. Ability to work virtually.
What we are looking for: Technical skills such as SQL, R, JavaScript, Python, etc. Experience working with relational databases and TypeScript, an asset. Experience working with Maestro authoring, an asset. Experience working with supply chain processes and manufacturing planning solutions such as Maestro, SAP, Oracle, or Blue Yonder applications to support supply chain activities. Progressive experience with ETL tools such as Talend, OWB, SSIS, SAP Data Services, etc. Some database-level experience extracting data from enterprise-class ERP systems including SAP/APO, Oracle, and JDE. #Intermediate #LI-RJ1 #Hybrid
Why join Kinaxis? Work with impact: our platform directly helps companies power the world's supply chains. We see the results of what we do out in the world every day—when we see store shelves stocked, when medications are available for our loved ones, and so much more. Work with Fortune 500 brands: companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed-Martin, and more. Social responsibility at Kinaxis: our Diversity, Equity, and Inclusion Committee weighs in on hiring practices, talent assessment training materials, and mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do and we're committed to a net-zero operations strategy for the long term. We are involved in our communities and support causes where we can make the most impact.
People matter at Kinaxis, and these are some of the perks and benefits we created for our team: flexible vacation and Kinaxis Days (a company-wide day off on the last Friday of every month); flexible work options; physical and mental well-being programs; regularly scheduled virtual fitness classes; mentorship programs and training and career development; recognition programs and referral rewards; hackathons.
For more information, visit the Kinaxis web site at www.kinaxis.com or the company's blog at http://blog.kinaxis.com. Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses. Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
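The Kinaxis role above involves REST-based API integration for moving data between systems. The generic sketch below shows the shape of such an integration call; the endpoint, token, and payload are hypothetical placeholders and do not describe the actual Kinaxis Maestro API.

```python
# Generic REST-based data-integration sketch. The endpoint, token, and
# payload shape are hypothetical and not the real Kinaxis Maestro API.
import requests

BASE_URL = "https://planning.example.com/api/v1"   # placeholder
TOKEN = "REPLACE_ME"                               # placeholder

def push_records(records: list[dict]) -> None:
    """POST a batch of records to the planning system and fail loudly on errors."""
    resp = requests.post(
        f"{BASE_URL}/supply-orders/bulk",
        json={"records": records},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(f"Accepted {len(records)} records: {resp.status_code}")

if __name__ == "__main__":
    push_records([{"order_id": "SO-1001", "quantity": 25, "due_date": "2025-07-01"}])
```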

Posted 4 days ago

Apply

8.0 - 11.0 years

6 - 9 Lacs

Noida

On-site

Snowflake - Senior Technical Lead (Full-time)
Company Description – About Sopra Steria: Sopra Steria, a major tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. The world is how we shape it.
Job Description
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good-to-have Skills: Snowpark, Data Build Tool (dbt), Finance domain
Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing. Experience in data warehousing, with at least 2 years focused on Snowflake. Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration. Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks. Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning. Familiarity with data security, compliance requirements, and governance best practices. Experience in Python, Scala, or Java for Snowpark development is good to have. Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).
Key Responsibilities: Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost. Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe). Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion). Monitor query performance and resource utilization; tune warehouses, caching, and clustering. Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads. Define and enforce role-based access control (RBAC), masking policies, and object tagging. Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured. Establish best practices for dimensional modeling, data vault architecture, and data quality. Create and maintain data dictionaries, lineage documentation, and governance standards. Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets. Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies. Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.
Qualifications: B.Tech/MCA
Additional Information: At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
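The posting above asks for ELT built on Snowflake Streams and Tasks. The sketch below issues that pattern through the Snowflake Python connector; the account, warehouse, and object names are hypothetical placeholders.

```python
# Sketch of the Snowflake Streams & Tasks pattern, issued via the Python
# connector. Account, warehouse, and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="REPLACE_ME",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Capture changes on the raw table.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

# Scheduled task that merges captured changes into the curated table
# only when the stream actually has new data.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO curated.orders t
      USING orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended
conn.close()
```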

Posted 4 days ago

Apply

7.0 years

3 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities: Work with large, diverse datasets to deliver predictive and prescriptive analytics. Develop innovative solutions using data modeling, machine learning, and statistical analysis. Design, build, and evaluate predictive and prescriptive models and algorithms. Use tools like SQL, Python, R, and Hadoop for data analysis and interpretation. Solve complex problems using data-driven approaches. Collaborate with cross-functional teams to align data science solutions with business goals. Lead AI/ML project execution to deliver measurable business value. Ensure data governance and maintain reusable platforms and tools. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications – Technical Skills: Programming languages: Python, R, SQL. Machine learning tools: TensorFlow, PyTorch, scikit-learn. Big data technologies: Hadoop, Spark. Visualization tools: Tableau, Power BI. Cloud platforms: AWS, Azure, Google Cloud. Data engineering: Talend, Databricks, Snowflake, Data Factory. Statistical software: R, Python libraries. Version control: Git.
Preferred Qualifications: Master's or PhD in Data Science, Computer Science, Statistics, or a related field. Certifications in data science or machine learning. 7+ years of experience in a senior data science role with enterprise-scale impact. Experience managing AI/ML projects end-to-end. Solid communication skills for technical and non-technical audiences. Demonstrated problem-solving and analytical thinking. Business acumen to align data science with strategic goals. Knowledge of data governance and quality standards.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. #Nic
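The role above centres on building and evaluating predictive models with Python and scikit-learn. A minimal supervised-modeling sketch follows; the dataset file, feature set, and target column are hypothetical placeholders.

```python
# Minimal predictive-modeling sketch with scikit-learn.
# The dataset and target column are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("member_features.csv")          # placeholder dataset
X = df.drop(columns=["readmitted"])              # placeholder features
y = df["readmitted"]                             # placeholder binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Scaling and the classifier live in one pipeline so preprocessing
# is fitted only on the training split.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```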

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: ETL Talend Lead
Location: Bangalore, Hyderabad, Chennai, Pune
Work Mode: Hybrid
Job Type: Full-Time
Shift Timings: 2:00 PM - 11:00 PM
Years of Experience: 8 - 15 years

ETL Development Lead:

Experience leading and mentoring a team of Talend ETL developers.
Providing technical direction and guidance on ETL/data integration development to the team.
Designing complex data integration solutions using Talend and AWS.
Collaborating with stakeholders to define project scope, timelines, and deliverables.
Contributing to project planning, risk assessment, and mitigation strategies.
Ensuring adherence to project timelines and quality standards.
Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
Perform unit testing and participate in system integration testing of ETL processes.
Monitor and maintain Talend environments, including job scheduling and performance tuning.
Document technical specifications, data flow diagrams, and ETL processes.
Stay up to date with the latest Talend features, best practices, and industry trends.
Participate in code reviews and contribute to the establishment of development standards.
Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components.
Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
Strong SQL skills for data querying and manipulation.
Experience with data profiling, data quality checks, and error handling within ETL processes.
Familiarity with job scheduling tools and monitoring frameworks.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and collaboratively within a team environment.
Basic understanding of AWS services, e.g. EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
Understanding of AWS data integration services, e.g. Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.

Preferred Qualifications:

Experience leading and mentoring a team of 8+ Talend ETL developers.
Experience working with US healthcare customers.
Bachelor's degree in Computer Science, Information Technology, or a related field.
Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate.
Experience with AWS data and infrastructure services.
Basic working knowledge of Terraform and GitLab is required.
Experience with scripting languages such as Python or shell scripting.
Experience with agile development methodologies.
Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.
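
As an aside on the data profiling, quality checks, and error handling this role emphasizes: in a Talend job that logic would usually live in Talend components rather than standalone code, so the short Python/pandas sketch below is only an assumption-laden stand-in for the kind of post-load reconciliation an ETL lead might enforce. The column names (member_id, claim_amount) are hypothetical.

```python
# Illustrative post-load data-quality checks. Column names are hypothetical;
# in a Talend job this logic would normally sit in components, not Python.
import pandas as pd


def reconcile(source_df: pd.DataFrame, target_df: pd.DataFrame,
              key: str) -> list[str]:
    """Return a list of human-readable data-quality failures."""
    failures: list[str] = []

    # 1. Row-count reconciliation between the source extract and the target.
    if len(source_df) != len(target_df):
        failures.append(
            f"Row count mismatch: source={len(source_df)}, target={len(target_df)}"
        )

    # 2. Key integrity in the target: no nulls, no duplicates.
    if target_df[key].isna().any():
        failures.append(f"Null values found in key column '{key}'")
    dupes = int(target_df.duplicated(subset=[key]).sum())
    if dupes:
        failures.append(f"{dupes} duplicate keys found in target")

    return failures


if __name__ == "__main__":
    src = pd.DataFrame({"member_id": [1, 2, 3], "claim_amount": [10.0, 20.0, 5.5]})
    tgt = pd.DataFrame({"member_id": [1, 2, 2], "claim_amount": [10.0, 20.0, 20.0]})
    for issue in reconcile(src, tgt, key="member_id"):
        print("DQ FAILURE:", issue)
```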

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Insightsoftware (ISW) is a growing, dynamic computer software company that helps businesses achieve greater levels of financial intelligence across their organization with our world-class financial reporting solutions. At insightsoftware, you will learn and grow in a fast-paced, supportive environment that will take your career to the next level. The Data Conversion Specialist is a member of the insightsoftware Project Management Office (PMO) who demonstrates teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude.

Location: Hyderabad (Work from Office)
Working Hours: 5:00 PM - 2:00 AM IST or 6:00 PM - 3:00 AM IST; must be comfortable working the night shift as required.

Position Summary

The Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to:

Using strong technical data migration, scripting, and organizational skills to ensure the client data is converted efficiently and accurately to the insightsoftware (ISW) platform.
Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions.
Providing in-depth research and analysis of complex scenarios to develop innovative solutions to meet customer needs whilst remaining within project governance.
Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM).
Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups.
Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted.

Qualifications

Experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives.
Experience with real-time, batch, and ETL for complex data conversions.
Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc.
Utilize data mapping tools to prepare data for data loads based on target system specifications.
Working experience using various data applications/systems such as Oracle SQL, Excel, .csv files, etc.
Strong SQL scripting experience.
Communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations.
Effectively communicate with ISW Project Managers and customers to keep projects on target.
Continually drive improvements in the data migration process.
Collaborate via phone and email with clients and/or the ISW Project Manager throughout the conversion/integration process.
Demonstrated collaboration and problem-solving skills.
Working knowledge of software development lifecycle (SDLC) methodologies including, but not limited to, Agile, Waterfall, and others.
Clear understanding of cloud and application integrations.
Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously.
Ensure the client’s data is converted/integrated accurately and within deadlines established by the ISW Project Manager.
Experience in customer SIT, UAT, migration, and go-live support.

Additional Information

All your information will be kept confidential according to EEO guidelines.

At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located.

insightsoftware About Us: Hear From Our Team - InsightSoftware (wistia.com)

Background checks are required for employment with insightsoftware, where permitted by country, state/province. At insightsoftware, we are committed to equal employment opportunity regardless of race, color, ethnicity, ancestry, religion, national origin, gender, sex, gender identity or expression, sexual orientation, age, citizenship, marital or parental status, disability, veteran status, or other class protected by applicable law. We are proud to be an equal opportunity workplace.
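
To illustrate the source-to-target mapping work described above, here is a minimal, assumption-laden Python sketch of applying a mapping specification during a conversion. The source and target field names and the transforms are hypothetical examples, not taken from any insightsoftware target-system specification.

```python
# Minimal sketch of applying a source-to-target mapping specification during
# a data conversion. All field names and transforms are hypothetical.
import csv
from datetime import datetime
from typing import Callable

# target_field -> (source_field, transform applied during the "T" of ETL)
MAPPING: dict[str, tuple[str, Callable[[str], object]]] = {
    "account_code": ("AcctNo", str.strip),
    "account_name": ("AcctDesc", lambda v: v.strip().title()),
    "period_end": ("PeriodEnd", lambda v: datetime.strptime(v, "%m/%d/%Y").date()),
    "balance": ("EndBal", lambda v: round(float(v), 2)),
}


def convert_row(source_row: dict[str, str]) -> dict[str, object]:
    """Map one source record onto the target layout, applying transforms."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in MAPPING.items()}


def convert_file(source_csv: str, target_csv: str) -> None:
    """Read a source extract and write a target-shaped file, row by row."""
    with open(source_csv, newline="") as fin, open(target_csv, "w", newline="") as fout:
        writer = csv.DictWriter(fout, fieldnames=list(MAPPING))
        writer.writeheader()
        for row in csv.DictReader(fin):
            writer.writerow(convert_row(row))
```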

Posted 4 days ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities

Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform.
Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation.
Contribute to reusable component / asset / accelerator development to support capability development.
Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies.
Participate in customer PoCs to deliver the outcomes.
Participate in delivery reviews / product reviews and quality assurance, and work as design authority.

Preferred Education

Non-Degree Program

Required Technical And Professional Expertise

Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems.
Experience in data engineering and architecting data platforms.
Experience in architecting and implementing data platforms on the Azure Cloud Platform.
Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow.
Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred Technical And Professional Experience

Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem.
Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Talend: Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling using tools to integrate Talend data and ensure data quality in a big data environment.

AWS / Snowflake:

Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features.
Collaborate with stakeholders to understand the requirements of the data warehouse.
Implement data security, privacy, and compliance measures.
Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
Stay current with new AWS/Snowflake services and features and recommend improvements to existing architecture.
Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services.
Collaborate with cross-functional teams to understand requirements and provide technical guidance.
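
As a hedged illustration of the AWS-to-Snowflake load pattern this listing alludes to (not a prescribed implementation), the short Python sketch below lands a file in S3 with boto3 and bulk-loads it into Snowflake through a pre-created external stage. The bucket, stage, table, warehouse, and credential values are hypothetical placeholders.

```python
# Hedged sketch of a staged load: upload an extract to S3, then COPY it into
# Snowflake via an external stage. All names here are hypothetical; in
# practice credentials would come from IAM roles or a secrets manager.
import boto3
import snowflake.connector


def load_extract(local_path: str) -> None:
    # 1. Land the extract in the raw-zone bucket (hypothetical bucket/key).
    boto3.client("s3").upload_file(local_path, "acme-raw-zone", "sales/orders.csv")

    # 2. Bulk-load it into Snowflake through a pre-created external stage.
    conn = snowflake.connector.connect(
        account="my_account", user="loader", password="***",
        warehouse="LOAD_WH", database="ANALYTICS_DB", schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.execute("""
            COPY INTO staging.orders
            FROM @raw_zone_stage/sales/orders.csv
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        cur.close()
    finally:
        conn.close()
```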

Posted 4 days ago

Apply

10.0 - 15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Experience: 10 - 15 years

Job Description

Lead execution of the assigned projects and be responsible for end-to-end execution.
Lead, guide and support the design and implementation of targeted strategies, including identification of change impacts to people, process, policy, and structure; stakeholder identification and alignment; appropriate communication and feedback loops; success measures; training; organizational readiness; and long-term sustainability.
Manage the day-to-day activities, including scope, financials (e.g. business case, budget), resourcing (e.g. full-time employees, roles and responsibilities, utilization), timelines and toll gates, and risks.
Implement project review and quality assurance to ensure successful execution of goals and stakeholder satisfaction.
Consistently report and review progress to the Program Lead, Steering group and relevant stakeholders.
Will be involved in more than one project or will work across a portfolio of projects.
Identify improvement and efficiency opportunities across the projects.
Analyze data, evaluate results, and develop recommendations and road maps across multiple workstreams.
Build and maintain effective partnerships with key cross-functional leaders and project team members across functions such as Finance & Technology.

Experience

Experience of working as a Project Manager / Scrum Master as a service provider (not in internal projects).
Knowledge of functional supply chain and planning processes, including ERP/MRP, capacity planning, and managing planning activities with contract manufacturers - good to have.
Experience in implementing ERP systems such as SAP and Oracle - good to have, not mandatory.
Experience in systems integration and ETL tools such as Informatica and Talend a plus.
Experience with data mapping and systems integration a plus.
Functional knowledge of supply chain or after-sales service operations a plus.
Outstanding drive, excellent interpersonal skills and the ability to communicate effectively, both verbally and in writing, and to immediately contribute in a team environment.
An ability to prioritize and perform well in a fast-paced environment, while maintaining a high level of client focus.
Demonstrable track record of delivery and impact in managing/delivering transformation, with a minimum of 6-9 years’ experience in project management and business transformation.
Experience in managing technology projects (data analysis, visualization, app development, etc.) along with at least one function such as Procurement Domain, process improvement, continuous improvement, change management, or operating model design.
Has performed the role of a scrum master or managed a project having scrum teams.
Has managed projects with stakeholders in a multi-location landscape.
Past experience in managing analytics projects will be a huge plus.

Education

Understanding and application of Agile and waterfall methodology.
Exposure to tools and applications such as Microsoft Project, Jira, Confluence, Power BI, Alteryx.
Understanding of Lean Six Sigma.
Preferably a postgraduate - MBA, though not mandatory.

Expectation

Excellent interpersonal (communication and presentation) and organizational skills.
Problem-solving abilities and a can-do attitude.
Confident, proactive self-starters, comfortable in managing and engaging others.
Effective in engaging, partnering with and influencing stakeholders across the matrix up to VP level.
Ability to move fluidly between big picture and detail, always keeping the end goal in mind.
Inclination toward collaborative partnership, and able to help establish / be part of high-performing teams for impact.
Highly diligent with a close eye for detail. Delivers quality outputs.

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies