10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role Overview: We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Key Responsibilities: Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse. Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements. Utilize and optimize a wide array of AWS data services, including but not limited to AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and AWS EMR, to build and manage data pipelines. Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions. Ensure data quality, integrity, and security across all data pipelines and storage solutions. Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs. Implement data governance policies and best practices within the Data Lake and Data Warehouse environments. Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement. Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Required Qualifications: 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development. Deep expertise in ETL tools: extensive hands-on experience with commercial or open-source ETL tools (e.g., Talend). Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar. Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; strong knowledge of AWS Redshift for data warehousing and analytics; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have. Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles. Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation. Database skills: strong understanding of relational and NoSQL databases. Version control: experience with version control systems (e.g., Git). Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail. Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Preferred Qualifications: Certifications in AWS Data Analytics or other relevant areas.
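For illustration, a minimal sketch of the real-time ingestion pattern this role describes, consuming events from Apache Kafka and landing micro-batches in S3 with boto3; the topic, broker address, bucket, and batch size are hypothetical, not taken from the posting:

```python
import json

import boto3
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic, broker, and bucket names, for illustration only.
consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:  # micro-batch to limit the number of S3 PUT calls
        key = f"raw/orders/upto-offset-{message.offset}.json"
        s3.put_object(
            Bucket="example-data-lake",
            Key=key,
            Body=json.dumps(batch).encode("utf-8"),
        )
        batch = []
```

The partition offset in the object key makes each write idempotent to retry, one common way to keep an at-least-once consumer from producing duplicate objects.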
Posted 1 week ago
0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: Data Quality Strategy and Implementation: Engage with clients to understand their data quality requirements and business goals. Develop and implement data quality frameworks and solutions using tools such as Collibra and IDMC. Provide expert advice on industry best practices and emerging trends in data quality management. Tool Expertise: Utilize DQ tools such as Collibra, Talend, IDMC, etc. to manage and enhance data quality processes. Configure and customize Collibra workflows and IDMC data management solutions to meet specific client needs. Ensure seamless integration of data quality tools with existing data governance systems. Monitoring and Continuous Improvement: Establish data quality metrics and KPIs to assess effectiveness and drive continuous improvement. Conduct regular audits and assessments to ensure data quality standards are maintained. Facilitate workshops and training sessions to promote data quality awareness and best practices. Collaboration and Leadership: Work collaboratively with data architects, data analysts, IT, legal, and compliance teams to integrate data quality into broader data management initiatives. Mentor and guide junior team members, fostering a culture of knowledge sharing and professional growth.
Mandatory Skill Sets: Collibra, Informatica Data Management Cloud (IDMC). Preferred Skill Sets: Certifications in Collibra and Informatica. Years of Experience Required: 4-7 years. Education Qualification: B.Tech/MBA/MCA. Degrees/Field of Study Required: Bachelor of Engineering, Master of Business Administration. Required Skills: Collibra Data Governance. Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}.
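Collibra and IDMC expose their own proprietary interfaces, so rather than guess at those APIs, here is a tool-agnostic sketch of the kind of data-quality KPIs such frameworks track (completeness, uniqueness), using pandas; the dataset and column names are invented:

```python
import pandas as pd

def data_quality_report(df: pd.DataFrame, key_column: str) -> dict:
    """Compute simple completeness and uniqueness KPIs for a dataset."""
    total = len(df)
    return {
        "row_count": total,
        # Completeness: share of non-null values per column.
        "completeness": (df.notna().sum() / total).round(3).to_dict(),
        # Uniqueness: duplicate rate on the declared business key.
        "duplicate_rate": round(df.duplicated(subset=[key_column]).mean(), 3),
    }

# Toy data: one duplicated key and one missing email.
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
})
print(data_quality_report(customers, key_column="customer_id"))
```

In practice these metrics would be computed on a schedule and fed into the governance tool's dashboards to drive the audits and continuous-improvement loop the role describes.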
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Presidio, Where Teamwork and Innovation Shape the Future: At Presidio, we’re at the forefront of a global technology revolution, transforming industries through cutting-edge digital solutions and next-generation AI. We empower businesses—and their customers—to achieve more through innovation, automation, and intelligent insights.

The Role: The Presidio Senior Engineer will be responsible for driving the development of reliable, scalable, and high-performance data systems. This role requires a strong foundation in cloud platforms, data engineering best practices, and data warehousing. The ideal candidate has hands-on experience in building robust ETL/ELT pipelines.

Responsibilities Include: Design, develop, and maintain scalable ETL/ELT data pipelines for batch and real-time data processing. Build and optimise cloud-native data platforms and data warehouses (e.g., Snowflake, Redshift, BigQuery). Design and implement data models, including normalised and dimensional models (star/snowflake schema). Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions. Ensure data quality, consistency, governance, and security across data platforms. Optimise and tune SQL queries and data workflows for performance and cost efficiency. Lead or mentor junior data engineers and contribute to team-level planning and design.

Must-Have Qualifications: Cloud Expertise: Strong experience with at least one cloud platform (AWS, Azure, or GCP). Programming: Proficiency in Python, SQL, and shell scripting. Data Warehousing & Modeling: Deep understanding of warehousing concepts and best practices. ETL/ELT Pipelines: Proven experience with building pipelines using orchestration tools like Airflow or dbt. Experience with CI/CD tools and version control (Git). Familiarity with distributed data processing and performance optimisation.

Good-to-Have Skills: Hands-on experience with UI-based ETL tools like Talend, Informatica, or Azure Data Factory. Exposure to visualisation and BI tools such as Power BI, Tableau, or Looker. Knowledge of data governance frameworks and metadata management tools (e.g., Collibra, Alation). Experience in leading data engineering teams or mentoring team members. Understanding of data security, access control, and compliance standards (e.g., GDPR, HIPAA).

Your future at Presidio: Joining Presidio means stepping into a culture of trailblazers—thinkers, builders, and collaborators—who push the boundaries of what’s possible. With our expertise in AI-driven analytics, cloud solutions, cybersecurity, and next-gen infrastructure, we enable businesses to stay ahead in an ever-evolving digital world. Here, your impact is real. Whether you're harnessing the power of Generative AI, architecting resilient digital ecosystems, or driving data-driven transformation, you’ll be part of a team that is shaping the future. Ready to innovate? Let’s redefine what’s next—together.

About Presidio: At Presidio, speed and quality meet technology and innovation. Presidio is a trusted ally for organizations across industries with a decades-long history of building traditional IT foundations and deep expertise in AI and automation, security, networking, digital transformation, and cloud computing. Presidio fills gaps, removes hurdles, optimizes costs, and reduces risk. Presidio’s expert technical team develops custom applications, provides managed services, enables actionable data insights and builds forward-thinking solutions that drive strategic outcomes for clients globally.
For more information, visit www.presidio.com. Presidio is committed to hiring the most qualified candidates to join our amazing culture. We aim to attract and hire top talent from all backgrounds, including underrepresented and marginalized communities. We encourage women, people of color, people with disabilities, and veterans to apply for open roles at Presidio. Diversity of skills and thought is a key component of our business success. Recruitment Agencies, Please Note: Presidio does not accept unsolicited agency resumes/CVs. Do not forward resumes/CVs to our careers email address, to Presidio employees, or by any other means. Presidio is not responsible for any fees related to unsolicited resumes/CVs.
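As a sketch of the orchestration work this role names (Airflow-based ETL/ELT), a minimal Apache Airflow DAG with a daily extract-transform-load chain; this assumes Airflow 2.x, and the DAG name and task bodies are placeholders rather than Presidio's actual pipelines:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source APIs / databases")

def transform():
    print("clean, validate, and model the extracted data")

def load():
    print("write to the warehouse (e.g., Snowflake, Redshift, BigQuery)")

with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # explicit task dependency chain
```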
Posted 1 week ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities: 8+ years of experience in data engineering, with a minimum of 5 years in a leadership role. Proficiency in ETL/ELT processes and experience with ETL tools (Talend, Informatica, etc.). Expertise in Snowflake or similar cloud-based data platforms (e.g., Redshift, BigQuery). Strong SQL skills and experience with database tuning, data modeling, and schema design. Familiarity with programming languages like Python or Java for data processing. Knowledge of data governance and compliance standards. Excellent communication and project management skills, with a proven ability to prioritize and manage multiple projects simultaneously.

Location: Gurgaon, 3-4 days work from office. Meals and transport free.

Qualifications: Bachelor's or Master's degree in IT or equivalent. Excellent verbal and written communication skills.
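For the Snowflake experience asked for above, a minimal sketch using the snowflake-connector-python package to run an analytics query; the credentials, warehouse, and table are placeholders:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials for illustration; use a secrets manager in practice.
conn = snowflake.connector.connect(
    user="ANALYTICS_USER",
    password="********",
    account="myorg-myaccount",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Hypothetical table; a typical aggregate a BI team would schedule.
    cur.execute(
        "SELECT order_date, SUM(amount) AS total "
        "FROM orders GROUP BY order_date ORDER BY order_date"
    )
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```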
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description: 👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS: Total experience of 5+ years. Hands-on working experience in data engineering. Strong working experience in SQL and Python or Scala. Deep understanding of cloud design patterns and their implementation. Experience working with Snowflake as a data warehouse solution. Experience with Power BI data integration. Design, develop, and maintain scalable data pipelines and ETL processes. Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms). Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.). Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar. Strong problem-solving skills and a passion for continuous improvement. Strong communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES: Writing and reviewing great quality code. Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements. Mapping decisions to requirements and translating them for developers. Identifying different solutions and narrowing down the best option that meets the client's requirements. Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken. Carrying out POCs to make sure that suggested design/technologies meet the requirements.

Qualifications: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
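To make the Star/Snowflake-schema requirement above concrete, a self-contained sketch of a tiny star schema (one fact table, two dimensions) using SQLite so it runs anywhere; the table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
    INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
    INSERT INTO dim_product VALUES (1, 'Widget');
    INSERT INTO fact_sales  VALUES (20240101, 1, 99.5);
""")

-- = None  # (no-op placeholder removed)
# A typical star-schema query: join the fact to its dimensions and aggregate.
query = """
    SELECT d.full_date, p.name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date    d ON d.date_key    = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.full_date, p.name
"""
for row in conn.execute(query):
    print(row)
```

The point of the shape: facts hold narrow numeric measures keyed to dimensions, so analytical queries reduce to joins plus aggregation, which warehouses like Snowflake optimise well.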
Posted 1 week ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us: At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team—comprising solution architects, data scientists, engineers, product managers, and designers—collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead—letting you focus on your core while we accelerate your growth.

Responsibilities & Qualifications

Data Architecture Design: Develop and implement scalable and efficient data architectures for batch and real-time data processing. Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.

ETL/ELT Pipelines: Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources. Ensure pipelines are highly performant, secure, and resilient enough to handle large volumes of structured and semi-structured data.

Data Quality and Governance: Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets. Implement data cataloging and lineage tracking for enterprise-wide data transparency.

Collaboration with Teams: Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting. Partner with software engineering teams to integrate data pipelines into applications and services.

Cloud Data Solutions: Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake. Optimize cloud infrastructure costs while maintaining high performance.

Data Automation and Workflow Orchestration: Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs. Develop monitoring systems to proactively detect and resolve pipeline failures.

Innovation and Leadership: Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency. Mentor junior engineers, fostering a culture of excellence and innovation.

Required Skills: Experience: 7+ years of overall experience in data engineering roles, with at least 2+ years in a leadership capacity. Proven expertise in designing and deploying large-scale data systems and pipelines. Technical Skills: Proficiency in Python, Java, or Scala for data engineering tasks. Strong SQL skills for querying and optimizing large datasets. Experience with data processing frameworks like Apache Spark, Beam, or Flink. Hands-on experience with ETL tools like Apache NiFi, dbt, or Talend. Experience in pub/sub and stream processing using Kafka, Kinesis, or similar. Cloud Platforms: Expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services. Data Modeling: Strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas). Collaboration: Proven ability to work with cross-functional teams and translate business requirements into technical solutions.

Preferred Skills: Familiarity with data visualization tools like Tableau or Power BI to support reporting teams. Knowledge of MLOps pipelines and collaboration with data scientists.
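As a sketch of the pub/sub streaming experience listed above, publishing records to an AWS Kinesis stream with boto3; the stream name, region, and event shape are hypothetical:

```python
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(event: dict, stream: str = "clickstream-events") -> None:
    """Send one event to a Kinesis stream; the partition key controls sharding."""
    kinesis.put_record(
        StreamName=stream,                   # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["user_id"]),  # keeps one user's events in order
    )

publish_event({"user_id": 42, "action": "page_view", "ts": "2024-01-01T00:00:00Z"})
```

Choosing the partition key is the main design decision here: keying by user preserves per-user ordering, at the cost of hot shards if a few users dominate traffic.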
Posted 1 week ago
10.0 - 19.0 years
8 - 9 Lacs
Thiruvananthapuram
On-site
10 - 19 Years. 10 Openings. Trivandrum.

Role Description: Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements, create optimal architecture, and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, debug, and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post-delivery. Number of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected: Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers. Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results. Configure: Define and govern the configuration management plan. Ensure compliance from the team. Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team. Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules. Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality. Estimate: Create and provide input for effort and size estimation, and plan resources for projects. Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release: Execute and monitor the release process. Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models. Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs. Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures. Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
Skills: Scala, Python, PySpark.

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
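Since the skill examples above single out PySpark and SQL windowing functions, a short sketch of both together, deduplicating to the latest record per key, a common wrangling step; the data and column names are invented:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-demo").getOrCreate()

df = spark.createDataFrame(
    [("c1", "2024-01-01", 100), ("c1", "2024-02-01", 120), ("c2", "2024-01-15", 80)],
    ["customer_id", "updated_at", "balance"],
)

# Rank each customer's rows newest-first, then keep only the top-ranked row.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))
      .filter("rn = 1")
      .drop("rn")
)
latest.show()
```

The same logic in warehouse SQL is `ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC)`, which is the windowing idiom the posting alludes to.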
Posted 1 week ago
0 years
0 Lacs
India
On-site
Overview: We are seeking a Business Intelligence Analyst to join our team. The ideal candidate will be responsible for analyzing complex data sets to provide insights and support strategic decision-making within the organization.

Responsibilities: Utilize SQL and Python to extract and manipulate data for analysis. Collaborate with stakeholders to gather business requirements and translate them into technical solutions. Work in an Agile environment to deliver BI solutions efficiently. Perform business analysis to identify trends, patterns, and opportunities. Design and optimize databases for efficient data storage and retrieval. Analyze linked data sources to create comprehensive reports. Monitor data quality and integrity, ensuring accurate reporting.

Skills: Proficiency in SQL for data extraction and manipulation. Experience with Python for data analysis and automation. Familiarity with Talend or similar ETL tools. Knowledge of predictive analytics tools. Understanding of Agile methodologies in BI projects. Strong business analysis skills to translate requirements into technical solutions. Ability to design effective database structures for BI applications. Skill in analyzing linked data sources for comprehensive insights. Attention to detail and the ability to monitor data trends closely.

Job Types: Full-time, Permanent. Pay: From ₹40,000.00 per month. Benefits: Health insurance, leave encashment, life insurance. Schedule: Day shift. Supplemental Pay: Yearly bonus. Application Question(s): What is your CTC and notice period? Education: Bachelor's (Required). Work Location: In person
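For the trend-analysis responsibility above, a minimal pandas sketch that smooths a daily metric with a rolling average so underlying trends stand out; the numbers are fabricated for illustration:

```python
import pandas as pd

daily = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=7, freq="D"),
    "orders": [120, 135, 128, 150, 160, 155, 170],
})

# A 3-day rolling mean dampens day-to-day noise in the raw series.
daily["orders_3d_avg"] = daily["orders"].rolling(window=3).mean()
print(daily)
```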
Posted 1 week ago
5.0 - 8.0 years
2 - 3 Lacs
Chennai
On-site
Job Description

The Role: The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities: Developing and supporting scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience: First Class Degree in Engineering/Technology/MCA. 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation and manipulation. Experience of modelling data for analytical consumers. Ability to automate and streamline the build, test and deployment of data pipelines. Experience in cloud native technologies and patterns. A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills.

Technical Skills (Must Have): ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica. Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing. Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design. Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures. Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala. DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management.

Technical Skills (Valuable): Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows. Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs. Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls. Containerization: Fair understanding of containerization platforms like Docker, Kubernetes. File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta. Others: Basics of job schedulers like Autosys; basics of entitlement management. Certification on any of the above topics would be an advantage.
Job Family Group: Technology. Job Family: Digital Software Engineering.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
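The file formats the posting mentions (Parquet in particular) are easy to demonstrate: a sketch writing and reading a columnar file with pandas, which uses pyarrow under the hood; the file path and data are illustrative:

```python
import pandas as pd  # requires pyarrow installed for Parquet support

df = pd.DataFrame({"trade_id": [1, 2, 3], "notional": [1e6, 2.5e6, 7.5e5]})
df.to_parquet("trades.parquet", index=False)  # columnar, compressed on disk

# Column pruning: read back only the columns a query actually needs.
notionals = pd.read_parquet("trades.parquet", columns=["notional"])
print(notionals.head())
```

Column pruning like this is the core reason columnar formats dominate analytical pipelines: a query touching one column reads a fraction of the bytes a row-oriented file would require.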
Posted 1 week ago
12.0 years
5 - 11 Lacs
Noida
On-site
Overall 12+ years of experience working on databases, data warehouses, data integration, and BI/reporting solutions, with relevant experience in the Life Sciences/Pharma domain. Education: BE/B.Tech/Master of Computer Application. Technical: Design and implement effective database solutions and data models to store and retrieve data. Hands-on experience in the design of reporting schemas, data marts, and development of reporting solutions. Prepare scalable database design and architecture in terms of defining multi-tenant schemas, data ingestion, data transformation, and data aggregation models. Should have expertise and working experience in at least 2 ETL tools among Informatica, SSIS, Talend, and Matillion. Should have expertise and working experience in at least 2 DBMS/appliances among Redshift, SQL Server, PostgreSQL, and Oracle. Should have strong data warehousing, reporting, and data integration fundamentals. Advanced expertise with SQL. Experience with AWS/Azure cloud data stores and their DB/DW-related service offerings. Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases. Should have technical expertise and working experience in at least 2 reporting tools among Power BI, Tableau, Jaspersoft, and QlikView/QlikSense.
Posted 1 week ago
8.0 years
0 Lacs
Andhra Pradesh, India
On-site
Data Migration Architect: More than 8 years of experience with data architecture, large-scale data modelling, database design, and business requirements analysis. Data migration expertise is a must-have skill. Work along with the Senior Data Lead/Architect to develop the migration framework/scripts. Responsible for overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, data warehouse, data provisioning, and ETL. Gather and analyse business data requirements and model these needs. Expert-level understanding of relational database concepts, dimensional database concepts, and database architecture and design, ontology, and taxonomy design. Experience with using CA Erwin to develop Enterprise Data Models. Set standards for data management, analyse the current state and conceive the desired future state, and conceive projects needed to close the gap between current state and future goals. Strong understanding of best practices in data modelling techniques, including in-depth knowledge of the various normalization and dimensional modelling approaches and their appropriate usage in various solutions. Provide guidance on technical specifications and data modelling, and review proposed data marts, data models, and other dimensional uses of data within the Data Warehouse. The Data Architect will oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. Experience and knowledge of the Talend Data Integration Platform. Analyse the account structure, contact, pricing, and other related objects to make sure the required data is moved from the source system(s) (Innovative or cForce) to iQuest. Map data attribute(s) and create mapping documents as required. Create/write ETL (Extract, Transform, Load) jobs to read data from the source and load data to the destination for data migration. Cleanse (de-dup, etc.) data and write transformation logic for data transformation. Develop an error-handling strategy to handle exceptions/missing values along with the data lead, and incorporate it into the scripts. Develop a rollback mechanism for rinse-and-repeat activities. Assist in QA and UAT activities. Liaise with the change management teams as required.
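As a sketch of the error-handling strategy called out above (route rows that fail transformation to an exceptions list rather than aborting the load), using SQLite stand-ins for the source and target systems; the schema and bad value are invented:

```python
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.executescript(
    "CREATE TABLE accounts (id INTEGER, price TEXT);"
    "INSERT INTO accounts VALUES (1, '10.50'), (2, 'N/A'), (3, '7.25');"
)
target.execute("CREATE TABLE accounts (id INTEGER, price REAL)")

rejected = []  # exception records quarantined for review and RCA
for row_id, price in source.execute("SELECT id, price FROM accounts"):
    try:
        # Transformation step: cast the text price to a numeric type.
        target.execute("INSERT INTO accounts VALUES (?, ?)", (row_id, float(price)))
    except ValueError:
        rejected.append((row_id, price))  # don't abort the load; quarantine instead

target.commit()
print("loaded:", target.execute("SELECT COUNT(*) FROM accounts").fetchone()[0])
print("rejected:", rejected)
```

Keeping the load transactional until `commit()` also gives a natural rollback point for the rinse-and-repeat runs the posting describes.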
Posted 1 week ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a Senior Data Migration Specialist with 7+ years of experience in a similar role. Roles and responsibilities: • Develop data migration strategies and plans. • Perform ETL operations and data transformation. • Work with cross-functional teams to ensure data consistency and integrity. • Identify and resolve data quality issues. • Document migration processes and best practices. Required Skills: • Expertise in SQL and ETL tools (Informatica, Talend, SSIS, etc.). • Experience in handling large-scale data migrations. • Familiarity with cloud data platforms (AWS, Azure, GCP).
Posted 1 week ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Role & responsibilities: Talend experience: 3-5 years of experience as a Talend and Informatica developer. • Snowflake experience: 1-3 years of experience working with the Snowflake data platform. • ETL knowledge: Proficient in ETL processes and data integration techniques. • SQL skills: Strong SQL skills for database querying and management. • Cloud data solutions: Experience in cloud data solutions and architecture. Candidates should be available for a face-to-face interview; this role is for the Hyderabad location only.
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Description: ETL Development Lead (10+ years)

Experience leading and mentoring a team of Talend ETL developers. Providing technical direction and guidance on ETL/data integration development to the team. Designing complex data integration solutions using Talend and AWS. Collaborating with stakeholders to define project scope, timelines, and deliverables. Contributing to project planning, risk assessment, and mitigation strategies. Ensuring adherence to project timelines and quality standards. Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies. Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components. Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files). Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality. Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes. Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions. Perform unit testing and participate in system integration testing of ETL processes. Monitor and maintain Talend environments, including job scheduling and performance tuning. Document technical specifications, data flow diagrams, and ETL processes. Stay up-to-date with the latest Talend features, best practices, and industry trends. Participate in code reviews and contribute to the establishment of development standards. Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components. Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT). Strong SQL skills for data querying and manipulation. Experience with data profiling, data quality checks, and error handling within ETL processes. Familiarity with job scheduling tools and monitoring frameworks. Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively within a team environment. Basic understanding of AWS services, e.g., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, and Amazon DynamoDB. Understanding of AWS data integration services, e.g., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, and Step Functions.

Preferred Qualifications: Experience leading and mentoring a team of 8+ Talend ETL developers. Experience working with US healthcare customers. Bachelor's degree in Computer Science, Information Technology, or a related field. Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate. Experience with AWS data and infrastructure services. A basic understanding of Terraform and GitLab is required. Experience with scripting languages such as Python or shell scripting. Experience with agile development methodologies. Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.
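For the AWS Glue integration referenced above, a minimal boto3 sketch that starts a Glue job and polls its run state, the kind of orchestration glue code that sits around Talend-built pipelines; the job name and region are hypothetical:

```python
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")

JOB_NAME = "nightly-claims-etl"  # hypothetical Glue job name

run = glue.start_job_run(JobName=JOB_NAME)
run_id = run["JobRunId"]

while True:
    status = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # poll every 30 seconds until the run reaches a terminal state
```

In production this polling loop would usually be replaced by Step Functions or an EventBridge rule on the job-state-change event, both of which the posting lists.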
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position: Data Engineer. Location: Pune. Experience: 6+ years.

Must Have: Tech-savvy engineer, willing and able to learn new skills and track industry trends. 6+ years of solid data engineering experience, especially in open-source, data-intensive, distributed environments, with experience in Big Data-related technologies like Spark, Hive, HBase, Scala, etc. Programming background, preferably Scala/Python. Experience in Scala, Spark, and PySpark; Java is good to have. Experience in migration of data to AWS or any other cloud. Experience in SQL and NoSQL databases. Optional: Model the data set from Teradata to the cloud. Experience in building ETL pipelines. Experience in building data pipelines in AWS (S3, EC2, EMR, Athena, Redshift) or any other cloud. Self-starter and resourceful personality with the ability to manage pressure situations. Exposure to Scrum and Agile development best practices. Experience working with geographically distributed teams.

Role & Responsibilities: Build data and ETL pipelines in AWS. Support migration of data to the cloud using Big Data technologies like Spark, Hive, Talend, and Python. Interact with customers on a daily basis to ensure smooth engagement. Responsible for timely and quality deliveries. Fulfill organizational responsibilities: sharing knowledge and experience with other groups in the organization, and conducting various technical sessions and trainings.

Education: Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow!
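For the Athena experience listed above, a sketch that submits a query and fetches results with boto3; the database, table, result bucket, and region are placeholders:

```python
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics_db"},               # placeholder
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
qid = resp["QueryExecutionId"]

# Athena is asynchronous: poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=qid)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=qid)
    for row in results["ResultSet"]["Rows"]:  # first row is the column header
        print([col.get("VarCharValue") for col in row["Data"]])
```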
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-Have Skills: Talend ETL. Good-to-Have Skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the design and implementation of data solutions. Optimize and troubleshoot ETL processes. Conduct data analysis and provide insights for decision-making.

Professional & Technical Skills: Must-Have Skills: Proficiency in Talend ETL. Strong understanding of data modeling and database design. Experience with data integration and data warehousing concepts. Hands-on experience with SQL and scripting languages. Knowledge of cloud platforms and big data technologies.

Additional Information: The candidate should have a minimum of 5 years of experience in Talend ETL. This position is based at our Hyderabad office. 15 years of full-time education is required.
Posted 1 week ago
12.0 - 17.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-Have Skills: Talend ETL. Good-to-Have Skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As a Data Engineer Lead, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. A typical day involves working on data solutions and ETL processes.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead data architecture design. Implement data integration solutions. Optimize ETL processes.

Professional & Technical Skills: Must-Have Skills: Proficiency in Talend ETL. Strong understanding of data modeling. Experience with SQL and database management. Knowledge of cloud platforms like AWS or Azure. Hands-on experience with data warehousing. Good-to-Have Skills: Experience with data visualization tools.

Additional Information: The candidate should have a minimum of 12 years of experience in Talend ETL. This position is based at our Hyderabad office. 15 years of full-time education is required.
Posted 1 week ago
7.0 - 12.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-Have Skills: Talend ETL. Good-to-Have Skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities: Expected to be an SME with deep knowledge and experience. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Create data pipelines to extract, transform, and load data across systems. Implement ETL processes to migrate and deploy data across systems. Ensure data quality and integrity throughout the data lifecycle.

Professional & Technical Skills: Required Skill: Expert proficiency in Talend Big Data. Strong understanding of data engineering principles and best practices. Experience with data integration and data warehousing concepts. Experience with data migration and deployment. Proficiency in SQL and database management. Knowledge of data modeling and optimization techniques.

Additional Information: The candidate should have a minimum of 5 years of experience in Talend Big Data. 15 years of full-time education is required.
Posted 1 week ago
6.0 - 10.0 years
12 - 15 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Skill/Operating Group: Technology Consulting. Level: Manager. Location: Gurgaon/Mumbai/Bangalore. Travel Percentage: Expected travel could be anywhere between 0-100%.

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build and implement strategies that can help enhance business performance. They develop specialized expertise - strategic, industry, functional, technical - in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities: Identifying, assessing, and solving complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors. Overseeing the production and implementation of solutions covering multiple cloud technologies and the associated infrastructure/application architecture, development, and operating models. Applying your solid understanding of data, data on cloud, and disruptive technologies. Implementing programs/interventions that prepare the organization for the implementation of new business processes. Assisting our clients to build the required capabilities for growth and innovation to sustain high performance. Managing multi-disciplinary teams to shape, sell, communicate, and implement programs. Experience in participating in client presentations and orals for proposal defense, etc. Experience in effectively communicating the target state, architecture, and topology on cloud to clients. Deep understanding of industry best practices in data governance and management. Providing thought leadership to downstream teams for developing offerings and assets.

Qualifications: Bachelor's degree. MBA degree from a Tier-1 college (preferable). 6-10 years of large-scale consulting experience and/or experience working with high-tech companies in data governance and data management. Certified in DAMA (Data Management).

Experience: We are looking for experienced professionals with information strategy, data governance, data quality, data management, and MDM experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes: Data SME - experience in deal shaping and strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills. MDM/DQ/DG Architect - data governance and management SME for areas including data quality, MDM, metadata, data lineage, and data catalog; experience in one or more technologies in this space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
Exceptional interpersonal and presentation skills - ability to convey technology and business value propositions to stakeholders. Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market. Other desired skills: Strong desire to work in technology-driven business transformation. Strong knowledge of technology trends across IT and digital, and how they can be applied to companies to address real-world problems and opportunities. Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences. Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains. Flexibility to accommodate client travel requirements. Published thought leadership: whitepapers, POVs.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models
Required Qualifications & Work Experience
First-class degree in Engineering/Technology/MCA
5 to 8 years' experience implementing data-intensive solutions using agile methodologies
Experience with relational databases and using SQL for data querying, transformation, and manipulation
Experience modelling data for analytical consumers
Ability to automate and streamline the build, test, and deployment of data pipelines
Experience with cloud-native technologies and patterns
A passion for learning new technologies and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
Technical Skills (Must Have)
ETL: Hands-on experience building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica (a brief Spark sketch appears below)
Big Data: Experience with 'big data' platforms such as Hadoop, Hive, or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
Languages: Proficiency in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of the underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta
Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.
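Purely as an illustration of the pipeline-building skills this posting asks for, a minimal PySpark batch job is sketched below; the paths, column names, and partitioning key are hypothetical, not taken from the posting.

```python
# Minimal PySpark batch pipeline sketch: extract CSV, apply simple
# cleansing, load as partitioned Parquet. Paths, columns, and the
# partition key are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("accounts_daily_load").getOrCreate()

# Extract: read a hypothetical daily landing-zone drop.
raw = spark.read.option("header", True).csv("s3a://landing/accounts/2024-06-01/")

# Transform: de-duplicate, enforce types, drop unusable rows.
clean = (
    raw.dropDuplicates(["account_id"])
       .withColumn("balance", F.col("balance").cast("decimal(18,2)"))
       .filter(F.col("account_id").isNotNull())
)

# Load: write a curated, partitioned Parquet dataset.
clean.write.mode("overwrite").partitionBy("branch_code").parquet("s3a://curated/accounts/")
spark.stop()
```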
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type:
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models
Required Qualifications & Work Experience
First-class degree in Engineering/Technology/MCA
3 to 4 years' experience implementing data-intensive solutions using agile methodologies
Experience with relational databases and using SQL for data querying, transformation, and manipulation
Experience modelling data for analytical consumers
Ability to automate and streamline the build, test, and deployment of data pipelines
Experience with cloud-native technologies and patterns
A passion for learning new technologies and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
Technical Skills (Must Have)
ETL: Hands-on experience building data pipelines; proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
Big Data: Exposure to 'big data' platforms such as Hadoop, Hive, or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
Languages: Proficiency in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of the underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta (a brief Parquet sketch appears below)
Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.
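As a small illustration of the columnar file formats listed among the valuable skills, here is a minimal PyArrow sketch; the schema and values are hypothetical.

```python
# Minimal sketch of writing and reading a Parquet file with PyArrow.
# The schema and sample values are hypothetical.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "trade_id": pa.array([101, 102, 103], type=pa.int64()),
    "notional": pa.array([1_000_000.0, 250_000.0, 75_000.0], type=pa.float64()),
    "currency": pa.array(["USD", "EUR", "GBP"]),
})

pq.write_table(table, "trades.parquet")  # columnar layout, compressed on disk

# Column pruning: read back only the columns a consumer needs.
print(pq.read_table("trades.parquet", columns=["trade_id", "currency"]))
```

The same columnar idea underlies table formats like Iceberg and Delta, which add transactional metadata on top of Parquet files.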
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type:
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Enphase Energy is a global energy technology company and leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, Enphase transformed the solar industry with our revolutionary microinverter technology, which turns sunlight into a safe, reliable, resilient, and scalable source of energy to power our lives. Today, the Enphase Energy System helps people make, use, save, and sell their own power. Enphase is also one of the fastest growing and most innovative clean energy companies in the world, with approximately 68 million products installed across more than 145 countries. We are building teams that are designing, developing, and manufacturing next-generation energy technologies, and our work environment is fast-paced, fun, and full of exciting new projects. If you are passionate about advancing a more sustainable future, this is the perfect time to join Enphase!
About the role:
The Enphase 'Analyst – Procurement' will be involved in the claims process, component capacity and inventory analysis, supplier risk assessments, and other procurement-related analytics. The role involves understanding existing processes in detail and implementing RPA models wherever applicable, performing market research on the latest processes and procedures available for the procurement function, and automating/digitizing those processes. This is a highly challenging role in which you will interact with many stakeholders to solve operational issues. You will be part of the Global Sourcing & Procurement team, reporting to a Lead Analyst.
What you will do:
Perform detailed analysis of component inventory against demand, on-hand, and open-order quantities: use advanced data analytics tools like Power BI or Tableau to visualize inventory data, and implement predictive analytics to forecast demand more accurately.
Automate the consolidation of input data from different contract manufacturers: use ETL (Extract, Transform, Load) tools like Alteryx or Talend to automate data consolidation, and implement APIs to pull data directly from manufacturers' systems (a minimal consolidation sketch appears after this posting).
Prepare and submit a monthly STD cost file to finance per the corporate calendar timelines: create a standardized template and automate data entry using Excel macros or Python scripts, and set up reminders and workflows in a project management tool to ensure timely submission.
Work as a program manager driving the component qualification process, working with cross-functional teams to complete qualification on time and achieve planned cost savings: use project management software like Jira to track progress and deadlines, and hold regular cross-functional team meetings to ensure alignment and address roadblocks.
Finalize the quarterly CBOM (Costed Bill of Materials) and quote files from all contract manufacturers, following the CBOM calendar timelines: implement a centralized database to store and manage CBOM data, and use version control to track changes and ensure accuracy.
Manage the claims management process with contract manufacturers and suppliers: develop a standardized claims validation process, effectively track and manage claims, and regularly review and update the claims process to improve efficiency.
Research new processes and best practices in procurement and assess how they can be leveraged in existing processes.
Perform and maintain detailed supplier risk assessments with the help of third-party vendors: regularly review and update risk assessment criteria based on changing market conditions.
Compile and perform supplier pricing trend analysis to support Commodity Managers in their QBRs: create dashboards in BI tools to visualize pricing trends and support decision-making.
Work closely with Commodity Managers to identify potential or NPI suppliers to be evaluated for risk assessment: maintain a database of potential suppliers and their risk assessment results.
Maintain and manage the item master pricing list, refreshing the data at regular intervals without errors: use data validation techniques and automated scripts to ensure data accuracy, and implement a regular review process to update and verify pricing data.
Who you are and what you bring:
A Bachelor's degree, preferably in Engineering, with a minimum of 5+ years of experience in supply chain analytics.
Very good analytical and problem-solving skills.
Hands-on experience with Excel-based automation using MS Power Query, Excel VBA, and Gen AI.
Open-mindedness and a willingness to take ownership.
Strong verbal communication and presentation skills.
Strong professional relationship management with internal and external interfaces.
Strong interpersonal skills with a proven ability to communicate effectively, both verbally and in writing, with internal customers and suppliers.
Ability to perform effectively and independently in a virtual environment.
Ability to manage job responsibilities effectively with minimal supervision.
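As a hedged illustration of the consolidation task described above, a minimal pandas sketch follows; the directory layout, file naming, and column names are assumptions, and writing the Excel output assumes openpyxl is installed.

```python
# Minimal sketch of consolidating inventory extracts from several
# contract manufacturers. The cm_extracts/ directory, one CSV per
# manufacturer, and the component/on_hand_qty columns are hypothetical.
from pathlib import Path
import pandas as pd

frames = []
for path in Path("cm_extracts").glob("*.csv"):   # one file per manufacturer
    df = pd.read_csv(path)
    df["manufacturer"] = path.stem               # tag each row with its source file
    frames.append(df)

inventory = pd.concat(frames, ignore_index=True)

# Roll up on-hand quantities by manufacturer and component for review.
summary = inventory.groupby(["manufacturer", "component"], as_index=False)["on_hand_qty"].sum()
summary.to_excel("consolidated_inventory.xlsx", index=False)  # requires openpyxl
```

A scheduler or RPA trigger could run such a script whenever new extracts arrive, replacing the manual copy-paste step.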
Posted 1 week ago
4.0 - 6.0 years
13 - 18 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.
Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems - the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
What You'll Do
Take primary ownership in driving both self and team efforts across all phases of the project lifecycle, ensuring alignment with business objectives.
Translate business requirements into technical specifications and lead team efforts to design, build, and manage technology solutions that effectively address business problems.
Develop and apply advanced statistical models and leverage analytic techniques to use data to guide decision-making for clients and internal teams.
Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely project completion (a minimal testing sketch appears after this posting).
Partner with project and program leads to deliver projects and assist in project management responsibilities, including project planning, people management, staffing, and risk mitigation.
Collaborate with team members globally, ensuring seamless communication, sharing responsibilities, and undertaking tasks effectively.
Manage a team with diverse skill sets (programmers, cloud analysts, BI developers, reporting, operations, etc.), mentoring and coaching junior members to enhance their skills and capabilities.
Lead task planning and distribution across team members, ensuring timely completion with high quality, and provide accurate status reports to senior management.
Design custom analyses in programming languages (e.g., R, Python), data visualization tools (e.g., Tableau), and other analytical platforms (e.g., SAS, Visual Basic, Excel) to address client needs.
Synthesize and communicate results to clients and internal teams through compelling oral and written presentations.
Create project deliverables and implement solutions, while exhibiting a continuous-improvement mindset and the capability to learn new technologies, business domains, and project management processes.
Guide and mentor Associates within teams, fostering a collaborative environment and enhancing team performance.
Demonstrate advanced problem-solving skills, ensuring the team continuously improves its capabilities and approaches to challenges.
Exhibit a proactive approach to decision-making, considering the broader picture, especially regarding technical nuances and strategic planning.
What You'll Bring
Education: Bachelor's or Master's degree in Computer Science, Engineering, MIS, or related fields, with strong academic performance, especially in analytic and quantitative coursework.
Consulting Industry Experience: 4-6 years of relevant consulting experience, ideally on medium-to-large-scale technology solution delivery projects.
Technical Skills: 1+ year of hands-on experience in data processing solutions and data modeling, with experience in ETL technologies (e.g., Hadoop, Spark, PySpark, Informatica, Talend, SSIS). Proficiency in programming languages like Python, SQL, Java, and Scala, and an understanding of data structures. Experience with cloud platforms such as AWS, Azure, or GCP, and exposure to distributed computing. Deep expertise in SQL and data management best practices, with a focus on data analytics and visualization.
Consulting/Project Leadership: Proven experience leading project teams and managing end-to-end delivery, mentoring team members, and maintaining high standards. Ability to translate complex data and analytics concepts into accessible presentations and frameworks for both technical and non-technical stakeholders. Deep understanding of data management best practices and data analytics methodologies, ensuring high-quality data insights. Effectiveness in a global team environment, with a readiness to travel as needed.
Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member.
We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.
ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Find Out More At www.zs.com
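As an illustration of the embedded QA and unit-testing practices referenced in the responsibilities, here is a minimal sketch pairing a small pandas transformation with a test; the rule, column name, and threshold are hypothetical.

```python
# Minimal embedded-QA sketch: a small data transformation paired with a
# unit test. The high-value rule and its threshold are hypothetical.
import pandas as pd

def flag_high_value(df: pd.DataFrame, threshold: float = 10_000.0) -> pd.DataFrame:
    """Add a boolean column marking orders above a value threshold."""
    out = df.copy()
    out["high_value"] = out["order_value"] > threshold
    return out

def test_flag_high_value():
    df = pd.DataFrame({"order_value": [500.0, 25_000.0]})
    result = flag_high_value(df)
    assert result["high_value"].tolist() == [False, True]

test_flag_high_value()  # a runner such as pytest would collect this automatically
print("test passed")
```

Keeping such tests alongside each transformation is what lets QA run "embedded" in the development cycle rather than as a separate phase.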
Posted 1 week ago
2.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.
Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems - the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
What you'll do
Build complex solutions for clients using programming languages, ETL service platforms, cloud, etc.
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
Collaborate with other team members to leverage expertise and ensure seamless transitions.
Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
Bring transparency in driving assigned tasks to completion and report accurate status.
Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
Assist senior team members and delivery leads in project management responsibilities.
What you'll bring
Bachelor's degree with a specialization in Computer Science, IT, or other computer-related disciplines, with a record of academic success.
Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements.
Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.
Experience in data warehousing and SQL (a minimal star-schema sketch appears below).
Exposure to cloud platforms will be a plus - AWS, Azure, GCP.
Additional Skills
Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
Proven ability to work creatively and analytically in a problem-solving environment.
Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.
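As a small illustration of the data warehousing and SQL skills this posting asks for, here is a minimal star-schema sketch using Python's built-in sqlite3 module; the tables, columns, and figures are hypothetical.

```python
# Minimal data-warehousing sketch: a tiny star schema (one dimension,
# one fact table) loaded and queried with Python's built-in sqlite3.
# Table names, columns, and values are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER REFERENCES dim_product, qty INTEGER, amount REAL);
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 99.0), (2, 4, 240.0), (1, 3, 29.7)])

# Typical analytical query: join the fact to its dimension and aggregate.
for row in con.execute("""
        SELECT d.category, SUM(f.amount) AS revenue
        FROM fact_sales f JOIN dim_product d USING (product_key)
        GROUP BY d.category"""):
    print(row)  # ('Hardware', 368.7)
```

The same fact/dimension pattern scales from this toy example up to the warehouse platforms named in these postings.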
Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member.
We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.
ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Find Out More At www.zs.com
Posted 1 week ago
2.0 - 7.0 years
10 - 14 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.
Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems - the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
What you'll do
Build complex solutions for clients using programming languages, ETL service platforms, cloud, etc.
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
Collaborate with other team members to leverage expertise and ensure seamless transitions.
Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
Bring transparency in driving assigned tasks to completion and report accurate status.
Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
Assist senior team members and delivery leads in project management responsibilities.
What you'll bring
Bachelor's degree with a specialization in Computer Science, IT, or other computer-related disciplines, with a record of academic success.
Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements.
Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.
Experience in data warehousing and SQL.
Exposure to cloud platforms will be a plus - AWS, Azure, GCP (a minimal S3 landing sketch appears below).
Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
Proven ability to work creatively and analytically in a problem-solving environment.
Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.
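As a hedged illustration of the cloud exposure mentioned above, here is a minimal sketch of landing a local extract in Amazon S3 with boto3, a common first step in a cloud ETL flow; the bucket, key, and file name are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch of landing a local extract file in Amazon S3 with boto3,
# as one early step of a cloud ETL flow. The bucket name, key prefix, and
# file name are hypothetical; credentials come from the environment
# (e.g., an IAM role or ~/.aws/credentials).
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="daily_orders.csv",               # local extract produced upstream
    Bucket="example-landing-zone",             # hypothetical bucket
    Key="raw/orders/2024-06-01/daily_orders.csv",
)
print("uploaded to s3://example-landing-zone/raw/orders/2024-06-01/daily_orders.csv")
```

Downstream ETL jobs (Glue, Spark, or a warehouse COPY command) would then pick the file up from the date-partitioned key prefix.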
Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member.
We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.
ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Find Out More At www.zs.com
Posted 1 week ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
Cities with a high concentration of IT companies and organizations frequently hire for Talend roles.
The average salary range for Talend professionals in India varies by experience level:
Entry-level: INR 4-6 lakhs per annum
Mid-level: INR 8-12 lakhs per annum
Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
Junior Developer
Developer
Senior Developer
Tech Lead
Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
Data Warehousing
ETL (Extract, Transform, Load) processes
SQL
Big Data technologies (e.g., Hadoop, Spark)
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
Accenture: 36723 Jobs | Dublin
Wipro: 11788 Jobs | Bengaluru
EY: 8277 Jobs | London
IBM: 6362 Jobs | Armonk
Amazon: 6322 Jobs | Seattle, WA
Oracle: 5543 Jobs | Redwood City
Capgemini: 5131 Jobs | Paris, France
Uplers: 4724 Jobs | Ahmedabad
Infosys: 4329 Jobs | Bangalore, Karnataka
Accenture in India: 4290 Jobs | Dublin 2