
1515 Talend Jobs - Page 15

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.6 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization, delivering high-quality services to clients. Those in managed service management and strategy at PwC focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services processes, tools and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Specialist
Tower: Data, Analytics & Specialist Managed Service
Experience: 0.6-2 years
Key Skills: Azure
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: India

Job Description
As a Specialist, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
- Be flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; handle status reporting for the project.
- Adhere to SLAs; bring experience in incident management, change management and problem management.
- Seek and embrace opportunities which give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player; take up cross-competency work and contribute to COE activities.
- Handle escalation and risk management.
Position Requirements
Required Skills: Azure Cloud Engineer

The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- Minimum 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- Minimum 3-5 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumers such as Business Intelligence systems, analytics modeling, and data scientists.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python, etc.
- Hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Work together with data scientists and analysts to understand data needs and create effective data workflows.
- Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
- Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies (see the sketch below).
- Perform data transformation and processing tasks to prepare data for analysis and reporting in Azure Databricks or Azure Synapse Analytics, using tools like Apache Spark for large-scale transformations.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc.
- Strong communication, problem-solving, quantitative and analytical abilities.

Nice to have: Azure certification

Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.
Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide Data, Analytics & Insights services, focused on the evolution of our clients' data and analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, from not only a technical perspective but also a relationship perspective.
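
To make the pipeline responsibilities above concrete, here is a minimal PySpark sketch of an ingest-validate-write step of the kind the listing describes (the engine Azure Data Factory would typically orchestrate in Databricks or Synapse). It is illustrative only: the storage paths, column names, and cleansing rules are hypothetical, not an actual PwC pipeline.

```python
# A minimal sketch of an ingest -> cleanse -> curate step; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

# Ingest: read raw CSV landed in the data lake (placeholder ABFS path).
raw = spark.read.option("header", True).csv(
    "abfss://raw@datalake.dfs.core.windows.net/orders/"
)

# Cleanse/validate: drop rows missing the key, normalise types, deduplicate.
curated = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
)

# Load: write curated output in Parquet, partitioned for downstream BI reads.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/orders/"
)
```

In a real engagement the validation rules would come from the data-quality requirements named above, and the job would be parameterised and scheduled by the orchestrator rather than run ad hoc.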

Posted 2 weeks ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineering Subject Matter Expert (SME)
Location: Dubai, UAE (Hybrid/Onsite)
Experience: 10+ years in Data Engineering and ETL with proven leadership and solution delivery experience

Job Summary
We are seeking a seasoned Data Engineering SME with strong experience in data platforms, ETL tools, and cloud technologies. The ideal candidate will lead the design and implementation of enterprise-scale data solutions, provide strategic guidance on data architecture, and play a key role in data migration, data quality, and performance tuning initiatives. This role demands a mix of deep technical expertise, project management, and stakeholder communication.

Key Responsibilities
- Lead the design, development, and deployment of robust, scalable ETL pipelines and data solutions.
- Provide technical leadership and SME support for data engineering teams across multiple projects.
- Collaborate with cross-functional teams including Data Analysts, BI Developers, Product Owners, and IT to gather requirements and deliver data products.
- Design and optimize data workflows using tools such as IBM DataStage, Talend, Informatica, and Databricks.
- Implement data integration solutions for structured and unstructured data across on-premise and cloud platforms.
- Conduct performance tuning and optimization of ETL jobs and SQL queries (see the sketch below).
- Oversee data quality checks, data governance compliance, and PII data protection strategies.
- Support and mentor team members on data engineering best practices and agile methodologies.
- Analyze and resolve production issues in a timely manner.
- Contribute to enterprise-wide data transformation strategies, including legacy-to-digital migration using Spark, Hadoop, and cloud platforms.
- Manage stakeholder communications and provide regular status reports.

Required Skills And Qualifications
- Bachelor's degree in Engineering, Computer Science, or a related field (MTech in Data Science is a plus).
- 10+ years of hands-on experience in ETL development and data engineering.
- Strong proficiency with tools: IBM DataStage, Talend, Informatica, Databricks, Power BI, Tableau.
- Strong SQL, PL/I, Python, and Unix shell scripting skills.
- Experience with cloud platforms like AWS and modern big data tools like Hadoop and Spark.
- Solid understanding of data warehousing, data modeling, and data migration practices.
- Experience working in Agile/Scrum environments.
- Excellent problem-solving, communication, and team collaboration skills.
- Scrum Master or Product Owner certifications (CSM, CSPO) are a plus.

(ref:hirist.tech)
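
As one concrete illustration of the ETL performance-tuning responsibility above, the sketch below shows a common Spark optimization: broadcasting a small dimension table to avoid a shuffle-heavy join, then repartitioning on the write key. Table paths and columns are hypothetical, not from any specific engagement.

```python
# Hedged tuning sketch: broadcast join + repartition before write. Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl_tuning_demo").getOrCreate()

facts = spark.read.parquet("/data/warehouse/fact_sales")    # large table
dims = spark.read.parquet("/data/warehouse/dim_product")    # small lookup table

# Broadcasting ships the small dimension to every executor instead of
# shuffling the large fact table across the cluster.
joined = facts.join(broadcast(dims), on="product_id", how="left")

# Repartition on the partition key so output files are evenly sized.
(joined.repartition("sale_date")
       .write.mode("overwrite")
       .partitionBy("sale_date")
       .parquet("/data/warehouse/sales_enriched"))
```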

Posted 2 weeks ago


2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Enterprise Snowflake L1/L2 AMS Support engineer, your primary responsibilities will include monitoring and supporting Snowflake data warehouse performance, optimizing queries, and overseeing job execution. You will be tasked with troubleshooting data-loading failures, managing access control, and addressing role-based security issues. Additionally, you will be expected to carry out patching, software upgrades, and security compliance checks while upholding SLA commitments for query execution and system performance.

To excel in this role, you should possess 2-5 years of experience working with Snowflake architecture, SQL scripting, and query optimization. Familiarity with ETL tools such as Talend, Matillion, and Alteryx for seamless Snowflake integration is beneficial.
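
A minimal sketch of the kind of L1/L2 monitoring check this role describes: pulling the slowest recent queries from Snowflake's ACCOUNT_USAGE views so they can be triaged. It assumes the snowflake-connector-python package; the account, user, and warehouse names are placeholders.

```python
# Hedged monitoring sketch using snowflake-connector-python; credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="monitoring_user",    # placeholder
    password="***",            # placeholder; prefer key-pair auth in practice
    warehouse="OPS_WH",
)

LONG_RUNNING_SQL = """
    SELECT query_id, user_name, total_elapsed_time / 1000 AS seconds, query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
"""

with conn.cursor() as cur:
    # total_elapsed_time is reported in milliseconds; convert to seconds above.
    for query_id, user_name, seconds, query_text in cur.execute(LONG_RUNNING_SQL):
        print(f"{query_id} | {user_name} | {seconds:.1f}s | {query_text[:80]}")
conn.close()
```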

Posted 2 weeks ago


8.0 - 12.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
- Optimize Snowflake performance through clustering, partitioning, and caching strategies (see the sketch below).
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques including dimensional modeling, star/snowflake schema, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL, Python, and scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficiency in data modeling (dimensional, relational, star/snowflake schema).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiarity with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
- Snowflake SnowPro Core Certification: required or highly preferred
- SnowPro Advanced Architect Certification: preferred
- Cloud certifications (e.g., AWS Certified Data Analytics - Specialty, Azure Data Engineer Associate): preferred
- ETL tool certifications (e.g., Talend, Matillion): optional but a plus

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies
- Collaborate with industry leaders in analytics and digital transformation
- Be part of a data-first organization focused on innovation and impact
- Enjoy a flexible, inclusive, and collaborative work culture
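
A hedged sketch of the clustering work mentioned in the responsibilities: defining a clustering key on a large table and inspecting how well-clustered it is with Snowflake's built-in SYSTEM$CLUSTERING_INFORMATION function. The table and column names are hypothetical; it requires snowflake-connector-python and valid credentials.

```python
# Illustrative Snowflake clustering check; all connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="de_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="MART",
)
cur = conn.cursor()

# Cluster the fact table on the columns most queries filter by.
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date, region)")

# SYSTEM$CLUSTERING_INFORMATION returns depth/overlap statistics as JSON,
# which indicates whether the clustering key is actually improving pruning.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date, region)')"
)
print(cur.fetchone()[0])

cur.close()
conn.close()
```

The design trade-off: clustering keys add background reclustering cost, so they pay off only on large tables with selective, repeated filter predicates.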

Posted 2 weeks ago


5.0 - 7.0 years

0 Lacs

Thiruvananthapuram

On-site

5 - 7 Years | 2 Openings | Trivandrum

Role description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability.

Key Responsibilities:
- Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments.
- Develop and optimize data workflows using PySpark, SQL, and Airflow.
- Work closely with AI/ML teams to support training pipelines and GenAI solution deployments.
- Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines (see the sketch below).
- Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions.
- Support data quality, validation, and profiling processes.

Key Skills & Technology Areas:
- Programming & Data Processing: Python (4-6 years), PySpark, Pandas, NumPy
- Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks
- Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional)
- Databases: SQL/NoSQL, Postgres, DynamoDB, vector databases (ChromaDB, Pinecone) preferred
- ML/GenAI Exposure (basic): hands-on with Pandas and scikit-learn; knowledge of RAG pipelines and GenAI concepts
- Data Modeling: star/snowflake schema, data normalization, dimensional modeling
- Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment

Other Requirements:
- Strong problem-solving and analytical skills
- Flexibility to work on fast-paced and cross-functional priorities
- Experience collaborating with AI/ML or GenAI teams is a plus
- Good communication and a collaborative, team-first mindset
- Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus

Skills: ETL, Big Data, PySpark, SQL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
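
A small sketch, assuming the chromadb package, of the vector-database integration step named in the responsibilities: loading documents into a collection and running a similarity query, the building block of a RAG pipeline. All document text, names, and ids are illustrative.

```python
# Hedged RAG-ingestion sketch with ChromaDB; collection and documents are hypothetical.
import chromadb

client = chromadb.Client()  # in-memory; use chromadb.PersistentClient(path=...) for durability
collection = client.create_collection(name="support_docs")

# Ingest: documents are embedded by the collection's default embedding
# function; ids must be unique strings.
collection.add(
    documents=[
        "How to reset a user password in the admin console.",
        "Steps to rotate API keys for the billing service.",
    ],
    ids=["doc-1", "doc-2"],
)

# Retrieve: fetch the most relevant document for a user question, which a
# downstream LLM prompt would then consume as context.
results = collection.query(query_texts=["how do I change my password?"], n_results=1)
print(results["documents"][0])
```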

Posted 2 weeks ago


5.0 years

2 - 4 Lacs

Hyderābād

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Title: Lead Business Intelligence Engineer

Position Summary
The Business Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL and data analysis on the bespoke Snowflake data warehouse for Business Intelligence (a pipeline-load sketch follows this listing).

Specific Responsibilities
This role will be in the Claw Team within Enterprise Data & Corporate Technology (EDCT). The Business Intelligence team maintains the firm's business intelligence tools and data warehouse. As part of the Claw team, the candidate is responsible for:
- Working on and leading engineering and development focused projects from start to finish with minimal supervision
- Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw
- Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives
- Administrative functions for our tools, such as keeping the tool documentation current and handling service requests
- 24x7 on-call L3 support on a rotational schedule with other team members
- Participating in user training to increase awareness of Claw
- Ensuring incident, problem and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues
- Following DTCC's ITIL process for incident, change and problem resolution

Qualifications/Knowledge/Skills
- Self-starter, continually striving to improve the team's service offerings and one's own skillset
- Problem-solving and innovative mindset to meet a wide variety of challenges
- Willingness and ability to learn all aspects of our operating model as well as new tools
- Ability to meet deadlines, goals and objectives
- Moderate to advanced competency in Windows and Unix-like operating system principles (power-user functions)
- Developed competencies around essential project management, communication (oral, written) and personal effectiveness
- Working experience covering Microsoft Office tools such as Outlook, Excel, PowerPoint, Visio and Project
- Good SQL skills and good knowledge of relational databases, specifically Snowflake
- Ability to manage agile development cycles within the DTCC SDLC (SDP) methodology
- Ability to optimize/tune source streams, queries, and PowerBI dashboards
- Good knowledge of the technical components of Claw (i.e. Snowflake, Talend, PowerBI, PowerShell, Autosys)
- Aligns risk and control processes with day-to-day responsibilities to monitor and mitigate risk; escalates appropriately

Leadership Competencies for this level include:
- Accountability: Demonstrates reliability by taking necessary actions to continuously meet required deadlines and goals.
- Global Collaboration: Applies a global perspective when working within a team by being aware of one's own style and ensuring all relevant parties are involved in key team tasks and decisions.
- Communication: Articulates information clearly and presents information effectively and confidently when working with others.
- Influencing: Convinces others by making a strong case, bringing others along to their viewpoint; maintains strong, trusting relationships while remaining comfortable challenging ideas.
- Innovation and Creativity: Thinks boldly and out of the box, generates new ideas and processes, and confidently pursues challenges as new avenues of opportunity.

Qualifications
- Minimum of 5 years of related experience
- Bachelor's degree preferred, or equivalent experience

Specific Skills and Technologies
- Minimum of 5 years of related data warehousing work experience
- 5+ years managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep and retirement
- 5+ years managing distributed teams with an employee/vendor mix
- 5+ years managing offshore vendors
- Strong understanding of star/snowflake schemas and data integration methods and tools

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
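
The listing's pipelines are built in Talend, which is a graphical tool; as a rough stand-in, the sketch below shows the same kind of flat-file-to-Snowflake load expressed in Python with the connector's write_pandas helper. File name, table, and credentials are placeholders, not the actual Claw tooling.

```python
# Minimal illustrative load of a CSV extract into Snowflake; all names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

conn = snowflake.connector.connect(
    account="my_account", user="bi_user", password="***",  # placeholders
    warehouse="BI_WH", database="CLAW", schema="STAGING",
)

df = pd.read_csv("daily_positions.csv")           # hypothetical source extract
df.columns = [c.upper() for c in df.columns]      # Snowflake folds unquoted names to upper case

# write_pandas bulk-loads the DataFrame through an internal stage.
success, nchunks, nrows, _ = write_pandas(
    conn, df, table_name="DAILY_POSITIONS", auto_create_table=True
)
print(f"loaded={success} rows={nrows}")
conn.close()
```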

Posted 2 weeks ago


0 years

8 - 10 Lacs

Chennai

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems, the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Business Technology
ZS's Technology group focuses on scalable strategies, assets and accelerators that deliver enterprise-wide transformation to our clients via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, Digital and Technology advisory, Product and Platform development, and Data, Analytics and AI implementation.

What you'll do:
- Work with business stakeholders to understand their business needs.
- Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
- Clean, filter, and validate data to ensure it meets quality and format standards.
- Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
- Monitor, control, configure, and maintain processes in the cloud data platform (see the Athena sketch below).
- Optimize data pipelines and data storage for performance and efficiency.
- Participate in code reviews and provide meaningful feedback to other team members.
- Provide technical support and troubleshoot issues.

What you'll bring:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- Experience working in the AWS cloud platform.
- Data engineering expertise in developing big data and data warehouse platforms.
- Experience working with structured and semi-structured data.
- Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques.
- Experience working directly with technical and business teams.
- Ability to create technical documentation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- AWS (big data services): S3, Glue, Athena, EMR
- Programming: Python, Spark, SQL, Mulesoft, Talend, dbt
- Data warehouse: ETL, Redshift/Snowflake

Additional Skills:
- Experience in data modeling.
- AWS Data Engineer certification.
- Experience with ITSM processes/tools such as ServiceNow and Jira.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member.

We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel:
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.

NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com
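
A hedged sketch, assuming boto3 and placeholder resource names, of the AWS-side work listed above: running an Athena query over data catalogued by Glue and polling for completion. The database, table, and results bucket are hypothetical.

```python
# Illustrative Athena query runner; region, database, table, and bucket are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) FROM sales_raw GROUP BY channel",  # hypothetical table
    QueryExecutionContext={"Database": "analytics_db"},                      # hypothetical Glue database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},       # placeholder bucket
)
qid = resp["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```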

Posted 2 weeks ago


8.0 years

0 Lacs

India

On-site

About Kinaxis
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it's really our people who give us passion to always seek ways to do things better. As such, we're serious about your career growth and professional development, because People matter at Kinaxis.

In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown to become a global organization with over 2000 employees around the world, and support 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers.

Our journey in India began in 2020 and we have been growing steadily since then! Building a high-trust and high-performance culture is important to us and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration.

About the team
Location: Chennai, India
The Senior Technology Consultant team will be responsible for understanding Kinaxis customers' most pressing business performance challenges and will be committed to helping our customers solve complex issues in their supply chain management practice. The incumbent will work with new and existing customers and provide expert guidance in integrating Kinaxis' Maestro solution with existing client enterprise systems so that our customers can start to experience immediate value from the product.

About the role: What you will do
- Perform integration configuration - mapping, loading, transforming and validating data required to support our customer's unique system landscape on moderate to complex projects.
- Design customized technology solutions to address specific business challenges or opportunities, considering the customer's technological ecosystem and based on the integration approach (Kinaxis-led vs. customer-led).
- Assist with the implementation and deployment of technology solutions, including project management, system integration, configuration, testing, and training.
- Demonstrate deep proficiency in the Kinaxis Integration Platform Suite, the Maestro data model, and REST-based API integration capabilities, and support the client in identifying and implementing solutions best suited to individual data flows (a REST extraction sketch follows this listing).
- Collaborate with Kinaxis Support and/or Cloud Services teams to address client queries around security risks or security incidents.
- Participate in deep-dive customer business requirements discovery sessions and develop integration requirements specifications.
- Drive data management and integration related activities, including validation and testing of the solutions.
- Support deployment workshops to help customers achieve immediate value from their investment.
- Act as the point person for Kinaxis-led integrations, and coach and guide more junior and/or offshore consultants through the tactical deliverables for data integration requirements, ensuring a smooth delivery of the end solution.
- Liaise directly with customers and internal SMEs such as the Technology Architect through the project lifecycle.

Skills and qualifications we need
- Strong integration knowledge, especially in extracting and transforming data from enterprise-class ERP systems like SAP, Oracle, etc.
- Experience with ERP solutions such as SAP, Oracle, Infor, MS Dynamics etc.
- Hands-on experience and expertise with ETL tools such as Talend, Informatica, SAP CPI / SAP BTP, OIC, MuleSoft, Apache Hop etc.
- Technical skills such as SQL, Java, JavaScript, Python, etc.
- Strong understanding of data modelling.
- Knowledge of cloud service providers like GCP, Azure, AWS and their offerings is an advantage.
- Experience with configuration of data integration from/to SAP through BAPI/RFC, ABAP programs, CDS Views, or OData is an advantage.

What we are looking for
- Bachelor's degree in Computer Science, Information Technology, AI/ML or a related field.
- 8-12 years of relevant experience in business software consulting, ideally in supply chain.
- Minimum 6 years of experience in data integration across complex enterprise systems.
- Passion for working in customer-facing roles and ability to demonstrate strong interpersonal, communication, and presentation skills.
- Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment.
- Excellent communication, presentation, facilitation, time management, and customer relationship skills.
- Excellent problem solving and critical thinking skills.
- Ability to work virtually and plan for up to 50% travel.

#Senior #LI-KJ

Why join Kinaxis?
Work With Impact: Our platform directly helps companies power the world's supply chains. We see the results of what we do out in the world every day - when we see store shelves stocked, when medications are available for our loved ones, and so much more.
Work with Fortune 500 Brands: Companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed-Martin, and more.
Social Responsibility at Kinaxis: Our Diversity, Equity, and Inclusion Committee weighs in on hiring practices, talent assessment training materials, and mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do and we're committed to a net-zero operations strategy for the long term. We are involved in our communities and support causes where we can make the most impact.

People matter at Kinaxis, and these are some of the perks and benefits we created for our team:
- Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
- Flexible work options
- Physical and mental well-being programs
- Regularly scheduled virtual fitness classes
- Mentorship programs and training and career development
- Recognition programs and referral rewards
- Hackathons

For more information, visit the Kinaxis web site at www.kinaxis.com or the company's blog at http://blog.kinaxis.com.

Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses.

Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
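
A hedged sketch of the REST-based integration pattern this role references: paging through a source system's REST API and staging records as CSV for a downstream load. The endpoint, auth, parameters, and fields are all hypothetical; this is not a Kinaxis or Maestro API.

```python
# Illustrative REST extraction with offset pagination; endpoint and fields are placeholders.
import csv
import requests

BASE_URL = "https://erp.example.com/api/v1/materials"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}           # placeholder auth

def fetch_all(page_size: int = 500):
    """Yield every record, following simple offset pagination."""
    offset = 0
    while True:
        resp = requests.get(
            BASE_URL, headers=HEADERS,
            params={"limit": page_size, "offset": offset}, timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return
        yield from batch
        offset += page_size

# Stage the extract as CSV for the integration platform to pick up.
with open("materials_stage.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["material_id", "plant", "lead_time_days"])
    writer.writeheader()
    for record in fetch_all():
        writer.writerow({k: record.get(k) for k in writer.fieldnames})
```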

Posted 2 weeks ago


5.0 years

0 Lacs

India

On-site

About Kinaxis
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it's really our people who give us passion to always seek ways to do things better. As such, we're serious about your career growth and professional development, because People matter at Kinaxis.

In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown to become a global organization with over 2000 employees around the world, and support 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers.

Our journey in India began in 2020 and we have been growing steadily since then! Building a high-trust and high-performance culture is important to us and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration.

About the team
Location: Chennai, India
The Technology Consultant team will be responsible for data integration activities throughout the deployment of Kinaxis solutions and will be exposed to more complex integration mandates. Through their understanding of Kinaxis customers' most pressing business performance and supply chain challenges, they will offer guidance in integrating Kinaxis' Maestro solution with existing client enterprise systems so that our customers can start to experience immediate value from the product.

About the role: What you will do
- Perform integration configuration - mapping, loading, transforming, testing and validating data required to support our customer's unique system landscape on moderate to complex projects.
- Participate in the design of customized technology solutions to address specific business challenges or opportunities, considering the customer's technological ecosystem and based on the integration approach (Kinaxis-led vs. customer-led).
- Assist with the implementation and deployment of technology solutions, including project management, system integration, configuration, testing, and training.
- Demonstrate knowledge and proficiency in the Kinaxis Integration Platform Suite, the RR data model, and REST-based API integration capabilities, and support the client in identifying and implementing solutions best suited to individual data flows.
- Collaborate with Kinaxis Support and/or Cloud Services teams to address client queries around security risks or security incidents.
- Participate in deep-dive customer business requirements discovery sessions and develop integration requirements specifications.
- Support deployment workshops to help customers achieve immediate value from their investment.
- Liaise directly with customers and internal SMEs such as the Technology Architect through the project lifecycle.

Technologies we use
- Technical skills such as SQL, R, JavaScript, Python, etc.
- Proven experience with manufacturing planning solutions such as Kinaxis, SAP, Blue Yonder, etc. (strong preference for Kinaxis experience).
- Proven experience and expertise with ETL tools such as Talend, OWB, SSIS, SAP Data Services etc.
- Database-level experience extracting data from enterprise-class ERP systems including SAP/APO, Oracle, and JDE (an extraction sketch follows this listing).
- Experience with connection functionality to SAP (through BAPI / RFC), databases, files, web, and SOAP.
- Experience working with relational databases and JavaScript.

What we are looking for
- Bachelor's degree in Industrial Engineering, Supply Chain, Operations Research, Computer Science, Computer Engineering, Statistics, Information Technology or a related field.
- 5-7 years of relevant experience in business software consulting, ideally in supply chain or in data integration across enterprise-level systems.
- Passion for working in customer-facing roles and ability to demonstrate strong interpersonal, communication, and presentation skills.
- Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment.
- Excellent communication, presentation, facilitation, time management, and customer relationship skills.
- Excellent problem solving and critical thinking skills.
- Ability to work virtually and plan for up to 70% travel.

#Intermediate #Full-time #LI-KJ

Why join Kinaxis?
Work With Impact: Our platform directly helps companies power the world's supply chains. We see the results of what we do out in the world every day - when we see store shelves stocked, when medications are available for our loved ones, and so much more.
Work with Fortune 500 Brands: Companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed-Martin, and more.
Social Responsibility at Kinaxis: Our Diversity, Equity, and Inclusion Committee weighs in on hiring practices, talent assessment training materials, and mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do and we're committed to a net-zero operations strategy for the long term. We are involved in our communities and support causes where we can make the most impact.

People matter at Kinaxis, and these are some of the perks and benefits we created for our team:
- Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
- Flexible work options
- Physical and mental well-being programs
- Regularly scheduled virtual fitness classes
- Mentorship programs and training and career development
- Recognition programs and referral rewards
- Hackathons

For more information, visit the Kinaxis web site at www.kinaxis.com or the company's blog at http://blog.kinaxis.com.

Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses.

Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
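
A minimal illustrative sketch of the database-level extraction-and-transform step named above, written with SQLAlchemy and pandas rather than a specific ETL tool. The connection string, table, and columns are hypothetical; in practice the query would target the ERP's reporting replica, never the live transactional system.

```python
# Hedged extract-and-transform sketch; DSN, table, and columns are placeholders.
from datetime import date

import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder Oracle connection string (requires the oracledb driver).
engine = create_engine("oracle+oracledb://etl_user:***@erp-host:1521/?service_name=ERPDB")

query = text("""
    SELECT plant_code, material_id, qty_on_hand, last_movement_date
    FROM inventory_snapshot
    WHERE last_movement_date >= :cutoff
""")
df = pd.read_sql(query, engine, params={"cutoff": date(2024, 1, 1)})

# Transform: normalise keys and aggregate to the grain the planning tool expects.
df["material_id"] = df["material_id"].str.strip().str.upper()
summary = df.groupby(["plant_code", "material_id"], as_index=False)["qty_on_hand"].sum()
summary.to_csv("inventory_load.csv", index=False)
```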

Posted 2 weeks ago


6.0 years

2 - 4 Lacs

Durgāpura

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science, empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you?
You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering.

As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

In this role, you will:
- Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform and incorporating best practices for scalability, performance, security, and cost optimization
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Lead and mentor both onshore and offshore development teams, creating a collaborative environment
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure accuracy, consistency, and integrity of data (a validation sketch follows this listing)
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing

In this role, you will have:
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
- 6+ years of experience delivering consulting services to medium and large enterprises; implementations must have included a combination of Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- 3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- SnowPro Core certification (highly desired)
- Hands-on experience with Python (Pandas, DataFrames, functions)
- Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience in any one of the ETL/ELT tools (DBT, Coalesce, Wherescape, Mulesoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
- Nice to have: experience in Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- An enthusiastic, professional, and confident team-player mindset with a strong focus on customer success, and the ability to present effectively even under adverse conditions
- Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
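
A small sketch of the data-quality measures named in the responsibilities: declarative checks run against a batch before it is published. Column names and thresholds are hypothetical; in practice this logic might live in dbt tests or a dedicated framework rather than hand-rolled pandas.

```python
# Hedged data-quality sketch; columns and rules are illustrative only.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["customer_id"].isna().any():
        failures.append("customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        failures.append("customer_id is not unique")
    if (df["order_total"] < 0).any():
        failures.append("order_total has negative values")
    return failures

# Example batch: triggers the uniqueness and negative-value checks.
batch = pd.DataFrame({
    "customer_id": [101, 102, 102],
    "order_total": [250.0, -5.0, 80.0],
})
for failure in run_quality_checks(batch):
    print("FAILED:", failure)
```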

Posted 2 weeks ago


7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs), representing Quantiphi in forums such as tech talks, webinars, and client presentations, and providing strategic insights and solutioning support during pre-sales activities.

To excel in this role, you should have a strong background in AWS data services, including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, star and snowflake schemas, and data modeling methodologies, is essential for success in this position. Additionally, familiarity with backend development using Python and APIs, and with stream processing technologies like Kafka (a consumer sketch follows this listing), along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset with experience in CI/CD practices and Infrastructure as Code is also desired.

Joining Quantiphi as a Solutions Architect is more than just a job; it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.
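
A hedged sketch of the stream-processing exposure mentioned above: a minimal Kafka consumer using the confluent-kafka client. The broker address, topic, and consumer group are placeholders, not any real deployment.

```python
# Illustrative Kafka consumer; broker, topic, and group id are placeholders.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # placeholder
    "group.id": "cdc-replicator",          # placeholder
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders_cdc"])         # hypothetical CDC topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        event = json.loads(msg.value())
        # Downstream, this is where records would be staged to S3/Redshift.
        print(event.get("order_id"), event.get("op"))
finally:
    consumer.close()
```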

Posted 2 weeks ago


4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As a webMethods Developer, you will be responsible for designing, implementing, and maintaining integration solutions using tools like webMethods and Talend. Your role will involve working with cloud platforms such as AWS, Microsoft Azure, and Salesforce to create seamless integration processes.

To be successful in this position, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field. You must possess strong technical skills and hands-on experience in webMethods ESB, MWS, and BPM. Additionally, experience with messaging components like Broker and UM, as well as adapters like MQ and SWIFT, is preferred.

You will be expected to deploy code, maintain the webMethods environment, and handle backend development in an Oracle database. Your responsibilities will also include translating business requirements into functional specifications, designing technical documents, and ensuring the quality of deliverables. Collaboration with interface teams, managing testing phases, and implementing CI/CD for webMethods applications are key aspects of this role. You will need to adhere to change management processes, meet code quality standards, and ensure proper source code management.

This is a full-time position based in Mumbai, with a notice period of up to 30 days. The work schedule is Monday to Friday, with day shifts. Benefits include Provident Fund. If you have 4 to 7 years of experience in this field and possess the required qualifications and competencies, we encourage you to apply for this opportunity.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: Specialist Tower: Data Analytics & Insights Managed Service Experience: 1 - 3 years Key Skills: Data Engineering Educational Qualification: Bachelor's degree in Computer Science/IT or a relevant field Work Location: Bangalore, India Job Description As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Review your work and that of others for quality, accuracy, and relevance. Know how and when to use tools available for a given situation and can explain the reasons for this choice. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/Risk management.
Position Requirements Required Skills: Primary Skill: ETL/ELT, SQL, Informatica, Python Secondary Skill: Azure/AWS/GCP, Talend, DataStage, etc. Data Engineer Should have a minimum of 1 year of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modeling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Scaling and optimizing schemas, and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative, and analytical abilities. Nice to have: Certifications in cloud technology are an added advantage. Experience in visualization tools like Power BI, Tableau, Qlik, etc. Managed Services- Data, Analytics & Insights At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global, Managed Services platform, we provide Data, Analytics & Insights Managed Service where we focus more so on the evolution of our clients’ Data, Analytics, Insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to your business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment capable of working on a mix of critical Application Evolution Service offerings and engagement including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
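To ground the primary skills listed above (ETL/ELT, SQL, Python), here is a minimal sketch of a plain-Python ETL step; the file, database, and column names are hypothetical, not taken from the posting.

```python
# Minimal sketch (illustrative, not from the posting): a small extract-
# transform-load step in plain Python. The source CSV and target table
# are hypothetical.
import csv
import sqlite3

def run_etl(csv_path: str = "orders.csv", db_path: str = "warehouse.db") -> None:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            # Transform: standardize ids and skip malformed amounts.
            order_id = row["order_id"].strip().upper()
            try:
                amount = float(row["amount"])
            except ValueError:
                continue  # basic data-quality gate
            conn.execute(
                "INSERT OR REPLACE INTO orders VALUES (?, ?)", (order_id, amount)
            )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    run_etl()
```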

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

As a Data Engineer/ETL Developer specializing in Talend and Power BI, your primary responsibility will be to study, analyze, and comprehend business requirements within the realm of business intelligence. You will be tasked with providing end-to-end solutions that cater to these requirements efficiently. Your role will involve designing and implementing ETL pipelines that ensure data quality and integrity across platforms such as Talend Enterprise and Informatica. You will need to proficiently load data from diverse sources including Oracle, MSSQL, file systems, FTP services, and REST APIs. Additionally, you will be expected to design and map data models that transform raw data into actionable insights, while also developing comprehensive data documentation encompassing algorithms, parameters, and models. Analyzing historical and current data will be crucial for facilitating informed decision-making processes. To enhance the existing business intelligence systems, you will have to make necessary technical enhancements and optimize ETL processes for improved performance. Monitoring ETL jobs and troubleshooting any arising issues will also fall within your purview. In a leadership capacity, you will be required to oversee and guide the team's deliverables, ensuring adherence to best development practices. Your involvement in requirements gathering and analysis will be pivotal, and your ability to lead such endeavors is essential. In terms of prerequisites, you should ideally possess up to 3 years of overall work experience, with a focus on SQL and ETL, particularly Talend. A minimum of 1 year of experience with Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, and TAC is a must. Proficiency in database design and data modeling is also a key requirement, along with hands-on experience in a coding language such as Java or Python. Desirable skills include familiarity with a BI tool like MS Power BI and the ability to leverage Power BI to create interactive and visually appealing dashboards and reports. Strong analytical skills, effective written and verbal communication abilities, self-motivation, and a results-oriented approach are all qualities that will be highly valued in this role. If you possess advanced problem-solving skills, the capacity to work independently with a high level of accountability, and the ability to navigate complex distributed applications, you are encouraged to apply. Experience in multicultural environments is an added advantage. This is a full-time, permanent position with a Monday to Friday schedule on the UK shift. The work location is in person.
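As a small illustration of the REST-API-to-warehouse loads mentioned above, here is a hedged sketch using requests and SQLite; the endpoint, response fields, and staging table are hypothetical.

```python
# Minimal sketch (endpoint URL, field names, and table are hypothetical):
# pull records from a REST API and land them in a staging table, the kind
# of source-to-stage load this role describes.
import sqlite3

import requests

API_URL = "https://example.com/api/customers"  # placeholder endpoint

def load_from_api(db_path: str = "staging.db") -> int:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()  # assumed to be a list of {"id": ..., "name": ...}

    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS stg_customers (id TEXT, name TEXT)")
    conn.executemany(
        "INSERT INTO stg_customers VALUES (?, ?)",
        [(str(r["id"]), r["name"]) for r in records],
    )
    conn.commit()
    conn.close()
    return len(records)
```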

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Catalogue Specialist Career Level: D2 Introduction To Role Are you ready to make a significant impact in the world of data management? As a Data Catalogue Specialist, you'll play a crucial role in ensuring data is findable, accessible, and fit for use across various business units. You'll be responsible for capturing metadata and developing the data catalogue, supporting Commercial and Enabling Units business areas. Join us in shaping the future of data governance and management! Accountabilities Support the Data Catalogue Principal to define Information Asset Registers across business areas to help profile information risk/value. Participate in projects to mitigate and control identified priority risk areas. Take responsibility for nominated markets/business areas, develop domain knowledge and leverage internal customer relationships to respond to localized use cases. Act as point of contact for nominated business areas or markets. Support initiatives to enhance the reusability and transparency of our data by making it available in our global data catalogue. Support the capture of user requirements for functionality and usability, and document technical requirements. Work with IT partners to capture metadata for relevant data sets and lineage, and populate the catalogue. Work with data stewards and business users to enrich catalogue entries with business data dictionary terms, business rules, and glossaries. Execute monitoring controls to assure metadata quality remains at a high level. Support catalogue principals and data governance leads for tool evaluation and UAT. Essential Skills/Experience Demonstrable experience of working in a data management, data governance or data engineering domain. Strong business and system analysis skills. Proven experience with Data Catalogue, Search and Automation software (Collibra, Informatica, Talend, etc.). Ability to interpret and communicate technical information into business language and in alignment with AZ business. Solid understanding of metadata harvesting methodologies and ability to create business and technical metadata sets. Strong engagement, communication and stakeholder management skills, including excellent organisational, presentation and influencing skills. High level of proficiency with common business applications (Excel, Visio, Word, PowerPoint, and SAP business-user tools). Desirable Skills/Experience Proven experience of working with Commercial or Finance data and systems (Veeva, Reltio, SAP) and their consumption. Domain knowledge of life sciences/pharmaceuticals; manufacturing; corporate finance; or sales & marketing. Experience with data quality and profiling software. Experience of working in a complex, diverse global organisation. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, you'll be part of a global team that drives excellence and breakthroughs. Here, your skills can genuinely impact patients' lives. With a focus on innovation and intelligent risk-taking, we empower every function to run faster and achieve more.
Our collaborative environment encourages you to speak up, take initiative, and make your mark. Surrounded by high performers, you'll be inspired to learn and grow while contributing to our digital transformation journey. Ready to take on this exciting challenge? Apply now and become a key player in our dynamic team! Date Posted: 17-Jul-2025 Closing Date: 23-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
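To make metadata harvesting concrete, here is a minimal sketch that collects table and column metadata from a SQLite database into catalogue-style entries; in practice these would be pushed into a tool such as Collibra via its API (not shown), and the database path is hypothetical.

```python
# Minimal sketch (illustrative): harvest technical metadata (tables and
# columns) from a relational database into catalogue-style entries.
# Stewards would later enrich these with business terms and glossary links.
import sqlite3

def harvest_metadata(db_path: str = "warehouse.db") -> list[dict]:
    conn = sqlite3.connect(db_path)
    entries = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info returns (cid, name, type, notnull, dflt_value, pk)
        for _, col, col_type, *_ in conn.execute(f"PRAGMA table_info({table})"):
            entries.append(
                {"table": table, "column": col, "type": col_type,
                 "steward": None, "glossary_term": None}
            )
    conn.close()
    return entries
```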

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing. Roles & Responsibilities: Design and develop reports and dashboards to help businesses make data-driven decisions. Develop data models and perform data analysis to identify trends and insights. Work with stakeholders to understand their reporting needs and develop solutions that meet those needs. Proficiency in data visualization tools like Tableau, Power BI, and QlikView. Technical Skills Requirements: Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView; experience in designing and developing data reports and dashboards; familiarity with data integration and ETL tools such as Talend or Informatica; understanding of data governance and data quality concepts. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Qualifications: 3-5 years of work experience in a relevant field; B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred.
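As a flavour of the report-building work described above, here is a minimal sketch, assuming a hypothetical warehouse.db with an orders table, of aggregating revenue in SQL and exporting a CSV that Tableau or Power BI can consume.

```python
# Minimal sketch (hypothetical schema): shape data for a dashboard by
# aggregating in SQL and exporting a CSV for a BI tool to read.
import csv
import sqlite3

conn = sqlite3.connect("warehouse.db")  # placeholder database
rows = conn.execute(
    """
    SELECT region, strftime('%Y-%m', order_date) AS month, SUM(amount) AS revenue
    FROM orders
    GROUP BY region, month
    ORDER BY month
    """
).fetchall()
conn.close()

with open("revenue_by_region.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["region", "month", "revenue"])
    writer.writerows(rows)
```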

Posted 2 weeks ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot. Roles & Responsibilities: Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes Design and implement ETL solutions that are scalable, reliable, and maintainable Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs Design and implement data integration processes between various systems and data sources Optimize ETL processes to improve performance, scalability, and reliability Create and maintain technical documentation, including design documents, coding standards, and best practices. Technical Skills Requirements: Proficiency in programming languages such as Python for writing ETL scripts; knowledge of data transformation techniques such as filtering, aggregation, and joining; familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica; understanding of data profiling, data quality, and data validation techniques. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field; B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred.
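To illustrate the transformation techniques the posting names (filtering, aggregation, joining), here is a minimal pandas sketch with made-up data; pandas itself is an assumption, since the posting only requires Python.

```python
# Minimal sketch (illustrative data): filtering, joining, and aggregating
# with pandas, the transformation steps the posting lists.
import pandas as pd

orders = pd.DataFrame(
    {"customer_id": [1, 1, 2, 3], "amount": [120.0, 80.0, 200.0, 15.0]}
)
customers = pd.DataFrame({"customer_id": [1, 2, 3], "region": ["N", "S", "N"]})

# Filter: drop trivially small orders.
big = orders[orders["amount"] >= 50]

# Join: attach customer attributes.
joined = big.merge(customers, on="customer_id", how="left")

# Aggregate: revenue per region.
revenue = joined.groupby("region", as_index=False)["amount"].sum()
print(revenue)
```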

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot. Roles & Responsibilities: Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes Design and implement ETL solutions that are scalable, reliable, and maintainable Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs Design and implement data integration processes between various systems and data sources Optimize ETL processes to improve performance, scalability, and reliability Create and maintain technical documentation, including design documents, coding standards, and best practices. Technical Skills Requirements: Proficiency in programming languages such as Python for writing ETL scripts; knowledge of data transformation techniques such as filtering, aggregation, and joining; familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica; understanding of data profiling, data quality, and data validation techniques. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field; B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing. Roles & Responsibilities: Design and develop reports and dashboards to help businesses make data-driven decisions. Develop data models and perform data analysis to identify trends and insights. Work with stakeholders to understand their reporting needs and develop solutions that meet those needs. Proficiency in data visualization tools like Tableau, Power BI, and QlikView. Technical Skills Requirements: Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView; experience in designing and developing data reports and dashboards; familiarity with data integration and ETL tools such as Talend or Informatica; understanding of data governance and data quality concepts. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field; B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. They will be responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation will also be part of the role. Prior experience in maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Mumbai, Chennai

Hybrid

Required Skills: Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.) Experience in scripting languages like Python, R, Spark, Perl, etc. Very good knowledge of ETL concepts Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus Very good knowledge of SQL querying Rich experience in relational data modelling Experience in developing logical, physical, and conceptual data models. Experience in AWS and Google Cloud services is a plus Experience with BI dashboards is a plus Implementing automated testing platforms and unit tests Proficient understanding of code versioning tools, such as Git Strong analytical skills and problem-solving aptitude. Ability to learn new technologies quickly. Works with other team members in a collaborative manner. Passionate about learning and working on versatile technologies Notice Period: Immediate to 15 days
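Since the posting asks for automated testing platforms and unit tests, here is a minimal sketch using Python's built-in unittest; normalize_email is a hypothetical function under test, not from the posting.

```python
# Minimal sketch (hypothetical function under test): a unit test for a
# small data-cleaning transformation, the kind of automated check asked for.
import unittest

def normalize_email(raw: str) -> str:
    """Trim whitespace and lower-case an email address."""
    return raw.strip().lower()

class NormalizeEmailTest(unittest.TestCase):
    def test_trims_and_lowercases(self):
        self.assertEqual(normalize_email("  Alice@Example.COM "), "alice@example.com")

    def test_idempotent(self):
        once = normalize_email("Bob@example.com")
        self.assertEqual(normalize_email(once), once)

if __name__ == "__main__":
    unittest.main()
```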

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Manage and support the Delivery Operations Team by implementing and supporting ETL and automation procedures. Schedule and perform delivery operations functions to complete tasks and ensure client satisfaction. ESSENTIAL FUNCTIONS: Process data conversions on multiple platforms Perform address standardization, merge/purge, database updates, client mailings, and postal presort Automate scripts that transfer and manipulate internal and external data feeds Ability to multitask and manage multiple jobs to ensure timely client delivery Work with technical staff to maintain and support an ETL environment Work in a team environment with database/CRM specialists, modelers, analysts, and application programmers to deliver results for clients. REQUIRED SKILLS: Experience in database marketing with the ability to transform and manipulate data. Experience with Oracle and SQL to automate scripts that process and manipulate marketing data. Experience with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite of tools, and Excel. Experience with SQL Server: data exports and imports, and the ability to run SQL Server Agent jobs and SSIS packages. Experience with editors such as Notepad++ or UltraEdit. Experience in SFTP and PGP to ensure data security and protection of client data. Experience working with large-scale customer databases in a relational database environment. Proven ability to work on multiple tasks at a given time. Ability to communicate and work in a team environment to ensure tasks are completed in a timely manner. MINIMUM QUALIFICATIONS: Bachelor's degree or equivalent 5+ years of experience in database marketing. Excellent oral and written communication skills required.
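As an illustration of the merge/purge duties above, here is a minimal sketch in plain Python; the match-key logic is a deliberately crude stand-in for real address-standardization software, and the records are made up.

```python
# Minimal sketch (illustrative records): a simple merge/purge pass that
# deduplicates a mailing list on a standardized name + address key.
def standardize(record: dict) -> tuple[str, str]:
    """Build a crude match key; real address standardization (CASS etc.)
    is far more involved."""
    name = " ".join(record["name"].lower().split())
    addr = " ".join(record["address"].lower().replace(".", "").split())
    return (name, addr)

def merge_purge(records: list[dict]) -> list[dict]:
    seen: dict[tuple[str, str], dict] = {}
    for rec in records:
        key = standardize(rec)
        # Keep the first occurrence; later duplicates are purged.
        seen.setdefault(key, rec)
    return list(seen.values())

mailing = [
    {"name": "Jane  Doe", "address": "12 Oak St."},
    {"name": "jane doe", "address": "12 Oak St"},
]
print(merge_purge(mailing))  # one record survives
```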

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 8 Lacs

Hyderabad

Work from Office

Role Purpose The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters. Do 1. Be instrumental in understanding the requirements and design of the product/software: Develop software solutions by studying information needs, systems flow, data usage, and work processes Investigate problem areas following the software development life cycle Facilitate root cause analysis of system issues and problem statements Identify ideas to improve system performance and availability Analyze client requirements and convert requirements into feasible designs Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Confer with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/module development: Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software solutions Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces Analyze information to recommend and plan the installation of new systems or modifications of an existing system Ensure that code is error-free, with no bugs or test failures Prepare reports on programming project specifications, activities, and status Ensure all the codes are raised as per the norm defined for the project/program/account, with clear descriptions and replication patterns Compile timely, comprehensive, and accurate documentation and reports as requested Coordinate with the team on daily project status and progress, and document it Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders 3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution: Capture all requirements and clarifications from the client for better-quality work Take feedback on a regular basis to ensure smooth and on-time delivery Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code Document all necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally. Mandatory Skills: Talend DI. Experience: 5-8 Years.
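One concrete validation step of the kind this Talend DI support role performs is a source-to-target row-count reconciliation after a load; the sketch below assumes hypothetical SQLite databases and an orders table that already exist, purely for illustration.

```python
# Minimal sketch (hypothetical databases and table names, assumed to
# already exist): reconcile source vs. target row counts after an ETL load.
import sqlite3

def rowcount(db_path: str, table: str) -> int:
    conn = sqlite3.connect(db_path)
    (n,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    conn.close()
    return n

src = rowcount("source.db", "orders")
tgt = rowcount("target.db", "orders")
assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"
print(f"Reconciled {tgt} rows")
```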

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Role Purpose The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters. Do 1. Be instrumental in understanding the requirements and design of the product/software: Develop software solutions by studying information needs, systems flow, data usage, and work processes Investigate problem areas following the software development life cycle Facilitate root cause analysis of system issues and problem statements Identify ideas to improve system performance and availability Analyze client requirements and convert requirements into feasible designs Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Confer with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/module development: Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software solutions Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces Analyze information to recommend and plan the installation of new systems or modifications of an existing system Ensure that code is error-free, with no bugs or test failures Prepare reports on programming project specifications, activities, and status Ensure all the codes are raised as per the norm defined for the project/program/account, with clear descriptions and replication patterns Compile timely, comprehensive, and accurate documentation and reports as requested Coordinate with the team on daily project status and progress, and document it Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders 3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution: Capture all requirements and clarifications from the client for better-quality work Take feedback on a regular basis to ensure smooth and on-time delivery Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code Document all necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally. Mandatory Skills: Talend DI. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Responsibilities This is a CONTRACT TO HIRE on-site role for a Data Engineer at Quilytics in Mumbai. The contract will be for 6 months, with an opportunity to convert to a full-time role. As a Data Engineer, you will be responsible for data integration, data modeling, ETL (Extract Transform Load), data warehousing, data analytics, and ensuring data integrity and quality. You will be expected to understand the fundamentals of data flow and orchestration and to design and implement secure pipelines and data warehouses. Maintaining data integrity and quality is of utmost importance. You will collaborate with the team to design, develop, and maintain data pipelines and data platforms using cloud ecosystems like GCP, Azure, Snowflake, etc. You will be responsible for creating and managing end-to-end data pipelines using custom scripts in Python or R, or third-party tools like Dataflow, Airflow, AWS Glue, Fivetran, Alteryx, etc. The data pipelines built will be used for managing various operations, from data acquisition and data storage to data transformation and visualization. You will also work closely with cross-functional teams to identify data-driven solutions to business problems and help clients make data-driven decisions. You will also be expected to help build dashboards or custom reports in Google Sheets or Excel. Basic to mid-level proficiency in creating and editing dashboards on at least one tool is a must. Qualifications 2+ years of experience using Python for Data Engineering, Data Modeling, Data Warehousing, Data Analytics, and ETL (Extract Transform Load) Familiarity with GUI-based ETL tools like Azure Data Factory, AWS Glue, Fivetran, Talend, Pentaho, etc. for data integration and other data operations Strong programming skills in SQL and Python, and/or R; this is a must-have skill Experience in designing and implementing data pipelines and data platforms in cloud and on-premise systems Basic to mid-level proficiency in data visualization with any of the industry-accepted tools like Power BI, Looker Studio, or Tableau is a plus Understanding of data integration and data governance principles Knowledge of cloud platforms such as Snowflake, AWS, or Azure Excellent analytical and problem-solving skills and good communication and interpersonal skills Bachelor's or Master's degree in Data Science, Computer Science, or a related field
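For the orchestration side of the role, here is a minimal Airflow DAG sketch (assuming a recent Airflow 2.x install); the DAG id, schedule, and task logic are hypothetical placeholders, not from the posting.

```python
# Minimal sketch (hypothetical DAG id, schedule, and task body): a one-task
# Airflow DAG, the kind of orchestration the posting mentions alongside
# Dataflow, AWS Glue, and Fivetran.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Placeholder for a real acquisition-to-warehouse step.
    print("pulling source data and loading the warehouse")

with DAG(
    dag_id="daily_ingest",           # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```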

Posted 2 weeks ago

Apply