
1529 Talend Jobs - Page 28

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description – BI Analyst (Senior Engineer / Lead)

We at Pine Labs are looking for those who share our core belief - "Every Day is Game Day". We bring our best selves to work each day to realize our mission of enriching the world through the power of digital commerce and financial services.

Role Purpose
We are looking for a Sr. BI Analyst / Lead who will support the BI Analyst team in implementing new dashboard features and writing complex SQL queries to prepare raw data for dashboarding. The preferred candidate has an analytics mindset to convert raw data into user-friendly, dynamic dashboards, along with developing paginated reports. This is an individual-contributor position that leads the team on the technical front.

Responsibilities We Entrust You With
- Participate in peer reviews of reports and dashboards created by internal team members and ensure a high standard per the defined reporting/dashboarding standards.
- Design with product thinking, problem solving, and strategic orientation.
- Must have expertise in the Apache Superset BI tool and SSRS; excellent SSRS and SSIS skills, and expert-level SQL scripting.
- Nice to have: sound knowledge of AWS QuickSight and PowerShell.
- Excellent SQL scripting for complex queries.
- Proficient in verbal and written communication.
- Knowledge of ETL concepts and tools, e.g. Talend/SSIS.
- Knowledge of query optimization in SQL and Redshift.
- Nice to have: sound knowledge of data warehousing and data lake concepts.
- Understands dashboard/report requirements from management stakeholders and takes an analytical view to design dynamic dashboards using any BI analytics tool.

Required Skills: TSQL, ANSI SQL, PSQL, SSIS, SSRS, Apache Superset, AWS Redshift, QuickSight
Good to have: data lake concepts, analytical ability, understanding of business and merchant requirements

What Matters in This Role
- Apache Superset, AWS QuickSight, SSRS, and SSIS experience for developing dashboards is preferred.
- Excellent TSQL, ANSI SQL, data modeling, and querying across multiple data stores is mandatory.
- Experience with Microsoft SSRS and SSIS is needed for developing paginated reports.
- Experience: 4-12 years

What We Value in Our People
- You take the shot: you decide fast and you deliver right.
- You are the CEO of what you do: you show ownership and make things happen.
- You own tomorrow: by building solutions for the merchants and doing the right thing.
- You sign your work like an artist: you seek to learn and take pride in the work you do.
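As an illustration of the dashboard-prep SQL this role calls for, here is a minimal, hypothetical sketch: the table, columns, and values are invented, and SQLite (Python's stdlib) stands in for a warehouse such as Redshift. It rolls raw transactions up into the merchant-level summary shape a BI tool like Superset would consume.

```python
import sqlite3

# In-memory SQLite database standing in for a warehouse such as Redshift.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (merchant TEXT, amount REAL, status TEXT);
    INSERT INTO transactions VALUES
        ('Acme', 120.0, 'SUCCESS'),
        ('Acme',  80.0, 'FAILED'),
        ('Brew',  50.0, 'SUCCESS'),
        ('Brew',  70.0, 'SUCCESS');
""")

# Aggregate raw rows into a dashboard-ready summary: successful volume
# and success rate per merchant.
rows = conn.execute("""
    SELECT merchant,
           SUM(CASE WHEN status = 'SUCCESS' THEN amount ELSE 0 END) AS success_volume,
           ROUND(AVG(CASE WHEN status = 'SUCCESS' THEN 1.0 ELSE 0.0 END), 2) AS success_rate
    FROM transactions
    GROUP BY merchant
    ORDER BY merchant
""").fetchall()

print(rows)  # [('Acme', 120.0, 0.5), ('Brew', 120.0, 1.0)]
```

The conditional-aggregation pattern (CASE inside SUM/AVG) is the kind of "complex SQL" the posting refers to: it computes several metrics in one pass instead of joining multiple subqueries.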

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 1 Lacs

Hyderabad

Work from Office

Key Responsibilities

1. Incident Management: Monitor production systems for issues and respond promptly to incidents. Log, categorize, and prioritize incidents for resolution. Collaborate with development teams to address and resolve issues. Communicate with stakeholders regarding incident status and resolution timelines.

2. Root Cause Analysis (RCA): Conduct thorough investigations to identify the underlying causes of recurring issues. Implement long-term solutions to prevent future occurrences. Document findings and share insights with relevant teams.

3. System Monitoring & Performance Optimization: Utilize monitoring tools to track system health and performance. Identify and address performance bottlenecks or capacity issues. Ensure systems meet performance benchmarks and service level agreements (SLAs).

4. Release Management & Application Maintenance: Assist with the deployment of software updates, patches, and new releases. Ensure smooth transitions from development to production environments. Coordinate with cross-functional teams to minimize disruptions during releases.

5. User Support & Troubleshooting: Provide end-user support for technical issues. Investigate user-reported problems and offer solutions or workarounds. Maintain clear communication with users regarding issue status and resolution.

6. Documentation & Knowledge Sharing: Maintain detailed records of incidents, resolutions, and system configurations. Create and update operational runbooks, FAQs, and knowledge base articles. Share knowledge with team members to improve overall support capabilities.

Essential Tools & Technologies
- Monitoring & Alerting: Nagios, Datadog, New Relic
- Log Management & Analysis: Splunk, Elasticsearch, Graylog
- Version Control: Git, SVN
- Ticketing Systems: JIRA, ServiceNow
- Automation & Scripting: Python, shell scripting
- Database Management: SQL, Oracle, MySQL

Skills & Competencies

Technical Skills: Proficiency in system monitoring and troubleshooting. Strong understanding of application performance metrics. Experience with database management and query optimization. Familiarity with cloud platforms and infrastructure.

Soft Skills
- Analytical Thinking: ability to diagnose complex issues and develop effective solutions.
- Communication: clear and concise communication with stakeholders at all levels.
- Teamwork: collaborative approach to problem-solving and knowledge sharing.
- Adaptability: flexibility to handle changing priorities and technologies.
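The "log, categorize, and prioritize incidents" step is commonly driven by an impact/urgency matrix. The sketch below is a hypothetical ITIL-style mapping (the exact matrix values are an assumption for illustration; real support teams tune their own):

```python
# Hypothetical priority matrix: impact and urgency are each rated
# 1 (high) to 3 (low); the pair maps to a priority bucket P1-P4.
PRIORITY = {
    (1, 1): "P1", (1, 2): "P2", (2, 1): "P2",
    (1, 3): "P3", (2, 2): "P3", (3, 1): "P3",
    (2, 3): "P4", (3, 2): "P4", (3, 3): "P4",
}

def prioritize(impact: int, urgency: int) -> str:
    """Map an incident's impact/urgency pair to a priority bucket."""
    return PRIORITY[(impact, urgency)]

# A site-wide outage (high impact, high urgency) lands in P1; a cosmetic
# report glitch (low impact, low urgency) queues up as a P4.
print(prioritize(1, 1), prioritize(3, 3))  # P1 P4
```

Ticketing systems like JIRA and ServiceNow let you encode a matrix like this in their priority rules, so categorization stays consistent across the on-call rotation.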

Posted 1 month ago

Apply

4.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary
We are seeking a highly skilled Sr. Data Engineer to drive the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.

Experience: 4-7 years
Work Location: Hyderabad (Hybrid)
Mandatory skills: AWS, Python, SQL, Airflow, DBT

Responsibilities
- Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
- Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
- Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
- Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
- Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
- Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 4-7 years of experience in data engineering.
- Cloud Platforms: Strong expertise in AWS data services.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and related frameworks.
- Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
- Programming: Proficiency in Python, Scala, or Java for data processing and automation.
- ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica.
- Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.

Skills: Airflow, Kafka, MongoDB, Cassandra, Talend, Snowflake, DBT, PostgreSQL, Java, Python, data engineering, Spark, SQL, Scala, Hadoop, AWS, Informatica
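The ETL/ELT responsibility above can be sketched as a minimal pipeline in plain Python. The stage contents and sample records are invented for illustration; in this role the stages would be orchestrated by a tool like Airflow and the transforms expressed in DBT or Talend jobs.

```python
def extract():
    # Stand-in for pulling raw records from a source system (e.g. an API or S3).
    return [
        {"order_id": 1, "amount": "250.00", "country": "IN"},
        {"order_id": 2, "amount": "99.50", "country": "US"},
        {"order_id": 3, "amount": "bad", "country": "IN"},
    ]

def transform(rows):
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # in production, route bad rows to a dead-letter table instead
    return clean

def load(rows, sink):
    # Stand-in for a warehouse write (e.g. a Redshift COPY or Snowflake insert).
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2
```

Keeping extract, transform, and load as separate pure functions is what makes a pipeline easy to wrap in orchestrator tasks and to retry idempotently when a run fails.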

Posted 1 month ago

Apply

3.0 - 6.0 years

15 - 20 Lacs

Bengaluru

Hybrid

Description
Role: Data Engineer / ETL Developer - Talend / Power BI

Job Description:
1. Study, analyze, and understand business requirements in the context of business intelligence, and provide end-to-end solutions.
2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
3. Load data from heterogeneous sources like Oracle, MS SQL, file systems, FTP services, REST APIs, etc.
4. Design and map data models to turn raw data into meaningful insights and build a data catalog.
5. Develop strong data documentation covering algorithms, parameters, and models.
6. Analyze historical and current data for better decision-making.
7. Make essential technical changes to improve existing business intelligence systems.
8. Optimize ETL processes for improved performance; monitor ETL jobs and troubleshoot issues.
9. Lead and oversee team deliverables; ensure best practices are followed for development.
10. Participate in or lead requirements gathering and analysis.

Required Skillset and Experience:
1. Overall working experience of up to 3 years, preferably in SQL and ETL (Talend).
2. Must have 1+ years of experience in Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, TAC, etc.
3. Must have an understanding of database design and data modeling.
4. Hands-on experience in a coding language (Java, Python, etc.).

Secondary Skillset / Good to have:
1. Experience in a BI tool like MS Power BI.
2. Ability to use Power BI to build interactive and visually appealing dashboards and reports.

Required Personal & Interpersonal Skills
• Strong analytical skills
• Good communication skills, both written and verbal
• Highly motivated and result-oriented
• Self-driven, independent work ethic that drives internal and external accountability
• Ability to interpret instructions for executives and technical resources
• Advanced problem-solving skills dealing with complex distributed applications
• Experience working in a multicultural environment
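Point 2 of the job description (ETL with data quality and integrity) usually comes down to row-level validation rules applied before loading. A minimal pure-Python sketch follows; the rule set and field names are invented for illustration, and a Talend job would express the same checks with its built-in data-quality components.

```python
def validate(row):
    """Return a list of data-quality violations for one record; empty means clean."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("negative amount")
    if row.get("country") not in {"IN", "US", "DE"}:
        errors.append("unknown country code")
    return errors

rows = [
    {"customer_id": "C1", "amount": 100.0, "country": "IN"},
    {"customer_id": "",   "amount": -5.0,  "country": "XX"},
]
report = {r["customer_id"] or "<blank>": validate(r) for r in rows}
print(report)
```

Rejected rows are typically diverted to a quarantine table with their violation list attached, so data stewards can repair and replay them rather than silently losing records.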

Posted 1 month ago

Apply

3.0 - 8.0 years

55 - 60 Lacs

Bengaluru

Work from Office

Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment, and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself. Job Summary: At Empower, a Sr. Architect is a mix of leadership position and thought leadership role. A Sr. Architect works with enterprise architects, and both business and IT teams, to align solutions to the technology vision they help create. This role supports enterprise Architects in the development of technology strategies, reference architectures, solutions, best practices and guidance across the entire IT development organization; all the while addressing total cost of ownership, stability, performance and efficiency. The candidate will also be working with Empower Innovation Lab team as the team is experimenting with emerging technologies, such as Generative AI, and Advanced Analytics. In this rapid paced environment, the person must possess a "can-do" attitude while demonstrating a strong work ethic. This person should have a strong aptitude to help drive decisions. He or she will be actively involved in influencing the strategic direction of technology at Empower Retirement. There will be collaboration across all teams including IT Infrastructure, PMO office, Business, and third-party integrators in reviewing, evaluating, designing and implementing solutions. The Architect must understand available technology options and educate and influence technology teams to leverage them where appropriate. 
The Architect will recognize and propose alternatives, make recommendations, and describe any necessary trade-offs. In some cases, particularly on key initiatives, the Architect will participate on the design and implementation of end-to-end solutions directly with development teams. The ideal candidate will leverage their technical leadership/direction-setting skills with the development organization to be able to prove technical concepts quickly using a variety of tools, methods, & frameworks. Responsibilities: Help Enterprise Architect, work with peer Sr. Architects and more junior resources to define and execute on the business aligned IT strategy and vision. Develop, document, and provide input into the technology roadmap for Empower. Create reference architectures that demonstrate an understanding of technology components and the relationships between them. Design and modernize complex systems into cloud compatible or cloud native applications where applicable. Create strategies and designs for migrating applications to cloud systems. Participate in the evaluation of new applications, technical options, challenge the status quo, create solid business cases, and influence direction while establishing partnerships with key constituencies. Implement best practices, standards & guidance, then subsequently provide coaching of technology team members. Make leadership recommendations regarding strategic architectural considerations related to process design and process orchestration. Provide strong leadership and direction in development/engineering practices. Collaborate with other business and technology teams on architecture and design issues. Respond to evolving and changing security conditions. Implement and recommend security guidelines. Provide thought-leadership, advocacy, articulation, assurance, and maintenance of the enterprise architecture discipline. Provide solution, guidance, and implementation assistance within full stack development teams. 
Recommend long-term, scalable, and performant architecture changes while keeping cost in control.

Preferred Qualifications:
12+ years of experience in the development and delivery of data systems. This experience should be relevant to roles such as Data Analyst, ETL (Extract, Transform and Load) Developer (Data Engineer), Database Administrator (DBA), Business Intelligence Developer (BI Engineer), Machine Learning Developer (ML Engineer), Data Scientist, Data Architect, Data Governance Analyst, or a managerial position overseeing any of these functions. 3+ years of experience creating solution architectures and strategies across multiple architecture domains (business, application, data, integration, infrastructure, and security). Solid experience with the following technology disciplines: Python, cloud architectures, AWS (Amazon Web Services), big data (300+ TB), advanced analytics, advanced SQL skills, data warehouse systems (Redshift or Snowflake), advanced programming, NoSQL, distributed computing, real-time streaming. Nice to have: experience in Java, Kubernetes, Argo, Aurora, Google Analytics, META Analytics, integration with 3rd-party APIs, SOA & microservices design, and modern integration methods (API gateway/web services, messaging & RESTful architectures). Familiarity with BI tools such as Tableau/QuickSight. Experience with code coverage tools. Working knowledge of addressing architectural cross-cutting concerns and their tradeoffs, including topics such as caching, monitoring, operational surround, high availability, security, etc. Demonstrates competency applying architecture frameworks and development methods. Understanding of business process analysis and business process management (BPM). Excellent written and verbal communication skills. Experience in mentoring junior team members through code reviews and recommending adherence to best practices. Experience working with global, distributed teams. Interacts with people constantly, demonstrating strong people skills.
Able to motivate and inspire, influencing and evangelizing a set of ideals within the enterprise. Requires a high degree of independence, proactively achieving objectives without direct supervision. Negotiates effectively at the decision-making table to accomplish goals. Evaluates and solves complex and unique problems with strong problem-solving skills. Thinks broadly, avoiding tunnel vision and considering problems from multiple angles. Possesses a general understanding of the wealth management industry, comprehending how technology impacts the business. Stays on top of the latest technologies and trends through continuous learning, including reading, training, and networking with industry colleagues. Data Architecture - Proficiency in platform design and data architecture, ensuring scalable, efficient, and secure data systems that support business objectives. Data Modeling - Expertise in designing data models that accurately represent business processes and facilitate efficient data retrieval and analysis. Cost Management - Ability to manage costs associated with data storage and processing, optimizing resource usage, and ensuring budget adherence. Disaster Recovery Planning - Planning for data disaster recovery to ensure business continuity and data integrity in case of unexpected events. SQL Optimization/Performance Improvements - Advanced skills in optimizing SQL queries for performance, reducing query execution time, and improving overall system efficiency. CICD - Knowledge of continuous integration and continuous deployment processes, ensuring rapid and reliable delivery of data solutions. Data Encryption - Implementing data encryption techniques to protect sensitive information and ensure data privacy and security. Data Obfuscation/Masking - Techniques for data obfuscation and masking to protect sensitive data while maintaining its usability for testing and analysis. 
Reporting - Experience with static and dynamic reporting to provide comprehensive and up-to-date information to business users. Dashboards and Visualizations - Creating dashboards and visualizations to present data in an intuitive and accessible manner, facilitating data-driven insights. Generative AI / Machine Learning - Understanding of generative artificial intelligence and machine learning to develop advanced predictive models and automate decision-making processes. Understanding of machine learning algorithms, deep learning frameworks, and AI model architectures. Understanding of ethical AI principles and practices. Experience implementing AI transparency and explainability techniques. Knowledge of popular RAG frameworks and tools (e.g., LangChain, LlamaIndex). Familiarity with fairness metrics and techniques to mitigate bias in AI models.

Sample technologies:
- Cloud Platforms – AWS (preferred), Azure, or Google Cloud
- Databases - Oracle, Postgres, MySQL (preferred), RDS, DynamoDB (preferred), Snowflake or Redshift (preferred)
- Data Engineering (ETL, ELT) - Informatica, Talend, Glue, Python (must), Jupyter
- Streaming – Kafka or Kinesis
- CICD Pipeline – Jenkins, GitHub, GitLab, or ArgoCD
- Business Intelligence – QuickSight (preferred), Tableau (preferred), Business Objects, MicroStrategy, Qlik, Power BI, Looker
- Advanced Analytics - AWS SageMaker (preferred), TensorFlow, PyTorch, R, scikit-learn
- Monitoring tools – DataDog (preferred), AppDynamics, or Splunk
- Big data technologies – Apache Spark (must), EMR (preferred)
- Container Management technologies – Kubernetes, EKS (preferred), Docker, Helm

Preferred Certifications: AWS Solution Architect, AWS Data Engineer, AWS Machine Learning Engineer, AWS Machine Learning

EDUCATION: Bachelor's and/or master's degree in computer science or a related field (information systems, mathematics, software engineering).

We are an equal opportunity employer with a commitment to diversity.
All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.
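One competency this role lists, data obfuscation/masking, can be sketched with the Python standard library alone. The salt value and field choices below are illustrative assumptions; a production system would pull secrets from a KMS and might use format-preserving encryption instead.

```python
import hashlib

SALT = b"demo-salt"  # illustrative only; real deployments fetch this from a secrets manager

def mask_email(email: str) -> str:
    """Deterministically pseudonymize an email so joins still line up across tables."""
    digest = hashlib.sha256(SALT + email.lower().encode()).hexdigest()[:12]
    return f"user_{digest}@masked.example"

def mask_card(pan: str) -> str:
    """Keep only the last four digits, the usual display-masking rule for card numbers."""
    return "*" * (len(pan) - 4) + pan[-4:]

print(mask_card("4111111111111111"))  # ************1111
```

Deterministic (salted-hash) masking preserves referential integrity for testing and analytics, which is exactly the "protect sensitive data while maintaining its usability" trade-off the qualification describes.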

Posted 1 month ago

Apply

5.0 years

7 - 8 Lacs

Hyderābād

On-site

JOB DESCRIPTION:
We are seeking a skilled Data Engineer with 5+ years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools.

Key Responsibilities (Experience: 5+ years):
- Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer (GCP), Control-M, Cron, Luigi, and similar tools.
- Build and optimize data architectures including data lakes and data warehouses.
- Integrate data from multiple sources, ensuring data quality and consistency.
- Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions.
- Analyze complex datasets to identify trends, generate actionable insights, and support decision-making.
- Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation.
- Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB.
- Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery.
- Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools.
- Ensure data governance, security, and compliance standards are met.
- Participate in Agile and DevOps processes to enhance data engineering workflows.

Required Qualifications:
- 5+ years of professional experience in data engineering and data analysis roles.
- Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB.
- Hands-on experience with big data tools like Hadoop and Apache Spark.
- Proficient in Python programming.
- Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks.
- Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi.
- Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory).
- Understanding of data modeling concepts and data lake/data warehouse architectures.
- Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows.
- Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB.
- Exposure to Agile and DevOps methodologies.
- Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena).

Nice to have (Preferred Skills):
- Strong problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.
- Experience with service development, REST APIs, and automation testing is a plus.
- Familiarity with version control systems and workflow automation.

We offer:
- Opportunity to work on bleeding-edge projects
- Work with a highly motivated and dedicated team
- Competitive salary
- Flexible schedule
- Benefits package - medical insurance, sports
- Corporate social events
- Professional development opportunities
- Well-equipped office

About us
Grid Dynamics (NASDAQ: GDYN) is a leading provider of technology consulting, platform and product engineering, AI, and advanced analytics services. Fusing technical vision with business acumen, we solve the most pressing technical challenges and enable positive business outcomes for enterprise companies undergoing business transformation.
A key differentiator for Grid Dynamics is our 8 years of experience and leadership in enterprise AI, supported by profound expertise and ongoing investment in data, analytics, cloud & DevOps, application modernization and customer experience. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the Americas, Europe, and India.
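One responsibility listed for this role, ensuring data quality and consistency across sources, is often enforced with a reconciliation check after each load. The sketch below is illustrative only (table names, counts, and the tolerance value are invented):

```python
def reconcile(source_counts, target_counts, tolerance=0.0):
    """Compare per-table row counts between a source and the warehouse copy.

    Returns the tables whose relative difference exceeds `tolerance`.
    """
    drift = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src == 0:
            continue
        rel = abs(src - tgt) / src
        if rel > tolerance:
            drift[table] = (src, tgt)
    return drift

source = {"orders": 10_000, "customers": 2_500, "events": 90_000}
target = {"orders": 10_000, "customers": 2_499, "events": 88_000}

# With zero tolerance every mismatch is flagged; a small tolerance would
# let late-arriving event data pass without paging anyone.
print(reconcile(source, target))
```

A check like this typically runs as the final task of an orchestrated pipeline, failing the run (and alerting) when drift exceeds the agreed threshold.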

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderābād

Remote

Hyderabad, India Chennai, India Job ID: R-1052028 Apply prior to the end date: July 31st, 2025 When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. At our core, we are dedicated to enriching lives by bridging the gap between individuals and premium wireless experiences that not only meet but exceed expectations in value and quality. We believe that everyone deserves access to seamless, reliable, and affordable wireless solutions that enhance their day-to-day lives, connecting them to what matters most. By joining our team, you'll play a pivotal role in this mission, working towards delivering innovative, customer-focused solutions that open up a world of possibilities. We're not just in the business of technology; we're in the business of connecting people, empowering them to explore, share, and engage with the world around them in ways they never thought possible. Building on our commitment to connect people with quality experiences that offer the best value in wireless, let's delve deeper into how we strategically position our diverse portfolio to cater to a broad spectrum of needs and preferences. Our portfolio, comprising 11 distinct brands, is meticulously organized into five families, each designed to address specific market segments and distribution channels to maximize reach and impact. 
Total by Verizon & Verizon Prepaid : At the forefront, we have Total by Verizon and Verizon Prepaid, our flagship brands available at Verizon exclusive and/or national/retail stores. Verizon Prepaid continues to maintain a robust and loyal consumer base, while Total by Verizon is on a rapid ascent, capturing the hearts of more customers with its compelling offerings. Straight Talk, TracFone, and Walmart Family Mobile : Straight Talk, Tracfone, and Walmart Family Mobile stand as giants in our brand portfolio, boasting significant presence in Walmart. Their extensive reach and solidified position in the market underscore our commitment to accessible, high-quality wireless solutions across diverse retail environments. Visible : Visible, as a standalone brand family, caters to the digitally-savvy, single-line customers who prefer streamlined, online-first interactions. This brand is a testament to our adaptability, embracing the digital evolution of customer engagement. Simple Mobile : Carving out a niche of its own, Simple Mobile shines as the premier choice among authorized resellers. Its consistent recognition as the most carried brand in Wave7 Research’s prepaid dealer survey for 36 consecutive quarters speaks volumes about its popularity and reliability. SafeLink : SafeLink remains dedicated to serving customers through government subsidies. With a strategic pivot towards Lifeline in the absence of ACP, SafeLink continues to fulfill its mission of providing essential communication services to those in need. Join the team that connects people with quality experiences that give them the best value in wireless. What You’ll Be Doing: Identifying macro trends, explain drivers behind favorability/unfavorability to targets and help build narratives around disconnect/revenue performance. 
Responsible for assisting the Value Base Management team in tracking and forecasting customer behavior that results in a prepaid phone disconnect across all brands, as well as revenue growth. Contributing to the production of the Value Disconnects forecast for Best View (monthly), Line-of-Sight (weekly), Outlook (quarterly), and Long Range Plan (annually). Contributing to the production of the revenue growth forecast, step rations, plan mix, and add-on revenue. Developing and streamlining consolidation of forecast models to produce executive-friendly slides focused on disconnect drivers. Familiarizing yourself with Value's data infrastructure and the various data reporting sources the team leverages. Tracking and investigating actual vs. forecast variances to determine variance drivers. Iterating on process improvements and automations that help us become more nimble, proficient, and data-driven in our decision-making. Integrating data from multiple sources into timely, accessible, and relevant reports, ensuring data quality and reliability. Assisting with ad-hoc projects / requests from senior leaders.

What we're looking for...
You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Experience working with complex data structures.
- Experience utilizing query tools such as SQL.
- Experience with forecasting data visualization tools such as Tableau, Qlik, Looker.
- Experience with streamlining and automation in Google Sheets and Microsoft Excel.
- Experience with large data sets in Google Sheets and Microsoft Excel.
- Data analytics experience.

Even better if you have one or more of the following:
- Advanced MS Excel skills with a deep understanding of model architecture, formula efficiency, pivot tables, and macros.
- Experience with forecasting data visualization tools such as Tableau, Qlik, Looker.
- Experience with ETL tools (KNIME, Talend, SSIS, etc.).
- Data mining experience.
- Data modeling experience.
- Data science background.
Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
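The actual-vs-forecast variance tracking this role describes reduces to a small, repeatable calculation. The sketch below uses invented monthly disconnect figures purely for illustration:

```python
def variance_report(actual, forecast):
    """Per-period variance and percent variance between actuals and a forecast."""
    report = {}
    for period in forecast:
        diff = actual[period] - forecast[period]
        report[period] = (diff, round(100 * diff / forecast[period], 1))
    return report

# Invented monthly disconnect counts (thousands).
forecast = {"Jan": 120.0, "Feb": 110.0, "Mar": 115.0}
actual = {"Jan": 126.0, "Feb": 107.8, "Mar": 115.0}

print(variance_report(actual, forecast))
```

In practice the same computation lives in a Sheets/Excel model or a BI tool; the value of scripting it is that the variance drivers can then be joined automatically against plan-mix and brand dimensions.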

Posted 1 month ago

Apply

5.0 years

0 Lacs

Haryana

On-site

CURRENT EMPLOYEES, CONSULTANTS, AND AGENCY PARTNERS: If you currently work for Brown-Forman, please apply by clicking the Careers icon on the Workday portal. For best results, use Google Chrome to view this page.

The Senior Data Engineer will lead the design, development, and optimization of data architectures and pipelines. This role involves mentoring junior engineers and collaborating with cross-functional teams to deliver high-quality data solutions that drive business insights and informed decision-making.

- Lead the development and maintenance of scalable data architectures and pipelines.
- Design and implement data models and schemas to support business intelligence and analytics.
- Optimize data processing workflows for performance, scalability, and reliability.
- Mentor and guide junior data engineers, providing technical leadership and best practices.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Ensure data quality and integrity through rigorous testing and validation processes.
- Execute steady-state operating and monitoring procedures for our data warehouse, with periodic 24x7 on-call support as necessary.

What you bring to the table:
- Bachelor's Degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering or a related field.
- Advanced proficiency in SQL and experience with relational and NoSQL databases.
- Expertise in big data technologies (e.g., Hadoop, Spark) and data pipeline tools (e.g., Apache NiFi, Airflow).
- Strong programming skills in languages such as Python, Java, or Scala.
- Experience with data warehousing solutions such as Redshift, BigQuery, Snowflake, or similar.
- Experience with SAP Business Warehouse (BW) and ABAP.
- Strong problem-solving abilities and attention to detail.
- Fluent in English with excellent communication skills and the ability to work effectively in a collaborative environment.
What Makes You Unique Experience with cloud-based data solutions (e.g., AWS, Azure, GCP). Knowledge of ETL tools (e.g., Talend, Boomi, Informatica). Familiarity with data governance and data security best practices. Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Requisition Type: Employee Management Level: Professional Global Job Level: P6 Number of Openings Available: 0

Posted 1 month ago

Apply

55.0 years

7 - 9 Lacs

Chennai

Remote

Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what’s next for their businesses. Your role You act as a contact person for our customers and advise them on data-driven projects. You are responsible for architecture topics and solution scenarios in the areas of Cloud Data Analytics Platform, Data Engineering, Analytics and Reporting. Experience in Cloud and Big Data architecture. Responsibility for designing viable architectures based on Microsoft Azure, AWS, Snowflake, Google (or similar) and implementing analytics. Experience in DevOps, Infrastructure as Code, DataOps, MLOps. Experience in business development (as well as your support in the proposal process). Data warehousing, data modelling and data integration for enterprise data environments. Experience in design of large-scale ETL solutions integrating multiple / heterogeneous systems. Experience in data analysis, modelling (logical and physical data models) and design specific to a data warehouse / Business Intelligence environment (normalized and multi-dimensional modelling). Experience with ETL tools, primarily Talend and/or any other data integration tools (open source / proprietary), extensive experience with SQL and SQL scripting (PL/SQL & SQL query tuning and optimization) for relational databases such as PostgreSQL, Oracle, Microsoft SQL Server and MySQL etc., and with NoSQL databases like MongoDB and/or document-based databases. Must be detail-oriented, highly motivated and able to work independently with minimal direction. Excellent written, oral and interpersonal communication skills with ability to communicate design solutions to both technical and non-technical audiences. Ideally: Experience in agile methods such as SAFe, Scrum, etc.
Ideally: Experience with programming languages like Python, JavaScript, Java/ Scala etc. Your Profile Provides data services for enterprise information strategy solutions - Works with business solutions leaders and teams to collect and translate information requirements into data to develop data-centric solutions. Design and develop modern enterprise data-centric solutions (e.g. DWH, Data Lake, Data Lakehouse) Responsible for designing data governance solutions. What you will love about working here We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

2.0 - 4.0 years

5 - 7 Lacs

Chennai

On-site

Skill Set: Data Engineering Experience: 2 to 4 years Location: Chennai Employment Type: FTE (Work from office) Notice Period: Immediate to 15 Days About the Role: We are looking for an ETL developer with hands-on experience in data integration, development, and support of ETL processes using tools like Pentaho, Informatica, or Talend. This operational role is responsible for providing data analysis and management support. The individual in this position may seek appropriate guidance and advice to ensure the delivery of high-quality outcomes. Qualifications: Bachelor's degree in Computer Science, Information Technology or related field. Job Specifications: 1. Proven experience with ETL tools such as Pentaho, Informatica, or Talend. 2. Strong SQL skills, including writing and debugging complex queries. 3. Basic knowledge of Unix/Linux shell scripting. 4. Experience with database modelling for relational databases. 5. Familiarity with Scrum methodology and Agile practices. 6. Experience with Git for version control and deployment. 7. Python programming skills are a plus. 8. Strong problem-solving skills and attention to detail. Soft Skills: 1. Excellent communication and teamwork abilities. 2. Ability to work independently and manage multiple tasks. 3. Strong analytical and critical-thinking skills. Job Type: Full-time Pay: ₹500,000.00 - ₹700,000.00 per year Schedule: Day shift Work Location: In person Application Deadline: 07/07/2025 Expected Start Date: 14/07/2025
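The core skills this listing asks for (complex SQL plus light Python) come together in a typical extract-transform-load job. A minimal, generic sketch using Python's built-in sqlite3 module; all table and column names here are hypothetical illustrations, not anything from the listing:

```python
import sqlite3

# In-memory database standing in for separate source and target systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, " 120.50"), (2, "75"), (3, None)])
conn.execute("CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL)")

# Extract raw rows, transform (trim/cast, drop rows missing an amount), load.
rows = conn.execute("SELECT id, amount FROM src_orders").fetchall()
clean = [(i, float(a.strip())) for i, a in rows if a is not None]
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", clean)

loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()
print(loaded)  # (2, 195.5)
```

In a real Pentaho/Informatica/Talend job the same three stages are configured as components rather than written by hand, but the shape of the work is the same.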

Posted 1 month ago

Apply

2.0 years

3 - 4 Lacs

Gāndhīnagar

Remote

Remote What We Offer: Canteen Subsidy Night Shift allowance as per process Health Insurance Tuition Reimbursement Work-Life Balance Initiatives Rewards & Recognition What You’ll Be Doing: Coordinate across teams and departments to streamline/automate processes and ensure seamless service delivery. Analyze call center data (voice, chat, email) to identify trends, patterns, and areas for improvement across campaigns. Create compelling data visualizations to communicate findings to both technical and non-technical audiences. Translate complex data sets into actionable recommendations for the operations team, providing clear action items. Develop comprehensive dashboards to provide insights for stakeholders. Conduct proactive and on-demand analysis, recommending solutions for performance improvement. Coach team members and leaders to enhance their leadership and technical skills. Maintain SOPs and documentation processes, ensuring accuracy and consistency. We Expect You To Have: 2+ years of experience in BPO/Call center reporting, with strong Data analysis skills. Technical skills in SQL or DBA tools (ability to read and understand code), advanced Excel (Power Query, Power Pivot), and experience with BI/ETL tools (e.g., Talend). Knowledge of BI tools and AI technologies is a plus. Proven ability to identify opportunities for optimization, automation, and process improvement in reporting. Strong leadership skills with the ability to manage deliverables accurately and on time. Job Title : Reporting Analyst (DA) Location : Gandhinagar Schedule & Shift : 4 PM IST to 1 AM IST

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Pune, Chennai

Work from Office

Perform end-to-end ETL testing to ensure accuracy of data extracted, transformed, and loaded. Validate data in staging, transformation, and target layers using SQL. Execute data reconciliation between source and target systems. Identify and report defects. Required Candidate profile: Understand and validate data mapping documents (source-to-target mapping). Participate in system, integration, and UAT testing. Support data migration, validation, and audit requirements. Familiarity with ETL tools. Perks and Benefits
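Source-to-target reconciliation of the kind this listing describes can often be expressed directly in SQL. A sketch using sqlite3 and the EXCEPT operator; the table names and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER, val TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(1, "a"), (2, "B")])  # one corrupted row, one missing row

# Rows present in source but absent or different in target: candidate defects.
missing = conn.execute(
    "SELECT id, val FROM source EXCEPT SELECT id, val FROM target ORDER BY id"
).fetchall()
print(missing)  # [(2, 'b'), (3, 'c')]
```

Running the same query in the other direction (target EXCEPT source) catches rows that appear in the target without a source counterpart.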

Posted 1 month ago

Apply

3.0 - 8.0 years

13 - 16 Lacs

Chennai

Work from Office

Key Skills required: • 3+ years of experience in Talend & SQL • Experience working with SQL and writing complex queries for data extraction, transformation, and analysis • Solid working experience with Talend Open Studio and writing ETL jobs using standard components • Experience in developing projects using Talend with Data Integration and Cloud Integration • Proficient in handling large datasets and optimizing SQL queries for performance improvement • Experience with SQL-based data transformations, indexing strategies, and query optimization techniques • Strong understanding of relational database concepts, normalization, and data modelling principles • Experience working with Data Quality using Talend Data Stewardship is an added advantage • Proficient in deploying and scheduling jobs in Talend Administration Center (TAC) • Experience in writing stored procedures, functions, views, and triggers in SQL • Perform data quality assessments, data modelling, and database-related tasks as required • Flexibility to work with business users to translate requirements into system flows, data flows, data mappings, etc. Key roles & responsibilities: • Design, develop, and optimize ETL workflows and data pipelines using Talend • Build and maintain Talend jobs for data integration, transformation, and migration across environments • Develop reusable Talend components and frameworks to improve efficiency and maintainability • Optimize Talend job performance by tuning transformations, parallel processing, and error handling • Collaborate with stakeholders to gather requirements, design dataflows, and implement ETL solutions • Ensure data integrity, validation, and quality checks within Talend workflows • Strong problem-solving skills to debug and enhance Talend jobs for scalability and performance
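The "indexing strategies and query optimization" skill above boils down to knowing when an index turns a full table scan into an index seek, and verifying that with the database's query plan. A small sqlite3 illustration (the sales table and index name are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)",
                 [("north", 10.0), ("south", 20.0), ("north", 30.0)])

# Without an index, filtering on region is a full table scan; with one, a seek.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = ?",
    ("north",),
).fetchall()
# The last column of each plan row is a human-readable description; it should
# mention idx_sales_region once the index exists.
print(plan[0][-1])
```

Most databases expose the same idea under a different name (EXPLAIN in MySQL/PostgreSQL, execution plans in SQL Server); the habit of checking the plan before and after adding an index carries over directly.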

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Pune

Work from Office

Design and execute ETL test cases to validate data extraction, transformation, and loading. Write Python scripts for test automation and data validation tasks. Perform source-to-target data mapping and verification. Develop reusable test automation. Required Candidate profile: Collaborate with data engineers, BI developers, and QA leads. Validate large datasets using advanced SQL queries. Maintain documentation of test cases. Strong proficiency in Python scripting. Perks and Benefits
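A reusable Python validation helper of the kind this role calls for typically compares cheap aggregates (row counts, keyed checksums) between a staging table and its warehouse counterpart before any row-level diffing. A sketch with sqlite3; the table and key names are hypothetical:

```python
import sqlite3

def validate_load(conn, source, target, key):
    """Compare row count and a simple keyed checksum between two tables.

    Illustrative only: table/column names are interpolated directly, so this
    assumes trusted, hard-coded identifiers.
    """
    checks = {}
    for table in (source, target):
        count, ksum = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({key}), 0) FROM {table}"
        ).fetchone()
        checks[table] = (count, ksum)
    return checks[source] == checks[target], checks

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg (id INTEGER)")
conn.execute("CREATE TABLE dw (id INTEGER)")
conn.executemany("INSERT INTO stg VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO dw VALUES (?)", [(1,), (2,)])  # one row lost in load

ok, detail = validate_load(conn, "stg", "dw", "id")
print(ok, detail)  # False {'stg': (3, 6), 'dw': (2, 3)}
```

Helpers like this drop naturally into a pytest suite, with one parametrized test per source/target pair.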

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Inspire Brands: Inspire Brands is disrupting the restaurant industry through digital transformation and operational efficiencies. The company’s technology hub, Inspire Brands Hyderabad Support Center, India, will lead technology innovation and product development for the organization and its portfolio of distinct brands. The Inspire Brands Hyderabad Support Center will focus on developing new capabilities in data science, data analytics, eCommerce, automation, cloud computing, and information security to accelerate the company’s business strategy. Inspire Brands Hyderabad Support Center will also host an innovation lab and collaborate with start-ups to develop solutions for productivity optimization, workforce management, loyalty management, payments systems, and more. We are looking for a Lead Data Engineer for the Enterprise Data Organization to design, build and manage data pipelines (data ingestion, data transformation, data distribution, quality rules, data storage etc.) for an Azure cloud-based data platform. The candidate must possess strong technical, analytical, programming and critical thinking skills. Duties and Responsibilities: Work closely with Product owners/managers to understand the business needs Collaborate with architects, data modelers and other team members to design technical details of data engineering jobs/processes that fulfill business needs Lead sessions with data engineers and provide necessary guidance to solve technical challenges Develop new and/or enhance existing data processing (Data Ingest, Data Transformation, Data Store, Data Management, Data Quality) components Actively contribute towards setting up and refining data engineering best practices Help data engineers adhere to coding standards, best practices, etc. and produce production-deployable code that is robust, scalable and reusable Support and troubleshoot the data environments, tune performance, etc.
Document technical artifacts for developed solutions Good interpersonal skills; comfort and competence in dealing with different teams within the organization. Requires an ability to interface with multiple constituent groups and build sustainable relationships. Versatile, creative temperament, ability to think out-of-the box while defining sound and practical solutions. Ability to master new skills Familiar with Agile practices and methodologies Education Requirements Min / Preferred Education Level Description: Minimum 4 Year / Bachelor’s Degree A bachelor's degree in computer science, data science, information science or related field, or equivalent work Years Of Experience Minimum Years of Experience Maximum Years of Experience Comments: 8-10 years of experience in a Data Engineering role Knowledge, Skills, and Abilities: Advanced SQL queries, scripts, stored procedures, materialized views, and views (5+ yrs experience) Focus on ELT to load data into database and perform transformations in database (5+ yrs experience) Ability to use analytical SQL functions Snowflake experience a plus. Experience building dimensional Data marts, Data lakes and warehouses. 
(5+ yrs experience) Cloud Data Warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modeling, analysis, programming – (3+ years’ experience with one or more cloud platforms) Experience with DevOps models utilizing a CI/CD tool (2+ yrs experience) Work in hands-on Cloud environment in Azure Cloud Platform (ADLS, Blob) Talend, Apache Airflow or TWS, Azure Data Factory, and BI tools like Tableau preferred (3+ years’ experience with one or more) Analyze data models Equal Employment Opportunity Policy EEO-1 Statement It is the policy of Inspire Brands Inc.™ (“IRB” or the “Company”) to treat all employees and applicants for employment fairly and to provide equal employment opportunities without regard to race, color, sex, religion, national origin or ancestry, ethnicity, sexual orientation, gender identity, age, disability, genetic information, citizenship, military service or veteran status, marital status or any other characteristic protected under applicable federal, state, or local law. This policy applies to all employment practices including recruiting, hiring, placement, pay, promotions, transfers, training, leaves of absence, and termination. Inspire Brands, Inc. expressly prohibits any form of unlawful employment harassment based on race, color, sex, religion, national origin or ancestry, ethnicity, sexual orientation, gender identity, age, disability, genetic information, citizenship, military service or veteran status, marital status or any other characteristic protected under applicable federal, state, or local law. Improper interference with the ability of IRB’s employees to perform their expected job duties will not be tolerated.
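The "analytical SQL functions" this listing emphasizes are window functions: aggregates computed over a partition without collapsing rows, which is central to ELT-style in-database transformation. A small sketch using sqlite3 (window functions require SQLite 3.25 or later; the revenue table is invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (region TEXT, month TEXT, amount REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?, ?)", [
    ("north", "2024-01", 100.0), ("north", "2024-02", 150.0),
    ("south", "2024-01", 80.0),  ("south", "2024-02", 60.0),
])

# Running total per region: SUM over a partition ordered by month keeps every
# detail row while adding the cumulative figure alongside it.
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM revenue
    ORDER BY region, month
""").fetchall()
for r in rows:
    print(r)
# ('north', '2024-01', 100.0, 100.0)
# ('north', '2024-02', 150.0, 250.0)
# ('south', '2024-01', 80.0, 80.0)
# ('south', '2024-02', 60.0, 140.0)
```

The same OVER (PARTITION BY … ORDER BY …) syntax works in Snowflake, Redshift and Azure Synapse, which is why it tends to appear in cloud-warehouse job requirements.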

Posted 1 month ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Chennai, Mumbai (All Areas)

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required Candidate profile: Strong proficiency with SQL query/development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary We are looking for a Data Engineering QA Engineer who will be responsible for testing, validating, and ensuring the quality of our data pipelines, data transformations, and analytics platforms. The role involves creating test strategies, designing test cases, and working closely with Data Engineers to ensure the accuracy, integrity, and performance of our data solutions. Key Responsibilities: Data Pipeline Testing : Test and validate data pipelines (ETL/ELT processes) to ensure accurate data movement, transformation, and integration across different platforms. Data Quality Assurance : Define and implement data quality checks, perform exploratory data testing, and monitor data for accuracy and consistency. Test Automation : Design and implement automated testing strategies for data validation using frameworks/tools like PyTest, SQL queries, or custom scripts. Collaboration : Work closely with Data Engineers, Data Analysts, and Product Managers to understand requirements and deliver test plans and strategies aligned with data engineering processes. Performance Testing : Analyze and test the performance and scalability of large-scale data solutions to ensure they meet business requirements. Defect Management : Identify, track, and resolve data quality issues and bugs, working with teams to ensure timely resolution. Compliance : Ensure that data engineering solutions comply with data governance, privacy, and security standards. Reporting : Generate testing reports and provide insights into data quality and system performance. Required Skills & Experience: Proven Experience : 3-5 years of experience as a QA Engineer, Data Engineer, or similar role in data-focused environments. Strong SQL Skills : Proficiency in writing complex SQL queries to validate and test data. ETL/ELT Experience : Familiarity with ETL/ELT tools and processes like DBT, Apache Airflow, Talend, Informatica, etc. 
Automation Frameworks : Experience with test automation frameworks and tools such as PyTest, Robot Framework, or similar. Cloud Platforms : Knowledge of cloud services (AWS, GCP, Azure) and tools like Redshift, BigQuery, Snowflake, or Databricks. Programming : Strong scripting and programming skills in Python, Java, or a similar language. Data Warehousing : Understanding of data warehousing concepts and best practices for data validation. Version Control : Experience using version control tools (e.g., Git) for code and testing artifacts. Agile Environment : Experience working in Agile/Scrum teams and knowledge of CI/CD pipelines. Attention to Detail : Meticulous when it comes to data validation, ensuring data accuracy and quality at every step. Nice to Have: Big Data Experience : Exposure to big data tools such as Hadoop, Spark, or Kafka. Data Governance & Compliance : Familiarity with GDPR, CCPA, or other data privacy regulations. BI Tools : Experience working with BI tools like Tableau, PowerBI, or Looker. Certification : AWS/GCP Data Engineering or QA certifications. Education: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
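Data quality checks like those described (null counts, uniqueness) are usually small query helpers wrapped in test assertions. A minimal sqlite3 sketch; the customers table and its columns are hypothetical, and identifiers are interpolated directly so this assumes trusted, hard-coded names:

```python
import sqlite3

def null_count(conn, table, column):
    """Number of rows where the column is NULL (illustrative helper)."""
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

def duplicate_count(conn, table, column):
    """Number of surplus rows sharing a value that should be unique."""
    return conn.execute(
        f"SELECT COALESCE(SUM(n - 1), 0) FROM "
        f"(SELECT COUNT(*) AS n FROM {table} GROUP BY {column})"
    ).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@x.com"), (2, "b@x.com"), (2, "b@x.com"), (3, None)])

print(null_count(conn, "customers", "email"))    # 1
print(duplicate_count(conn, "customers", "id"))  # 1
```

In a pytest suite each helper call becomes an assertion (e.g. assert null_count(...) == 0), and frameworks like Great Expectations or dbt tests package the same checks declaratively.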

Posted 1 month ago

Apply

10.0 - 15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Director – Data Lake & Warehousing Presales Architect Location: Greater Noida Experience Required: 10-15 years Role Overview: We are seeking a highly skilled and experienced professional to lead and support our data warehousing and data center architecture initiatives. The ideal candidate will have deep expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance , with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake . This role demands strong presales experience , technical leadership, and the ability to manage complex enterprise deals across multiple geographies. Key Responsibilities: Architect and design scalable Data Warehousing and Data Lake solutions Lead presales engagements, including RFP/RFI/RFQ lifecycle management Create and present compelling proposals and solution designs to clients Collaborate with cross-functional teams to deliver end-to-end solutions Estimate efforts and resources for customer requirements Drive Managed Services opportunities and enterprise deal closures Engage with clients across MEA, APAC, US, and UK regions Ensure alignment of solutions with business goals and technical requirements Maintain high standards of documentation and presentation for client-facing materials Must-Have: Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field Certifications in AWS, Azure, GCP, or Snowflake are a plus Experience working in consulting or system integrator environments Strong knowledge of Data Warehousing, Data Lakes, Data Integration, and Data Governance Hands-on experience with ETL tools (e.g., Informatica, Talend, etc.) 
Exposure to cloud environments: AWS, Azure, GCP, Snowflake Minimum 2 years of presales experience with understanding of presales operating processes Experience in enterprise-level deals and Managed Services Proven ability to handle multi-geo engagements Excellent presentation and communication skills Strong understanding of effort estimation techniques for customer requirements

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

About This Role Aladdin Data is at the heart of Aladdin and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin. Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next generation technologies and solutions. Our engineers design and build large scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high quality analytical and derived data to our consumers. We are looking for data engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates. Responsibilities Data Pipeline Engineers are expected to be involved from inception of projects, understand requirements, architect, develop, deploy, and maintain data pipelines (ETL / ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!) which involves partnering with program and product managers to expand product offering based on business demands. 
Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role. An ideal candidate would have: At least 4+ years’ experience as a data engineer Experience in SQL, Sybase, Linux is a must Experience coding in two of these languages for server-side/data processing is required: Java, Python, C++ 2+ years’ experience using a modern data stack (Spark, Snowflake, BigQuery etc.) on cloud platforms (Azure, GCP, AWS) Experience building ETL/ELT pipelines for complex data engineering projects (using Airflow, dbt, Great Expectations would be a plus) Experience with Database Modeling, Normalization techniques Experience with object-oriented design patterns Experience with DevOps tools like Git, Maven, Jenkins, Gitlab CI, Azure DevOps Experience with Agile development concepts and related tools Ability to troubleshoot and fix performance issues across the codebase and database queries Excellent written and verbal communication skills Ability to operate in a fast-paced environment Strong interpersonal skills with a can-do attitude under challenging circumstances BA/BS or equivalent practical experience Skills That Would Be a Plus Perl, ETL tools (Informatica, Talend, dbt etc.)
Experience with Snowflake or other Cloud Data warehousing products Exposure with Workflow management tools such as Airflow Exposure to messaging platforms such as Kafka Exposure to NoSQL platforms such as Cassandra, MongoDB Building and Delivering REST APIs Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. 
It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 1 month ago

Apply

0.0 - 70.0 years

0 Lacs

Mumbai, Maharashtra

On-site

Job Description Digital Finance Manager – India The ideal candidate will act as the bridge between Finance and IT, bringing hands-on expertise in tools like SAP, Power BI, Alteryx, and RPA platforms, and will play a pivotal role in identifying and delivering finance automation projects aligned with business needs. Purpose of the Role To drive the Digital Finance India Agenda, aligned with Mondelez India SP, by: Bringing in best-in-class business practices, Evaluating digital technologies, Engaging finance and business stakeholders, Driving automation and simplification of financial processes, Enabling future-ready finance operations with minimum manual intervention. Role Overview Acts as a bridge between Finance sub-functions and IT Services. It is also responsible for identifying opportunities, and for finding and implementing solutions, for various processes that are intertwined between Finance and other functions. You will be responsible for ensuring that Finance IBS projects are successfully delivered on time and on budget. This includes project governance, budget and timeline development, build quality, testing and operational readiness, and the completed project’s readiness to go live; working with project resources to provide design collateral and to configure software components so they are aligned with security policy and governance; and ensuring adherence to development and configuration standards and processes. Focuses on identifying automation opportunities across finance processes—especially those that are currently manual (e.g., cash flow statements, reconciliation, reporting). Leads and governs end-to-end project delivery within time and budget (including testing, design, rollout readiness). Drives process redesign and software configuration aligned with security and compliance standards. Important Note: This is not a pure IT role.
It requires strong finance acumen and the ability to understand financial reporting, controls, compliance, and analysis needs while embedding digital solutions. Key Accountabilities Develop and implement short, medium, and long-term digital strategies for Finance India. Identify, evaluate, and implement finance automation opportunities (internal + external). Deliver data transformation, automation, visualization, and dashboarding solutions. Manage digital finance projects , ensuring timelines, budgets, and stakeholder expectations are met. Evaluate current finance processes to identify areas for automation, controls improvement, and simplification. Implement new digital tools to improve efficiency and competitiveness. Train finance teams on emerging tools and technologies. Be the go-to digital expert within Finance for process innovation. Collaborate with global and regional stakeholders, including Global Finance Solution Owners and Business Tower leads. Translate business requirements into functional and technical specs. Qualifications & Experience CA or MBA from a reputed university. 8–10 years of progressive experience in finance transformation, with strong focus on analysis, reporting, and forecasting Demonstrated expertise in digital tools relevant to finance, including: SAP (S/4HANA, Hyperion, SAP Analytics Cloud) Power BI, Tableau Robotic Process Automation (RPA) Low-Code/No-Code Platforms Hands-on experience in data engineering and analytics tools, such as: Alteryx, Collibra, Talend, Microsoft platform Exposure to finance transformation or consulting, ideally within the FMCG industry, is a strong plus. Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary Mondelez India Foods Private Limited (formerly Cadbury India Ltd.) 
has been in India for over 70 years, making sure our mouth-watering and well-loved local and global brands such as Cadbury chocolates, Bournvita and Tang powdered beverages, Oreo and Cadbury Bournvita biscuits, and Halls and Cadbury Choclairs Gold candies get safely into our customers hands—and mouths . Headquartered in Mumbai, the company has more than 3,300 employees proudly working across sales offices in New Delhi, Mumbai, Kolkata and Chennai and in manufacturing facilities at Maharashtra, Madhya Pradesh, Himachal Pradesh and Andhra Pradesh, at our global Research & Development Technical Centre and Global Business Hub in Maharashtra and in a vast distribution network across the country. We are also proud to be recognised by Avatar as the Best Companies for Women in India in 2019 – the fourth time we’ve received this award. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Finance Planning & Performance Management Finance

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Details: Job Description
Experience: 5+ years. We are looking for someone with 5+ years of hands-on experience; knowledge of other tools such as Databricks will be an added advantage.
The Talend developer role is responsible for designing, developing, and maintaining scalable ETL solutions on the Talend platform in order to improve the data quality of our CRM ecosystem applications and reduce manual data processing. This role is a key part of the HARMONIA project team while the engagement is active, and will fully support Talend development and software testing as its main contribution to the team. The role will also be part of the Harmonia Data Quality project and Data Operations scrum teams, contributing to additional activities such as unit testing, integration/business user testing, and operational support during the engagement. The position holder must organize and plan her/his work with special consideration for a frictionless information flow within Digital Solutions and the relevant business department, and must ensure excellent co-operation with all Digital members, business representatives, and external experts where applicable.
Job Requirements Details:
Responsible for development of Talend jobs and configuration.
Responsible for delivering tested, validated, deployable jobs to production environments by following Talend best practices and the JIRA development framework.
Translate business requirements into efficient and scalable Talend solutions, and assist the Solutions Architect with input and feedback on those requirements where necessary by actively participating in brainstorming sessions arranged by the Project Manager.
Work closely with the Manager of Data Operations and Quality, project manager, business analysts, data analysts, Talend Solutions Architect, other developers, and other subject matter experts to align technical solutions with operational needs.
Ensure alignment with data governance, security, and compliance standards.
Ensure that newly produced jobs follow the standard styles already used in the current Talend jobs and flows developed by the current integrator and INSEAD teams.
Apply best practices in error handling, job orchestration, performance tuning, and logging.
Reuse components and templates to drive consistency and maintainability across integration processes.
Monitor production Talend jobs and respond to incidents or failures to ensure operational continuity.
Collaborate with SQA and data governance teams to support data validation, cleansing, and quality improvement efforts.
Contribute to sprint planning and agile ceremonies with the Harmonia Project and Data Operations teams.
Document ETL logic, data mappings, job configurations, and scheduling dependencies.
Perform unit testing and support user acceptance testing (UAT) activities.
Actively participate in project-related activities and ensure the SDLC process is followed.
No budget responsibility.
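Talend builds these jobs graphically, so the error-handling and logging practices listed above are configured rather than hand-coded; as a minimal illustrative sketch (the `run_with_retry` helper and the flaky extract step are hypothetical, not Talend APIs), the same pattern looks like this in plain Python:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retry(step, name, retries=3, delay=1.0):
    """Run one ETL step, logging each failure and retrying transient errors."""
    for attempt in range(1, retries + 1):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            log.warning("step %s failed on attempt %d: %s", name, attempt, exc)
            if attempt == retries:
                raise  # give up and let the orchestrator mark the job failed
            time.sleep(delay)

# Hypothetical extract step that fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source timeout")
    return ["row1", "row2"]

rows = run_with_retry(flaky_extract, "extract", retries=3, delay=0.01)
print(len(rows))  # prints 2
```

The point of the pattern is that every failure is logged with enough context (step name, attempt number, error) to diagnose incidents, and only the final failure propagates to the scheduler.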

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 2 Lacs

Hyderabad

Work from Office

• 3+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs.
• Strong hands-on experience with the Snowflake data warehouse, including data integration and transformation.
• Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation.
• Experience with SQL and relational databases to extract and manipulate data.
• Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms.
• Strong knowledge of data integration, data quality, and performance optimization in Talend.
• Ability to troubleshoot and resolve issues in data integration jobs and processes.
• Solid understanding of data modeling concepts and best practices for building scalable data pipelines.
Role: ETL Production Support
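In Talend the extract-transform-load workflow above is assembled from graphical components; purely as a sketch of the same shape, the following Python snippet uses in-memory sqlite3 databases to stand in for the source system and the warehouse target (all table names, columns, and values are hypothetical):

```python
import sqlite3

# In-memory databases standing in for a source system and a warehouse target
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, status TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1200, "open"), (2, 1000, "Closed"), (3, 40000, "OPEN")])

tgt.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL, status TEXT)")

# Extract: pull the raw rows from the source
rows = src.execute("SELECT id, amount_cents, status FROM orders").fetchall()

# Transform: convert cents to currency units, normalise status casing
clean = [(oid, cents / 100.0, status.upper()) for oid, cents, status in rows]

# Load: write the transformed rows into the target table
tgt.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean)
tgt.commit()

count, total = tgt.execute("SELECT COUNT(*), SUM(amount) FROM orders_clean").fetchone()
print(count, total)  # prints 3 422.0
```

A production pipeline would add the concerns the listing names (data quality checks, performance tuning, troubleshooting failed loads), but the extract/transform/load split stays the same.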

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Pune, Chennai

Work from Office

Design and implement automated test cases for ETL processes and data pipelines. Perform data validation, data transformation, and reconciliation testing. Write and execute complex SQL queries to validate source-to-target data mappings.
Required Candidate profile
Work closely with ETL developers, business analysts, and QA teams. Log, track, and report defects using tools like JIRA, HP ALM, or TestRail. Support regression testing, UAT, and performance testing for ETL jobs.
Perks and Benefits
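As a sketch of the source-to-target reconciliation this role performs (table names and sample data are hypothetical), the following Python/sqlite3 snippet compares row counts and then flags keys that are missing on either side or whose values disagree:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, amount INTEGER);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount INTEGER);
INSERT INTO src VALUES (1, 100), (2, 200), (3, 300);
INSERT INTO tgt VALUES (1, 100), (2, 250), (4, 300);
""")

# Row-count reconciliation between source and target
src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Source-to-target mapping validation: keys missing on either side,
# plus keys whose loaded value differs from the source value
mismatches = conn.execute("""
    SELECT s.id, t.id, s.amount, t.amount
    FROM src s LEFT JOIN tgt t ON s.id = t.id
    WHERE t.id IS NULL OR s.amount <> t.amount
    UNION ALL
    SELECT NULL, t.id, NULL, t.amount
    FROM tgt t LEFT JOIN src s ON t.id = s.id
    WHERE s.id IS NULL
""").fetchall()

print(src_count, tgt_count, len(mismatches))  # prints 3 3 3
```

Here the counts match, but the detail query still catches three defects worth logging: id 2 was loaded with the wrong amount, id 3 never reached the target, and id 4 exists only in the target.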

Posted 1 month ago

Apply

3.0 - 6.0 years

17 - 18 Lacs

Bengaluru

Hybrid

Hi all, we are looking for an ETL Developer cum Java Support Engineer.
Experience: 3 - 6 years
Notice period: Immediate - 15 days
Location: Bengaluru
Description: ETL Developer cum Java Support Engineer
Job Summary: We are seeking a versatile professional who can seamlessly blend ETL development expertise with Java-based application support. This hybrid role involves designing and maintaining ETL pipelines while also managing Java support tickets, troubleshooting issues, and ensuring smooth system operations.
Key Responsibilities:
ETL Development: Design, develop, and maintain ETL workflows to extract, transform, and load data from various sources. Optimize ETL processes for performance, scalability, and reliability. Collaborate with data analysts and business stakeholders to understand data requirements. Ensure data quality, integrity, and compliance with governance standards. Document ETL processes and maintain metadata.
Java Support: Monitor and resolve Java application support tickets within defined SLAs. Debug and troubleshoot Java-based backend issues in production and staging environments. Collaborate with development teams to implement bug fixes and enhancements. Perform root cause analysis and provide long-term solutions. Maintain logs, reports, and documentation for support activities.
Required Skills: Proficiency in ETL tools (e.g., Informatica, Talend, SSIS). Strong SQL skills and experience with relational databases (Oracle, MySQL, PostgreSQL). Solid understanding of Java and related frameworks (Spring, Hibernate). Familiarity with version control systems (Git) and ticketing tools (JIRA, ServiceNow). Excellent problem-solving and communication skills.
Preferred Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. Experience with cloud platforms (AWS, Azure) and data lakes is a plus. Knowledge of data warehousing concepts and data modeling.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Department: Process Simplification and Optimisation
Location: India
Reports To: Application Development and Support Lead
Level: Grade 3
About your team
Process Simplification and Optimisation provides groundbreaking solutions to deliver sustainable business value. By powerfully combining the human and digital workforce, PSO places itself in a leading position to respond and adapt in a continuously evolving ecosystem, achieve Fidelity's strategic goals, deliver best-in-class service to our clients, and provide exciting growth opportunities for our employees. The Delivery Enablement stream within PSO supports delivery by building capability, providing business process support, ensuring tools and methodologies remain up to date with industry technology developments, and delivering the ongoing control framework for our new process orchestration, automation, and digitisation solutions.
About your role
This role serves as a member of the Fidelity India team under the PSO umbrella, supporting the Fidelity Clearing Canada (FCC) technology team in a technical support and developer capacity. You will work to embed innovation across the business, maintaining consistency and standards to maximise business benefits. You will ensure the seamless operation of automated workflows, scripts, and orchestration processes for FCC applications currently in production. This includes proactive monitoring, rapid incident resolution, scripting and automation development, collaboration with cross-functional teams, and continuous improvement efforts to enhance system performance, reliability, and security. The goal is to maintain optimal functionality, minimize downtime, and contribute to the overall efficiency of automated systems, aligning with organizational objectives and standards. You will also be responsible for the development, enhancement, and maintenance of application solutions for internal and external clients.
You will work to progress Fidelity's PSO and FCC Technology support team agenda by:
Application Development Support: Ensure that all requests raised by clients and users are handled timely and appropriately, drawing on technical knowledge of operating systems, applications, and the software development lifecycle. Provide technical support to teams within the organization, and to external clients when required. Update technical documents and procedures to reflect the current state. Provide support for application deployments. Assist with systems integration when needed. Collaborate with the on-site Application Development Support team.
Appian Application Support: Troubleshoot, fix, and enhance defects raised in Appian-based applications. Understand the differences between REST and SOAP and the basic design principles of integrating with web services. Debug issues in interfaces, process models, and integrations and provide short-term and long-term solutions. Identify chokepoints and provide design recommendations to enhance the performance of the application. Provide technical guidance to junior developers as required.
Defect Remediation: Remediate defects based on business and client priorities to address service disruptions, incidents, and problems.
Client Experience: Deliver quality customer service interactions to our internal and external customers to create a positive experience. Take ownership of solving a customer's problem promptly; use all available resources to achieve the best outcome.
About You
Skills and Knowledge
Strong technical insight and experience to inform, guide, challenge, and support technical decisions. Strong analytical, conceptual, and innovative problem-solving abilities. Strong attention to detail. Ability to work independently within a team environment. Excellent communication skills, both written and oral; ability to effectively communicate technical material to non-technical users. Goal-oriented and a self-starter.
Ability to quickly learn, adapt, and change to meet the needs of a changing environment. Ability to explain complex ideas to those with limited IT and systems knowledge. Excellent problem-solving skills. Customer service oriented. Ability to work in a fast-paced environment without direct supervision. Development or support experience in the Canadian financial industry is an asset. Track record of actively seeking opportunities for process improvements, efficiency gains, and system optimizations in the context of automation and orchestration.
Experience and Qualifications
Job-related experience minimum requirement: 4+ years.
Must have: 3+ years of experience as a developer or programmer/support engineer, including 2+ years of experience in the brokerage securities/asset management industry. 2+ years of Appian BPM (or similar) hands-on development experience. 1+ years of experience and intermediate-level knowledge of Java/J2EE development, including Spring, Hibernate, MyBatis, JPA, RESTful APIs, and Spring Boot. Strong hands-on knowledge of SQL, database platforms such as MySQL, SQL Server, and Oracle, and database design. Working knowledge of the Unix/Linux operating system and shell scripts. Exposure to automated testing, DevOps, and change management concepts. Experience with agile development methodologies.
Nice to have: Experience with the following: Power BI, Talend, ETL, Data Warehouse, Control-M. uniFide, Salesforce, or the Dataphile platform would be an asset. Working knowledge of HTML and adaptive/responsive design. DocuSign and document management platforms. Atlassian stack (JIRA, Confluence). Hands-on expertise in creating high-performance web applications leveraging React or Angular 2. Some knowledge of concepts such as TypeScript, the Bootstrap grid system, dependency injection, and SPAs (Single Page Applications). Experience with cloud-based implementations. Experience in setting up Secure File Transfer Protocol (SFTP) and file delivery. AWS would be an asset.
Education: First degree level (Bachelor's degree) or equivalent in Computer Science. Knowledge of the financial services industry.
Dynamic Working: This role is categorised as Hybrid (Office/Remote).

Posted 1 month ago

Apply