0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions such as AWS.
Must have: AWS, Databricks
Good to have: PySpark, Snowflake, Talend
Requirements
Candidate must be experienced working in projects involving the technologies above. Other ideal qualifications include experience in:
- Processing data pipelines using Databricks Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc.
- Very proficient in large-scale data operations using Databricks and overall very comfortable using Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience working with an S3 data lake as the storage tier.
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
- Cloud warehouse experience (Snowflake, etc.) is a huge plus.
- Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.
Skills
- Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience with shell scripting.
- Exceptionally strong analytical and problem-solving skills.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
(ref:hirist.tech)
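For context on the kind of work this posting describes, here is a minimal, hedged PySpark sketch of a Databricks-style pipeline that reads raw JSON from an S3 data lake, cleans it, and writes a curated, partitioned layer back to S3. The bucket paths, column names, and transformations are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided as `spark`; creating one
# explicitly keeps the sketch runnable outside a notebook as well.
spark = SparkSession.builder.appName("s3-orders-pipeline").getOrCreate()

# Read raw JSON landed in the S3 data lake (path is a placeholder).
raw = spark.read.json("s3://example-data-lake/raw/orders/")

# Typical large-scale cleanup: cast types, drop bad rows, derive columns.
cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write back to the curated zone, partitioned for downstream Spark SQL.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-data-lake/curated/orders/"))
```

On Databricks the curated output could just as easily be written as a Delta table; Parquet is used here only to keep the sketch generic.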
Posted 1 month ago
7.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior Data Engineer (Azure, ETL, Snowflake)
Experience: 7+ years.
Location: Remote.
Job Summary
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools like ADF, Databricks, Fivetran, and DBT.
Key Responsibilities
- Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake.
- Write optimized SQL queries, stored procedures, and views to transform and retrieve data.
- Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc.
- Work on data mapping, modelling, and transformation tasks across multiple data sources.
- Build and deploy custom connectors using Python, PySpark, or ADF.
- Implement and manage Snowflake as a data storage and processing solution.
- Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub.
- Ensure smooth cloud migration and data pipeline deployment using Azure services.
- Work with Fivetran and DBT for ingestion and transformation as required.
- Participate in Agile/Scrum ceremonies and follow DevSecOps practices.
Mandatory Skills & Qualifications
- 7+ years of experience in Data Engineering, ETL development, or similar roles.
- Proficient in SQL with a strong understanding of joins, filters, and aggregations.
- Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.).
- Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage.
- Experience with Azure cloud services, specifically Azure Data Factory (ADF), Databricks, and Azure Data Lake.
- Hands-on experience in Snowflake implementation (ETL or storage layer).
- Familiarity with data modelling, data mapping, and pipeline creation.
- Experience working with semi-structured/unstructured data formats.
- Working knowledge of GitHub for version control and code management.
Good To Have / Preferred Skills
- Experience using Fivetran and DBT for ingestion and transformation.
- Knowledge of AWS or GCP cloud environments.
- Familiarity with DevSecOps processes and CI/CD pipelines within Azure.
- Proficiency in Excel and macros.
- Exposure to Agile methodologies (Scrum/Kanban).
- Understanding of custom connector creation using PySpark or ADF.
Soft Skills
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.
- Ability to work independently and take ownership of deliverables.
- Detail-oriented with a commitment to quality.
Why Join Us?
- Work on modern, cloud-based data platforms.
- Exposure to a diverse tech stack and new-age data tools.
- Flexible remote working opportunity aligned with a global team.
- Opportunity to work on critical enterprise-level data solutions.
(ref:hirist.tech)
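The responsibilities above mention building custom connectors in Python and ingesting semi-structured formats such as JSON into the Azure/Snowflake stack. A minimal sketch of such a connector is shown below; the API endpoint, response shape, and output file are illustrative assumptions rather than details from the posting.

```python
import requests
import pandas as pd

# Hypothetical REST endpoint; in practice the source system and auth
# scheme come from the project, not from this sketch.
API_URL = "https://api.example.com/v1/customers"

def extract(url: str) -> list[dict]:
    """Pull one page of records from the source API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()["records"]  # assumed response key

def transform(records: list[dict]) -> pd.DataFrame:
    """Flatten nested JSON (e.g. address.city -> address_city)."""
    return pd.json_normalize(records, sep="_")

def load(df: pd.DataFrame, path: str) -> None:
    """Write an analytics-friendly Parquet file for downstream ingestion."""
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract(API_URL)), "customers.parquet")
```

In a production pipeline a script like this would typically be wrapped in an ADF custom activity or a Databricks job and land its output in a staging location that Snowflake then ingests.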
Posted 1 month ago
5.0 years
0 Lacs
Greater Kolkata Area
Remote
Experience: 5 to 8 years.
Location: Remote.
We are seeking a Senior Informatica ETL Developer with 5 to 8 years of hands-on experience in Informatica development and ETL processes. The ideal candidate will have a strong background in designing, developing, and optimizing ETL workflows, with significant expertise in Amazon Redshift, SQL, and ETL migration projects from tools like Talend or DataStage to Informatica (PowerCenter or IICS).
Key Responsibilities
- Design, develop, and maintain ETL workflows using Informatica PowerCenter and/or Informatica Cloud (IICS).
- Migrate ETL jobs from Talend or DataStage to Informatica, ensuring performance, accuracy, and maintainability.
- Write and optimize complex SQL queries, primarily in Amazon Redshift.
- Collaborate with business and technical teams to gather and understand requirements and deliver data solutions.
- Work with large volumes of structured and unstructured data, ensuring performance and data integrity.
- Support and troubleshoot complex ETL jobs and data pipelines.
- Conduct performance tuning and issue resolution for ETL workflows.
- Leverage data warehousing and data lakehouse architecture principles to design robust data integration solutions.
- Ensure scalability, reliability, and quality of data systems.
Required Skills & Qualifications
- 5+ years of hands-on experience in Informatica ETL development (PowerCenter/IICS).
- Strong experience in ETL migration from Talend or DataStage to Informatica.
- Proficiency in Amazon Redshift with solid SQL development and optimization skills.
- Deep understanding of data warehousing concepts, architectures, and best practices.
- Familiarity with data lakehouse architectures and integration strategies.
- Experience with cloud data platforms (AWS, Azure, GCP) is a plus.
- Knowledge of additional ETL tools and big data ecosystems is a bonus.
- Familiarity with Agile/Scrum methodologies.
(ref:hirist.tech)
Posted 1 month ago
7.0 - 14.0 years
0 Lacs
Greater Kolkata Area
On-site
Key Responsibilities
- Develop and optimize complex SQL queries, including joins (inner/outer), filters, and aggregations.
- Work with diverse datasets from multiple database sources, ensuring data quality and integrity.
- Leverage Python for data manipulation, including functions, iterations, API requests, and JSON flattening.
- Use Python to interpret, manipulate, and process data to facilitate downstream analysis.
- Design, implement, and optimize ETL processes and workflows.
- Manage data ingestion from various formats (e.g., JSON, Parquet, TXT, XLSX) using tools like Informatica, Teradata, DataStage, Talend, and Snowflake.
- Demonstrate expertise in Azure services, specifically ADF, Databricks, and Azure Data Lake.
- Create, manage, and optimize cloud-based data pipelines.
- Integrate data sources via Fivetran or custom connectors (e.g., PySpark, ADF).
- Lead the implementation of Snowflake as an ETL and storage layer.
- Ensure seamless data connectivity, including handling semi-structured/unstructured data.
- Promote code and manage changes across various environments.
- Proficient in writing complex SQL scripts, including stored procedures, views, and functions.
- Hands-on experience with Snowflake in multiple projects.
- Familiarity with DBT for transformation logic and Fivetran for data ingestion.
- Strong understanding of data modeling and data warehousing fundamentals.
- Experience with GitHub for version control and code management.
Skills & Experience
- 7 to 14 years of experience in Data Engineering, with a focus on SQL, Python, ETL, and cloud technologies.
- Hands-on experience with Snowflake implementation and data pipeline management.
- In-depth understanding of Azure cloud tools and services, such as ADF, Databricks, and Azure Data Lake.
- Expertise in designing and managing ETL workflows, data mapping, and ingestion from multiple data sources/formats.
- Proficient in Python for data interpretation, manipulation, and automation tasks.
- Strong knowledge of SQL, including advanced techniques such as stored procedures and functions.
- Experience with GitHub for version control and collaborative development.
Good to Have
- Experience with other cloud platforms (e.g., AWS, GCP).
- Familiarity with DataOps and continuous integration/continuous delivery (CI/CD) practices.
- Prior experience leading or mentoring teams of data engineers.
(ref:hirist.tech)
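Many of the responsibilities above come down to running transformation SQL against Snowflake from Python. A hedged sketch of one such ELT step, an upsert via MERGE using the Snowflake Python connector, follows; the account, credentials, and table names are placeholders, not details from the posting.

```python
import snowflake.connector

# Connection details are placeholders; real values would come from a
# secrets manager, never hard-coded.
conn = snowflake.connector.connect(
    account="xy12345.east-us-2.azure",
    user="ETL_SVC",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# A typical ELT step: upsert the latest staged rows into the target table.
MERGE_SQL = """
MERGE INTO ANALYTICS.CORE.DIM_CUSTOMER AS tgt
USING ANALYTICS.STAGING.CUSTOMER_UPDATES AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    tgt.email      = src.email,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at)
"""

try:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
        print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```

The same pattern extends to calling stored procedures or promoting code between environments by parameterizing the database and schema names.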
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
Position Summary: Support engineers to support the lower environments in a 16x5 shift. The ideal candidate has strong Java experience and knows the data technologies and monitoring tools mentioned below, with AWS experience. We expect 6-7 resources to be strong in this category, with the rest focused on ETL and cloud-specific skills.
- Minimum of 4 to 8+ years of related experience.
- Bachelor’s degree in computer science or a related technical field required; master’s degree and/or equivalent experience preferred.
- Expertise in supporting technologies such as Java, web and application servers, message queuing, ETL (Snowflake, DataStage, Informatica, Talend, Jenkins), and databases both on premise and in the cloud (AWS).
- Must have a deep understanding of Java application architecture and hands-on experience with application troubleshooting.
- Knowledge of troubleshooting and automation at all software layers (e.g., UI, services, APIs, databases, etc.).
- Basic working knowledge of Unix and Windows, with the ability to navigate Unix servers, analyze logs, and handle basic Unix/Windows commands.
- Basic working knowledge of executing DB (DML & DDL) scripts and sending outputs whenever required.
- Working experience in DevOps and CI/CD related technologies (Jenkins, Maven, Bitbucket, SonarQube, Nexus).
- Good to have: scripting knowledge (any programming language), a drive to learn, and eagerness to automate with the tools available at the organization.
- Good to have: prior experience in alerting and event monitoring models and technologies such as ServiceNow (SNOW), Dynatrace, and Splunk.
- Sound knowledge in support of core AWS cloud services (knowledge of Lambda, ECS, Glue services, Docker containers, CloudWatch).
- Has a growth mindset, including thinking creatively to produce solutions when tools do not work, partnering and working hands-on to provide solutions (automation through available tools), and effectively working on multiple work streams in a fast-paced environment.
- Able to clearly articulate complex technical concepts in a way that is easily understood by audiences of varied backgrounds.
- Effective communicator who can be the technical spokesperson in broader architecture and technology discussions.
- Ability to work on multiple concurrent initiatives. Experience dealing with cross-functional teams. Exceptional collaborator.
Leadership Competencies for this level include:
- Accountability: Demonstrates reliability by taking necessary actions to continuously meet required deadlines and goals.
- Global Collaboration: Applies a global perspective when working within a team by being aware of one's own style and ensuring all relevant parties are involved in key team tasks and decisions.
- Communication: Articulates information clearly and presents information effectively and confidently when working with others.
- Influencing: Convinces others by making a strong case, bringing others along to their viewpoint; maintains strong, trusting relationships while at the same time being comfortable challenging ideas.
- Innovation and Creativity: Thinks boldly and out of the box, generates new ideas and processes, and confidently pursues challenges as new avenues of opportunity.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent work experience).
Proven experience in managing and supporting MacBook devices within an enterprise environment. Strong proficiency in macOS, including configuration, troubleshooting, and scripting. In-depth knowledge of Microsoft Intune and its application to macOS device management. Proficiency in administering JAMF Pro for macOS device management. Familiarity with cybersecurity best practices for macOS devices. Excellent problem-solving and communication skills. Certifications in macOS, Microsoft Intune, and JAMF are a plus. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 1 month ago
5.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
IT Engineer - Automation: Responsible for systems testing. The role includes analysing and reviewing systems requirements with development teams/SMEs to understand project objectives and gathering requirements to design reusable and maintainable functional and automated tests. Exposure to guiding and leading a team to meet the objectives. Good communication skills.
Responsibilities:
• Participate in test planning, test case writing, and test execution using a hybrid approach of automation and manual testing.
• Conduct manual and automated test lead activities.
• Handle the defect lifecycle process, from reporting bugs to closure.
• Identify and prepare test data for testing purposes.
• Regularly update the Test/Project Manager on progress and status.
• Attend project-related meetings as needed.
• Maintain documentation for manual testing and automation within the project.
• Support User Acceptance Testing (UAT).
• Perform other testing-related tasks as assigned by the project.
Technical Skills / Knowledge Required:
• Around 5 years of hands-on experience in automation and manual testing for medium to large complex projects using Java and Selenium.
• Proficient in writing test cases and executing them in an Agile environment.
• Hands-on experience in Selenium with Java coding for test automation; good knowledge of the BDD framework.
• Proficient in testing REST components/APIs for web services and web applications using tools like Postman.
• Hands-on experience in building automated test scripts.
• Demonstrated proficiency with SQL for creating/modifying queries for backend testing.
• Must have expertise in using defect management tools, processes, and reporting.
• Experience in running automation scripts through CI/CD pipelines using tools like Jenkins or similar.
Personal/Soft Skills:
• Excellent written and oral communication skills, with the ability to present analysis of results in a clear and concise manner.
• Ability to do presentations and walkthroughs with systems and business personnel.
• Ability to work well in a fast-paced environment under deadlines in a changing environment.
• Must be organized and detail oriented.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Migration Engineer Experience: 4 to 6 years Location: Noida & Bhubaneswar Notice Period: We are looking for Immediate Joiners only. Key Responsibilities: Design and implement end-to-end data migration and integration solutions. Develop and manage ETL pipelines using tools like Talend, Apache NiFi, Informatica, or AWS Glue. Work with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases. Leverage Google BigQuery for data ingestion, transformation, and performance optimization. Write custom ETL logic using SQL, Python, or Shell scripting. Ensure data integrity, quality, and consistency throughout migration processes. Collaborate with cross-functional teams to understand business requirements and provide data solutions. Troubleshoot and resolve data migration issues and performance bottlenecks. Required Qualifications: 4–6 years of hands-on experience in data migration, data integration, and ETL development. Proficient in SQL and scripting languages such as Python or Shell. Experience with Google BigQuery and performance tuning. Knowledge of relational databases (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL databases (MongoDB, Cassandra, DynamoDB). Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue. Experience working in cloud environments such as AWS, GCP, or Azure. Solid understanding of data modeling, schema design, and transformation best practices.
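Since the role above centres on moving data into Google BigQuery, a minimal sketch of a batch load using the official Python client is shown below; the project, dataset, and bucket names are illustrative assumptions, not details from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Illustrative destination table and source files; real names would come
# from the migration mapping, not from this sketch.
table_id = "example-project.sales_mart.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                       # infer schema for the first load
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/exports/orders_*.csv",
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load finishes

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```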
Posted 1 month ago
3.0 - 7.0 years
45 - 50 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
1. Governance Strategy & Stakeholder Alignment
- Develop and maintain enterprise data governance strategies, policies, and standards.
- Align governance with business goals: compliance, analytics, and decision-making.
- Collaborate across business, IT, legal, and compliance teams for role alignment.
- Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation
- Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure.
- Optimize Purview setup for large-scale environments (50TB+).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule scans, set classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management
- Design metadata repositories and maintain business glossaries and data dictionaries.
- Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions.
- Ensure lineage mapping (ADF, Synapse, Power BI) and impact analysis.
4. Data Classification & Security Governance
- Define classification rules and sensitivity labels (PII, PCI, PHI).
- Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager.
- Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management
- Define KPIs and dashboards to monitor data quality across domains.
- Collaborate on rule design, remediation workflows, and exception handling.
- Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship
- Maintain business glossary with domain owners and stewards in Purview.
- Enforce approval workflows, standard naming, and steward responsibilities.
- Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps.
- Create pipelines for ingestion, lineage, glossary updates, tagging.
- Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance
- Set up dashboards for audit logs, compliance reporting, metadata coverage.
- Oversee data lifecycle management across its phases.
- Support internal and external audit readiness with proper documentation.
Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Microsoft Purview capabilities: 1. Label creation & policy setup 2. Auto-labeling & DLP 3. Compliance Manager, Insider Risk, Records & Lifecycle Management 4. Unified Catalog, eDiscovery, Data Map, Audit, Compliance alerts, DSPM
Required Qualifications:
- 7+ years of experience in data governance and data management.
- Proficient in Microsoft Purview and Informatica data governance tools.
- Strong in metadata management, lineage mapping, classification, and security.
- Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools.
- Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs.
- Skilled in bridging technical governance with business and compliance goals.
Posted 1 month ago
0 years
3 - 5 Lacs
Chennai
On-site
DBT responsibilities include designing, developing, and handling technical architecture, data pipelines, and performance scaling using tools to integrate Talend data and ensure data quality in a big data environment. The candidate must be very strong in PL/SQL, including queries, procedures, and JOINs. Experience in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts using Unix, Python, etc., to perform Extract, Load, and Transform operations is essential. It is good to have knowledge and hands-on experience with Fivetran. Candidates who have worked in production support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures is required. Responsibilities also include performing data analysis, troubleshooting data issues, and providing technical support to end-users. The role involves developing and maintaining data warehouse and ETL processes, ensuring data quality and integrity. The ideal candidate should have a strong capability for complex problem-solving and a continuous improvement mindset. A DBT or Snowflake certification is desirable. Strong SQL coding, communication, and documentation skills are essential. Familiarity with Agile delivery processes is required. Candidates must be analytical, creative, and self-motivated, and should be able to work effectively within a global team environment. Excellent communication skills are a must. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
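The posting above highlights Snowflake utilities such as Snowpipe, Streams, and Tasks. As a hedged illustration of how those pieces can fit together, the sketch below uses the Snowflake Python connector to create a stream on a raw table and a scheduled task that drains it into a curated table; all object names, schedules, and credentials are illustrative assumptions, and this is just one common change-capture pattern rather than the project's actual design.

```python
import snowflake.connector

# Placeholders only; real credentials belong in a secrets manager.
conn = snowflake.connector.connect(
    account="ab12345", user="ELT_SVC", password="***",
    warehouse="ELT_WH", database="SALES", schema="RAW",
)

statements = [
    # Capture inserts/updates on the raw table as they arrive (e.g. via Snowpipe).
    "CREATE STREAM IF NOT EXISTS RAW.ORDERS_STREAM ON TABLE RAW.ORDERS",

    # A scheduled task drains the stream into the curated layer.
    """
    CREATE TASK IF NOT EXISTS RAW.LOAD_ORDERS_CURATED
      WAREHOUSE = ELT_WH
      SCHEDULE  = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO CURATED.ORDERS
      SELECT order_id, customer_id, amount, order_ts
      FROM RAW.ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,

    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK RAW.LOAD_ORDERS_CURATED RESUME",
]

try:
    with conn.cursor() as cur:
        for stmt in statements:
            cur.execute(stmt)
finally:
    conn.close()
```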
Posted 1 month ago
5.0 years
15 - 25 Lacs
Mumbai Metropolitan Region
On-site
Data Engineer – On-Site, India Industry: Enterprise Data Analytics & Digital Transformation Consulting. We architect and operationalize large-scale data platforms that power BI, AI, and advanced reporting for global clients across finance, retail, and manufacturing. Leveraging modern cloud services and proven ETL frameworks, our teams turn raw data into trusted, analytics-ready assets that accelerate business decisions. Role & Responsibilities Design, build, and optimize end-to-end ETL pipelines that ingest, cleanse, and transform high-volume datasets using SQL and ELT best practices. Create scalable data models and dimensional schemas to support reporting, dashboarding, and machine-learning use-cases. Develop and maintain batch and near-real-time workflows in Airflow or similar orchestration tools, ensuring fault tolerance and SLA compliance. Collaborate with analysts, data scientists, and product owners to translate business requirements into performant data solutions. Implement rigorous data quality checks, lineage tracking, and metadata management to guarantee trust and auditability. Tune queries, indexes, and storage partitions for cost-efficient execution across on-premise and cloud data warehouses. Skills & Qualifications Must-Have 5+ years hands-on experience as a Data Engineer or similar. Advanced SQL proficiency for complex joins, window functions, and performance tuning. Proven expertise in building ETL/ELT pipelines with tools such as Informatica, Talend, or custom Python. Solid understanding of dimensional modeling, star/snowflake schemas, and data-vault concepts. Experience with workflow orchestration (Airflow, Luigi, or equivalent) and version control (Git). Strong grasp of data quality frameworks and error-handling strategies. Preferred Exposure to cloud platforms (AWS Redshift, Azure Synapse, or Google BigQuery). Knowledge of containerization and CI/CD pipelines for data workloads. Familiarity with streaming technologies (Kafka, Kinesis) and real-time ETL patterns. Working knowledge of BI tools (Tableau, Power BI) and their data connectivity. Benefits & Culture Highlights Work with high-calibre data practitioners and cutting-edge cloud tech. Merit-driven growth path, certification sponsorships, and continuous learning stipends. Inclusive, innovation-first culture that rewards problem-solving and ownership. Skills: kafka,data warehouse,containerization,airflow,elt,luigi,error-handling strategies,git,aws redshift,talend,star schema,power bi,informatica,data vault,ci/cd,azure synapse,etl,sql,kinesis,performance tuning,data modeling,data quality frameworks,python,dimensional modeling,snowflake schema,tableau,google bigquery
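The responsibilities above call for batch and near-real-time workflows orchestrated in Airflow with fault tolerance and SLA compliance. A minimal Airflow 2.x DAG sketch illustrating that shape is shown below; the task bodies, schedule, and SLA values are illustrative placeholders, not the team's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull the day's batch from the source system.
    print("extracting", context["ds"])

def transform(**context):
    # Placeholder: apply SQL/ELT transformations to staged data.
    print("transforming", context["ds"])

def load(**context):
    # Placeholder: publish curated tables to the warehouse.
    print("loading", context["ds"])

default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # basic fault tolerance
    "retry_delay": timedelta(minutes=10),
    "sla": timedelta(hours=2),             # surface SLA misses
}

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                  # nightly batch (Airflow 2.4+ syntax)
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```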
Posted 1 month ago
18.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Role This is a senior leadership position within Business Information Management Practice. The individual is responsible for the overall vision, strategy, delivery and operations of key accounts in BIM. This requires working closely with global executive team, subject matter experts, solution architects, project managers and client teams to conceptualize, build and operate Big Data Solutions. Communicate with internal management, client sponsors and senior leaders on the project status, risks, solution, etc. Responsibilities Client Delivery Leadership Role Candidate to be responsible for delivering at least $10 M + revenue using information management solution(s): Big Data, Data Warehouse, Data Lake, GEN AI, Master Data Management System, Business Intelligence & Reporting solutions, IT Architecture Consulting, Cloud Platforms (AWS/AZURE), SaaS/PaaS based solutions Practice and Team Leadership Role: Self-Driven for results - Able to take initiative and set priorities; pursue tasks tenaciously & with a need to finish. Able to overcome setbacks which may occur along the way. Customer Focus - Dedicated to meeting the expectations of internal and external clients. Problem Solving - Uses rigorous logic and methods to solve difficult problems with effective solutions. Probes all fruitful sources for answers. Is excellent at honest analysis. Looks beyond the obvious and doesn’t stop at the first answers. Learning on the Fly - Learns quickly when facing new problems. A relentless and versatile learner. Proven ability to handle multiple projects/programs while meeting deadlines and documenting progress towards those deadlines. Excellent communication skills (must be able to interface with both technical and business leaders in the organization). Leadership Skills to Coach, mentor and develop senior and middle level staff. Develop the manager layers to be leaders of the future. Be known as a Thought Leader in a specific aspect of Information Management technology spectrum or Pharma domain. Direct the training & skill enhancement of the team, in line with pipeline opportunities. Ability to lead large RFP responses, design and implement the solution for proposals and customer decks. Assist in generating order pipeline, road shows, develop Go-to-market strategy for regions & verticals. Create market facing collaterals as per requirements. Able to write white paper, blogs, technical/functional point of view. Qualifications MBA in Business Management Bachelor of Computer Science Required Skills Candidate should have 18+ years of prior experience (preferably including at least 5 yrs in Pharma Commercial domain) in delivering customer focused information management solution(s): Big Data, Data Warehouse, Data Lake, Master Data Management System, Business Intelligence & Reporting solutions, IT Architecture Consulting, Cloud Platforms (AWS/AZURE), SaaS/PaaS based solutions. Should have successfully done 4-5 end to end DW implementations using technologies: Big Data, Data Management and BI technologies such as Redshift, Hadoop, ETL tools like Informatica/Matillion/Talend, BI tools like Qlik/MSTR/Tableau, Dataiku/Knime and Cloud Offerings from AWS/Azure. Ability to lead large RFP responses, design and implement the solution for proposals and customer decks. Should have led large teams of at least 100+ resources. Good communication, client facing and leadership skills. Hands on knowledge of databases, SQL, reporting solutions like BI tools or Excel/VBA. 
Preferred Skills
- Teamwork & Leadership
- Motivation to Learn and Grow
- Ownership
- Cultural Fit
- Talent Management
- Capability Building / Thought Leadership
About the Company
Axtria is a global provider of cloud software and data analytics to the Life Sciences industry. We help Life Sciences companies transform the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. We are acutely aware that our work impacts millions of patients and lead passionately to improve their lives. Since our founding in 2010, technology innovation has been our winning differentiation, and we continue to leapfrog competition with platforms that deploy Artificial Intelligence and Machine Learning. Our cloud-based platforms - Axtria DataMax™, Axtria InsightsMax™, Axtria SalesIQ™, and Axtria MarketingIQ™ - enable customers to efficiently manage data, leverage data science to deliver insights for sales and marketing planning and manage end-to-end commercial operations. With customers in over 75 countries, Axtria is one of the largest global commercial solutions providers in the Life Sciences industry. We continue to win industry recognition for growth and are featured in some of the most aspirational lists - INC 5000, Deloitte FAST 500, NJBiz FAST 50, SmartCEO Future 50, Red Herring 100, and several other growth and technology awards. Axtria is looking for exceptional talent to join our rapidly growing global team. People are our biggest perk! Our transparent and collaborative culture offers a chance to work with some of the brightest minds in the industry. Axtria Institute, our in-house university, offers the best training in the industry and an opportunity to learn in a structured environment. A customized career progression plan ensures every associate is setup for success and able to do meaningful work in a fun environment. We want our legacy to be the leaders we produce for the industry. Will you be next?
Posted 1 month ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Job Description Key Responsibilities Platform Configuration (IBP , S&OE , STD and TLB Modules) Configure o9’s Demand Planning, Supply Planning, Inventory Optimization, SOE (Sales Operation Execution), STD (Short Term Deployment), TLB (Truck Load Building) workflows to align with Mondelez’s global processes. Design Attribute-Based Hierarchies, Time Bucket Profiles, and Aggregation/Disaggregation Rules for multi-level planning (SKU, plant, region). Develop custom planning logic using o9’s planning engines (heuristics, time-series forecasting) and scripting tools (Python, R). o9 Data Model & Planning Engine Design Build master data structures (products, locations, customers) and key figures (demand signals, capacity constraints). Optimize pegging rules, supply allocation strategies, and exception management workflows for real-time S&OE adjustments. Configure Scenario Manager for what-if analysis (e.g., demand spikes, supply disruptions). Integration And Data Requirements Design data pipelines using Talend, Kafka, or REST APIs to synchronize o9 with SAP ECC, APO, BPC, and external market data sources. Validate data quality through o9 Loader and troubleshoot integration errors in collaboration with IT teams. Scenario Management & Analytics Develop interactive dashboards using o9’s KPI Engine to track forecast accuracy, service levels, and inventory turnover. Configure exception alerts for stockouts, excess inventory, and production bottlenecks. Testing, Validation, and Support Execute unit testing, integration testing, and UAT for o9 configurations, ensuring alignment with business requirements. Resolve post-go-live issues via ServiceNow tickets and provide Level 2/3 support for planners. Documentation & Training Create technical playbooks for configurations, including data mappings, calculation logic, and user permissions. Train regional planners on o9’s Tag UI, Workflow Designer, and Scenario Manager through hands-on workshops. Required Qualifications Education: Bachelor’s/Master’s in Computer Science, Supply Chain Management, Industrial Engineering, or related field. Experience: 5+ years configuring o9 IBP/S&OE modules (Demand 360, Supply 360) in CPG/FMCG industries. IBP, SOE (Sales Operation Execution), STD (Short Term Deployment), TLB (Truck Load Building) and Segmentation Experience required. 2+ full lifecycle implementations of o9 or SAP IBP, including data migration and cutover planning. Technical Skills: o9 Platform: Modeler, Workflow Designer, Data Ingestion, KPI Engine, Scenario Manager. Integration Tools: Snowflake, Talend, Kafka, REST APIs, SAP CPI, Oracle OIC. Data Modeling: Time-series forecasting, constraint-based supply models. Analytics: Power BI, Tableau, or o9 native dashboards. Tools: SQL, Jira, Confluence, ServiceNow. Preferred Experience Certifications: o9 Solutions, SAP IBP, or Kinaxis Rapid Response. Exposure to AI/ML-driven demand sensing or control tower frameworks. Agile/Scrum methodology experience with Jira board management. Work schedule: 3 days work from office/ 2 days work from home No Relocation support available Business Unit Summary Headquartered in Singapore, Mondelēz International’s Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees and operates in more than 27 countries including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. 
Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers and in offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Software & Applications Technology & Digital
Posted 1 month ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About iSteer Technologies: iSteer’s mantra of “taking the quantum leap ” with “smarter technology” solutions, is producing what customers wanted in a fraction of time, with fewer people, less overall cost and getting the product to market when it is still relevant to target customers. We truly believe “technology must not only support our customer’s business, but it must advance it at a faster pace”. At iSteer (ISO 27001:2013 certified), we enable our customers, achieve a competitive advantage with technology solutions and products that are derived from decades of cross-industry experience and technology expertise. iSteer helps transform the business of several global enterprises with their technology needs across industries. We provide Digital Integration, Data Engineering and connect enterprises by providing Robotic Process Automation, IOT, Cloud, and AI solutions , through our world-class product engineering expertise, our products like AppSteer make it easier to transform businesses digitally. We have exponentially grown our operations across the globe with 250+ employees at our offices in India, Singapore, United States, Canada and Dubai . Our expansion globally has always been a remarkable difference and delivers key results to our customers. Our renowned partners are Workato Platinum Partner, TIBCO Gold Partner and Dell Boomi. Life at iSteer, where a fine line of young and experienced minds leads into the infinite opportunities in the digital era. At iSteer, we make sure that talent meets technology in a culture which is driven by knowledge and growth. Being a part of iSteer makes you a stakeholder of achievements which will turn your latent potential into a success story. It is also enabled by excellence into our culture which encourages individual development, embraces an inclusive environment, rewards innovative excellence and supports our communities. Why join us ● Competitive salary + benefits ● We value work-life balance and encourage taking vacations ● Excellent leadership and an outstanding network of individuals ● On-going regular training certifications sponsored by iSteer. ● Know more about us and follow us @ https://isteer.com/ . Job Summary We are looking for a highly experienced Project Manager / Delivery Manager to lead the successful delivery of Data & Analytics (DnA) initiatives across complex enterprise environments. The ideal candidate will have over 15 years of hands-on experience managing large-scale data projects, including fixed-price engagements , with the ability to lead large delivery teams (30+ members) , ensure quality execution, and drive client satisfaction. Experience in pre-sales , stakeholder management, and a strong understanding of data technologies are essential. Preference will be given to candidates with domain experience in Financial Services and Manufacturing . Key Responsibilities: ● Own end-to-end project and program delivery for data and analytics initiatives. ● Successfully manage fixed-price and time-bound delivery models with strong commercial awareness. ● Lead cross-functional teams of 30+ professionals including data engineers, analysts, and architects. ● Collaborate with client stakeholders to define project scope, goals, timelines, and deliverables. ● Monitor and manage project budgets, schedules, risks, and quality standards. ● Participate in pre-sales efforts, including solutioning, estimation, proposals, and presentations. ● Drive continuous improvement in delivery practices, tools, and methodologies. 
● Ensure adherence to governance frameworks and compliance with data security standards.
● Coordinate with technical leads to ensure alignment between project objectives and architectural solutions.
● Provide strategic input into account growth and customer engagement.
Required Skills & Qualifications:
● 15+ years of experience in project/delivery management within data & analytics programs.
● Proven track record in managing fixed-price projects from inception to go-live.
● Hands-on experience leading large delivery teams (30+ members) across geographies.
● Exposure to pre-sales functions including RFP/RFI response, client discussions, and solutioning.
● Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and BI tools (e.g., Power BI, Tableau, Qlik).
● Strong understanding of data management, data governance, and data warehousing concepts.
● Certified PMP and/or Scrum Master.
● MDM knowledge will be an add-on.
● TOGAF certification is an added advantage.
● Domain experience in Financial Services and/or Manufacturing is highly preferred.
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
As the Head of Integration Platforms (f/m/d) , your main responsibility includes leading and overseeing the management and operations of integration platforms and seamless integration solutions, developing, managing, and monitoring platforms for real-time data streaming, batch processes, and API integrations to ensure a trusted and scalable data integration service within Siemens. In this role, you will actively contribute to strategic programs and projects aimed at driving Siemens towards becoming a seamlessly interconnected tech company. You will be responsible for typical project management tasks such as budgeting, planning, and steering activities. Additionally, you will focus on implementing strategic initiatives that align with our long-term mission and strategy. Collaboration and engagement with key stakeholders will be vital to your success. You will work closely with them, fostering constant exchange and collaboration to create a collaborative and productive working environment across virtual organizations. Supporting and empowering your colleagues and teams will be a priority. You will also play a crucial role in driving change management processes, working alongside core activities within the organization. Your expertise and leadership will enable Siemens to effectively leverage cutting-edge integration technologies and advance our data integration capabilities. Your problem-solving mindset creates real business value. This is your role. Join us in this exciting role where your problem-solving mindset will create tangible business value. As the Head of Integration Platforms, you will play a pivotal role in enabling our vision of creating a seamlessly connected IT ecosystem through the development of scalable integration platforms, the management of real-time data streaming, batch processes, and API integrations across Siemens. Your primary responsibility will be to build and lead a high-performing team of integration architects and engineers who will take end-to-end ownership of their platforms and services. You will support the team in identifying priorities, addressing project issues, and providing individual mentorship and personal development opportunities. In addition, you will foster an agile mindset and way of working, encouraging the adoption of agile methods to drive efficiency and effectiveness. You will actively drive critical project-related activities, such as budgeting, planning, risk mitigation, and reporting, ensuring successful management of time, effort, and quality. Building strong relationships with our internal clients and stakeholders will be key, as you become a trusted advisor for their most critical integration workloads, supporting Siemens' transformation into a unified ONE tech company. You will take ownership of key projects and collaborate closely with stakeholders from the business to lead these initiatives to success. Staying up to date with emerging technologies in data integration and recommend suitable tools and platforms for adoption. Lead the implementation of such cutting-edge technologies enhancing data integration and accessibility. Use Your Skills To Move The World Forward. 
Education: University degree in Mathematics, Computer Science, Business Administration, or a related quantitative field, or equivalent practical experience.
Experience & Skills
- Over 3 years of experience in leading international teams
- More than 2 years of experience in people management, either as a project lead or team lead
- Excellent communication and mediation skills
- Over 5 years of experience in program management within the field of data integration, analytics, and/or AI
- Proficiency in analyzing and implementing integration strategies and architectures
- Experience working in cloud environments, such as AWS, Azure, or GCP
- Knowledge and experience with Confluent Kafka, SnapLogic, Talend, and serverless integration capabilities
- Ability to comprehend technical concepts and translate them into business terms
- Customer-centric mindset with strong communication abilities
- Familiarity with agile methodologies, such as SCRUM
- Experience gathering business/functional requirements and translating them into integration solutions
- Experience in change management
- In-depth understanding of integration technologies such as data streaming, database replication, and API management
Languages: Business fluency in written and spoken English
You are much more than your qualifications, and we believe in the potential of every single candidate. We look forward to getting to know you! Your individual personality and perspective are important to us. We create a working environment that reflects the diversity of the society and support you in your personal and professional development. Let’s get to know your authentic personality and create a better future together with us. As an equal-opportunity employer we are happy to consider applications from individuals with disabilities.
What We Offer You
- An attractive remuneration package
- Access to Siemens share plans
- 30 days of paid vacation and a variety of flexible work schedules that allow time off for you and your family
- 2 to 3 days of mobile working per week as a global standard
- Up to 30 days of workation per year in certain countries
- (Global) development programs that can be customized according to your wishes and ambitions
Since each of over 300,000 team members feels that other benefits are particularly important, and we cannot list our entire benefit portfolio here, you can find more information here. The individual benefits are subject to regulatory, contractual, or corporate conditions.
About Us
We have lots of ideas about how to successfully drive digitalization in companies. For example, with open cloud platforms, highly developed security systems, and clever tools for developers. What is your role in this? Move the world from behind the scenes with your IT expertise and passion for game-changing information technology. Rethink IT and steer projects in completely new directions. Be bold when others would have given up. In short, play a key role in driving digitalization forward! We’ll provide the resources you need to do this. We also offer a variety of opportunities to get involved and be part of a global network of IT experts and professionals. Welcome to our world! www.siemens.de/careers – if you would like to find out more about jobs & careers at Siemens. FAQ – if you need further information on the application process.
Posted 1 month ago
0 years
0 Lacs
Chennai
On-site
DBT responsibilities include designing, developing, and handling technical architecture, data pipelines, and performance scaling using tools to integrate Talend data and ensure data quality in a big data environment. The candidate must be very strong in PL/SQL, including queries, procedures, and JOINs. Experience in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts using Unix, Python, etc., to perform Extract, Load, and Transform operations is essential. It is good to have knowledge and hands-on experience with Fivetran. Candidates who have worked in production support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures is required. Responsibilities also include performing data analysis, troubleshooting data issues, and providing technical support to end-users. The role involves developing and maintaining data warehouse and ETL processes, ensuring data quality and integrity. The ideal candidate should have a strong capability for complex problem-solving and a continuous improvement mindset. A DBT or Snowflake certification is desirable. Strong SQL coding, communication, and documentation skills are essential. Familiarity with Agile delivery processes is required. Candidates must be analytical, creative, and self-motivated, and should be able to work effectively within a global team environment. Excellent communication skills are a must. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
8.0 years
0 Lacs
Greater Chennai Area
On-site
We are looking for immediate joiners or candidates with short notice for the below positions. Kindly send your profile to SaiGeetha.Venkatesan@Dexian.com
Sr. Full Stack Developer (React/Python)
Experience - 8 to 15 Years.
Location - Chennai & Bangalore
Shift Time - 2:00 PM to 11:00 PM IST.
Technical Proficiency:
• Expert front-end React framework and backend Python experience
• Proficient in front-end technologies such as HTML and CSS; strong back-end development skills in Python or similar languages
• Proficient with Git and CI/CD
• Develop and maintain web applications using modern frameworks and technologies
• Help maintain code quality, organization, and automation
• Experience with relational database management systems
• Familiarity with cloud services (AWS, Azure, or Google Cloud – primarily Azure)
Big Data Engineer
Chennai, Pune, Hyderabad (Hybrid)
Shift: 2 PM to 11 PM
· Overall 5+ years and a minimum of 3 years of relevant experience in handling Big Data using a Spark environment
· Terabyte-scale data experience
· Python – data processing (must) and document extraction from text or PDF (good to have)
· JSON API handling (nested JSON)
· NiFi (or an equivalent ETL tool – Airflow, Talend, or Glue)
· Advanced SQL querying
· Data pipeline monitoring and issue resolution
· SOLR or a similar search engine – Apache SOLR, Elasticsearch, or similar (good to have)
· Azure scripting and cloud familiarity (good to have)
Java with Kotlin (2 positions) and Java with Angular (2 positions)
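For the Big Data Engineer opening above, which calls for Spark-based processing of nested JSON returned by APIs, a hedged PySpark sketch of flattening such a payload follows; the record structure and field names are invented for illustration and are not part of the posting.

```python
import json

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("nested-json-flatten").getOrCreate()

# Illustrative nested record of the shape an upstream API might return.
record = {
    "doc_id": "A-1",
    "meta": {"source": "crawler", "lang": "en"},
    "pages": [
        {"page_no": 1, "text": "first page"},
        {"page_no": 2, "text": "second page"},
    ],
}

# spark.read.json infers nested structs and arrays from the JSON itself.
df = spark.read.json(spark.sparkContext.parallelize([json.dumps(record)]))

# Flatten: promote struct fields, then explode the array so each page
# becomes its own row (the shape a search index such as SOLR usually wants).
flat = df.select(
    "doc_id",
    F.col("meta.source").alias("source"),
    F.col("meta.lang").alias("lang"),
    F.explode("pages").alias("page"),
).select(
    "doc_id", "source", "lang",
    F.col("page.page_no").alias("page_no"),
    F.col("page.text").alias("text"),
)
flat.show(truncate=False)
```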
Posted 1 month ago
5.0 - 10.0 years
4 - 6 Lacs
Noida
On-site
Key Responsibilities:
1. Develop and maintain ETL jobs using the Talend Data Integration suite for batch and real-time data processing.
2. Write and optimize complex SQL scripts, queries, and analytical functions for data transformation and validation.
3. Recreate, enhance, and troubleshoot stored procedures, functions, and packages in Oracle and Vertica environments.
4. Perform impact analysis and data lineage tracking for changes in source systems and downstream dependencies.
5. Migrate legacy ETL workflows and SQL logic to Talend-based frameworks.
6. Implement data quality and data profiling checks to ensure reliability and accuracy of ingested data.
7. Support data reconciliation and validation efforts between source and target systems.
8. Collaborate with DBAs and infrastructure teams to optimize data storage, indexing, and partitioning strategies.
9. Participate in code reviews, version control, and deployment automation for ETL scripts and database code.
10. Troubleshoot and resolve production data issues and support root cause analysis.
Education and Certifications: Mandatory – Bachelor's degree (B.Tech/B.E.)
Technical Skills, Knowledge & Abilities
- Proficient in using the Talend ETL suite for data integration and transformation.
- Deep working knowledge of Oracle, Microsoft SQL Server, and PostgreSQL databases.
- Exposure to Vertica as a data warehouse solution.
- Expertise in SQL query development for data extraction, manipulation, and analysis.
- Solid understanding of designing and maintaining stored procedures across multiple RDBMS.
- Knowledge of data security, integration, and interoperability best practices in data engineering.
- Familiarity with data warehousing concepts, including OLAP and data cubes.
- Experience with metadata management and implementing data quality frameworks.
- Programming skills in Python, SQL, or Java for building data workflows and automation scripts.
Work Experience: 5-10 years of relevant experience
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
🚨 We’re Hiring – Talend Support Engineer! 🚨 Cornerstone IT Solutions is expanding our team and we are looking for a Talend Support Engineer to join us on a critical client engagement involving Talend Open Studio & Admin Center (v7.3 & v8). 💡 What You'll Do: Troubleshoot ETL job failures and studio issues Handle license, patching, and connectivity support Assist with Git, encryption, and SSL configurations Work directly with client teams to meet SLAs 🎯 Who We're Looking For: 4+ years of experience with Talend Open Studio Solid understanding of Java, SSL, and Talend components Experience in support/ticketing environments Strong communication and analytical skills 🌍 Location : Hyderabad (India) ⏱ Working Hours : London Business Hours (9 AM – 6 PM GMT) 💼 Client-Facing Role | 🔐 Immediate Joiners Preferred 📩 Interested or know someone who’s a fit? Drop your resume at hello@cornerstoneitsols.com or DM us directly! #TalendJobs #ETLSupport #TalendSupportEngineer #Hiring #CornerstoneITSolutions
Posted 1 month ago
8.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Job Summary: We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.
Key Responsibilities:
- Design, develop, and execute comprehensive test plans for ETL and database validation processes
- Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI)
- Perform manual testing and defect tracking using Zephyr or Tosca
- Analyze business and data requirements to ensure full test coverage
- Write and execute complex SQL queries for data reconciliation
- Identify data-related issues and conduct root cause analysis in collaboration with developers
- Track and manage bugs and enhancements through appropriate tools
- Optimize testing strategies for performance, scalability, and accuracy in ETL workflows
Mandatory Skills:
- ETL Tools: Talend, ADF
- Data Platforms: Snowflake
- Reporting/Analytics: Power BI, VPI
- Testing Tools: Zephyr or Tosca; manual testing
- Strong SQL expertise for validating complex data workflows
Good-to-Have Skills:
- API testing exposure
- Power BI advanced features (dashboards, DAX, data modelling)
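A large part of the tester role above is writing reconciliation SQL between source and target layers. The sketch below shows one hedged example of that pattern, run through the Snowflake Python connector; the schemas, tables, and tolerance are illustrative assumptions, not project specifics.

```python
import snowflake.connector

# Placeholders; a tester would point these at the project's staging
# schema and the curated target produced by the Talend/ADF pipelines.
conn = snowflake.connector.connect(
    account="ab12345", user="QA_SVC", password="***",
    warehouse="QA_WH", database="ANALYTICS",
)

# Reconcile row counts and amount totals per load date between the
# staged source data and the transformed target table.
RECON_SQL = """
SELECT COALESCE(s.load_date, t.load_date)      AS load_date,
       s.row_cnt                               AS src_rows,
       t.row_cnt                               AS tgt_rows,
       s.total_amount - t.total_amount         AS amount_diff
FROM   (SELECT load_date, COUNT(*) AS row_cnt, SUM(amount) AS total_amount
        FROM   STAGING.ORDERS GROUP BY load_date) s
FULL OUTER JOIN
       (SELECT load_date, COUNT(*) AS row_cnt, SUM(amount) AS total_amount
        FROM   CURATED.FCT_ORDERS GROUP BY load_date) t
       ON s.load_date = t.load_date
WHERE  s.row_cnt IS DISTINCT FROM t.row_cnt
   OR  ABS(COALESCE(s.total_amount - t.total_amount, 0)) > 0.01
"""

try:
    with conn.cursor() as cur:
        mismatches = cur.execute(RECON_SQL).fetchall()
        # Any returned row is a load date where source and target disagree.
        assert not mismatches, f"Reconciliation failed for: {mismatches}"
finally:
    conn.close()
```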
Posted 1 month ago
0.0 - 6.0 years
18 - 25 Lacs
Bengaluru, Karnataka
On-site
Job Title: ETL Development Lead (7+ years) Location : Bangalore, Hyderabad, Chennai, Pune, Vadodara Work Mode: Hybrid Job Type: Full-Time Shift Timings: 2:00 - 11:00 PM Budget: 18 to 25 LPA Job Description : Experience with Leading and mentoring a team of Talend ETL developers. Providing technical direction and guidance on ETL/Data Integration development to the team. Designing complex data integration solutions using Talend & AWS. Collaborating with stakeholders to define project scope, timelines, and deliverables. Contributing to project planning, risk assessment, and mitigation strategies. Ensuring adherence to project timelines and quality standards. Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies. Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components. Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files). Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality. Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes. Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions. Perform unit testing and participate in system integration testing of ETL processes. Monitor and maintain Talend environments, including job scheduling and performance tuning. Document technical specifications, data flow diagrams, and ETL processes. Stay up to date with the latest Talend features, best practices, and industry trends. Participate in code reviews and contribute to the establishment of development standards. Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components. Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT). Strong SQL skills for data querying and manipulation. Experience with data profiling, data quality checks, and error handling within ETL processes. Familiarity with job scheduling tools and monitoring frameworks. Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively within a team environment. Basic Understanding of AWS Services i.e., EC2 , S3 , EFS, EBS, IAM , AWS Roles , CloudWatch Logs, VPC, Security Group , Route 53, Network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB. Understanding of AWS Data integration Services i.e., Glue, Data Pipeline, Amazon Athena , AWS Lake Formation, App Flow, Step Functions Preferred Qualifications: Experience with Leading and mentoring a team of 8+ Talend ETL developers. Experience working with US Healthcare customer.. Bachelor’s degree in computer science, Information Technology, or a related field. Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate. Experience with AWS Data & Infrastructure Services.. Basic understanding and functionality for Terraform and Gitlab is required. Experience with scripting languages such as Python or Shell scripting. Experience with agile development methodologies. 
Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.

Job Types: Full-time, Permanent
Pay: ₹1,800,000.00 - ₹2,500,000.00 per year
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Bangalore, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: ETL: 7 years (Required); Talend: 7 years (Required); Python: 6 years (Required)
Work Location: In person
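To make the data-quality and error-handling expectations above concrete, here is a minimal sketch, assuming a PostgreSQL/Redshift-compatible target, of a post-load validation step that an ETL job might invoke after loading a staging table. The connection string, schema, and table names are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch of a post-load data-quality check (hypothetical tables/credentials).
import psycopg2

CHECKS = {
    # check name -> SQL returning a violation count (0 means the check passes)
    "orders_not_empty": "SELECT CASE WHEN COUNT(*) = 0 THEN 1 ELSE 0 END FROM staging.orders",
    "orders_null_keys": "SELECT COUNT(*) FROM staging.orders WHERE order_id IS NULL",
    "orders_dup_keys": (
        "SELECT COUNT(*) FROM (SELECT order_id FROM staging.orders "
        "GROUP BY order_id HAVING COUNT(*) > 1) d"
    ),
}

def run_checks(dsn: str) -> bool:
    """Run each check; return True only if every check reports zero violations."""
    ok = True
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            violations = cur.fetchone()[0]
            if violations:
                print(f"FAIL {name}: {violations} violation(s)")
                ok = False
            else:
                print(f"PASS {name}")
    return ok

if __name__ == "__main__":
    # DSN is a placeholder; in practice it would come from a secrets manager.
    passed = run_checks("dbname=dw user=etl password=*** host=localhost")
    raise SystemExit(0 if passed else 1)
```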
Posted 1 month ago
5.0 - 10.0 years
22 - 27 Lacs
Pune, Bengaluru
Work from Office
Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS. Build out data lineage artifacts to ensure all current and future systems are properly documented.

Required candidate profile: Strong proficiency in SQL query development. Experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII data.
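As an illustration of the Fivetran-plus-dbt workflow described above, here is a minimal sketch that runs dbt transformations and tests over data a Fivetran sync has already landed. The model selector and project layout are assumptions for illustration; this is not a prescribed implementation.

```python
# Minimal sketch: run dbt transformations and tests after a Fivetran sync lands raw data.
# The --select argument and project layout are placeholders for illustration.
import subprocess
import sys

def run(cmd: list[str]) -> None:
    """Run a dbt CLI command and stop the pipeline if it fails."""
    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd, check=False)
    if result.returncode != 0:
        sys.exit(f"Step failed with exit code {result.returncode}: {' '.join(cmd)}")

if __name__ == "__main__":
    # Transform the raw (Fivetran-loaded) schema into staging/mart models...
    run(["dbt", "run", "--select", "staging+"])
    # ...then enforce quality checks (not_null, unique, accepted_values, etc.)
    run(["dbt", "test", "--select", "staging+"])
```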
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Responsibilities
The Sr. Integration Developer (Senior Software Engineer) will work in the Professional Services Team and play a significant role in designing and implementing complex integration solutions using the Adeptia platform. This role requires hands-on expertise in developing scalable and efficient solutions to meet customer requirements. The engineer will act as a key contributor to the team's deliverables while mentoring junior engineers. They will ensure high-quality deliverables by collaborating with cross-functional teams and adhering to industry standards and best practices.

Responsibilities include, but are not limited to:
Collaborate with customers' business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI integration space and implement solutions using the Adeptia platform. Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability. Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment. Troubleshoot issues during implementation and deployment, ensuring smooth system performance. Guide team members in addressing complex integration challenges and promote development and performance best practices. Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables. Write efficient, well-documented, and maintainable code, adhering to established coding standards. Review code and designs of team members, providing constructive feedback to improve quality. Participate in Agile processes, including sprint planning, daily standups, and retrospectives, ensuring effective task management and delivery. Stay updated with emerging technologies to continuously enhance technical expertise and team skills.

Essential Skills: Technical
5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API, and Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools. Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms. Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations. Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity. Proficiency in working with SOA, RESTful APIs, and SOAP web services, including the applicable security policies. Good understanding of, and implementation experience with, security concepts, best practices, standards, and protocols such as OAuth, SSL/TLS, SSO, SAML, and IdP (Identity Provider). Strong understanding of XML, XSD, XSLT, and JSON. Good understanding of RDBMS/NoSQL technologies (MS SQL Server, Oracle, MySQL). Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ. Hands-on experience in Core Java and exposure to commonly used Java frameworks.

Desired Skills: Technical
Familiarity with JavaScript frameworks like ReactJS, AngularJS, or NodeJS. Exposure to integration standards (EDI, EDIFACT, IDOC). Experience with modern web UI tools and frameworks. Exposure to DevOps tools such as Git, Jenkins, and CI/CD pipelines.
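To illustrate the REST/OAuth integration skills listed above, here is a minimal sketch, assuming an OAuth 2.0 client-credentials flow and a JSON orders endpoint, of fetching data from a source API and mapping it to a target schema. All URLs, credentials, and field names are hypothetical and are not Adeptia APIs.

```python
# Minimal sketch: call a REST API with an OAuth 2.0 client-credentials token and
# map the JSON response into a target schema. All URLs and field names are
# hypothetical placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"   # placeholder
ORDERS_URL = "https://api.example.com/v1/orders"      # placeholder

def get_token(client_id: str, client_secret: str) -> str:
    """Obtain a bearer token using the client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_and_map(token: str) -> list[dict]:
    """Fetch source records and map their fields to the downstream schema."""
    resp = requests.get(ORDERS_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return [
        {
            "order_id": o["id"],
            "customer": o.get("customerName"),
            "amount": float(o.get("totalAmount", 0)),
        }
        for o in resp.json().get("items", [])
    ]

if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret")  # placeholders
    print(fetch_and_map(token)[:5])
```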
About Adeptia
Adeptia believes business users should be able to access information anywhere, anytime by creating data connections themselves, and its mission is to enable that self-service capability. Adeptia is a unique social network for digital business connectivity for “citizen integrators” to respond quickly to business opportunities and get to revenue faster. Adeptia helps Information Technology (IT) staff to manage this capability while retaining control and security. Adeptia's unified hybrid offering — with simple data connectivity in the cloud, and optional on-premises enterprise process-based integration — provides a competitive advantage to 450+ customers, ranging from Fortune 500 companies to small businesses. Headquartered in Chicago, Illinois, USA and with an office in Noida, India, Adeptia provides world-class support to its customers around the clock. For more, visit www.adeptia.com

Our Locations:
India R&D Centre: Office 56, Sixth floor, Tower-B, The Corenthum, Sector-62, Noida, U.P.
US Headquarters: 332 S Michigan Ave, Unit LL-A105, Chicago, IL 60604, USA
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
BI Analyst (Senior Engineer/Lead)
We at Pine Labs are looking for those who share our core belief - Every Day is Game Day. We bring our best selves to work each day to realize our mission of enriching the world through the power of digital commerce and financial services.

Role Purpose
We are looking for a Sr. BI Analyst/Lead who will support the BI Analysts team in implementing new dashboard features and write the complex SQL queries that get raw data ready for dashboarding. The preferred candidate has an analytics mindset to convert raw data into user-friendly, dynamic dashboards, along with developing paginated reports. This is an individual-contributor position for someone who can lead the team on the technical front.

Responsibilities We Entrust You With
Participate in peer reviews of reports/dashboards created by internal team members and ensure a high standard as per the defined reporting/dashboarding standards. Design/product thinking, problem solving, and strategic orientation. Must have expertise in the Apache Superset BI tool and SSRS. Excellent skills in SSRS and SSIS, and expert in SQL scripts. Nice to have: sound knowledge of AWS QuickSight and PowerShell. Excellent SQL scripting for complex queries. Proficient in both verbal and written communication. Knowledge of ETL concepts and tools, e.g., Talend/SSIS. Knowledge of query optimization in SQL and Redshift. Nice to have: sound knowledge of data warehousing and data lake concepts. Understands the requirements of a dashboard/report from management stakeholders and takes an analytical view to design dynamic dashboards using any BI analytics tool.

Required skills: T-SQL, ANSI SQL, PSQL, SSIS, SSRS, Apache Superset, AWS Redshift, QuickSight
Good-to-have skills: Data lake concepts, analytical ability, business and merchant requirement understanding

What Matters In This Role
Apache Superset, AWS QuickSight, SSRS, and SSIS experience for developing dashboards is preferred. Excellent T-SQL, ANSI SQL, data modeling, and querying from multiple data stores is mandatory. Experience with Microsoft SSRS and SSIS is needed for developing paginated dashboards.

What We Value In Our People
You take the shot: you decide fast and you deliver right. You are the CEO of what you do: you show ownership and make things happen. You own tomorrow: by building solutions for the merchants and doing the right thing. You sign your work like an artist: you seek to learn and take pride in the work you do. (ref:hirist.tech)
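As a small example of the SQL-for-dashboarding work described above, the sketch below shapes raw transactions into a daily revenue series with a 7-day moving average, the kind of dataset a Superset chart might sit on. It uses SQLite only so the snippet runs anywhere; table and column names are illustrative, and the pattern carries over to Redshift or SQL Server with minor dialect changes.

```python
# Self-contained sketch: daily revenue rollup with a 7-day moving average
# (window function), as a dashboard-ready dataset. Sample data is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (txn_date TEXT, merchant_id INTEGER, amount REAL);
    INSERT INTO transactions VALUES
        ('2024-01-01', 1, 120.0), ('2024-01-01', 2, 80.0),
        ('2024-01-02', 1, 200.0), ('2024-01-03', 2, 50.0);
""")

DASHBOARD_QUERY = """
WITH daily AS (
    SELECT txn_date, SUM(amount) AS daily_revenue
    FROM transactions
    GROUP BY txn_date
)
SELECT
    txn_date,
    daily_revenue,
    AVG(daily_revenue) OVER (
        ORDER BY txn_date
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS revenue_7d_moving_avg
FROM daily
ORDER BY txn_date;
"""

for row in conn.execute(DASHBOARD_QUERY):
    print(row)
```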
Posted 1 month ago
10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
Digital Finance Manager – India
The ideal candidate will act as the bridge between Finance and IT, bringing hands-on expertise in tools like SAP, Power BI, Alteryx, and RPA platforms, and will play a pivotal role in identifying and delivering finance automation projects aligned with business needs.

Purpose of the Role
To drive the Digital Finance India agenda, aligned with the Mondelez India SP, by: bringing in best-in-class business practices, evaluating digital technologies, engaging finance and business stakeholders, driving automation and simplification of financial processes, and enabling future-ready finance operations with minimum manual intervention.

Role Overview
Acts as a bridge between Finance sub-functions and IT Services. The role is also responsible for identifying opportunities, and finding and implementing solutions, for processes that are intertwined between Finance and other functions. You will be responsible for ensuring that Finance IBS projects are successfully delivered on time and on budget. This includes project governance, budget and timeline development, build quality, testing and operational readiness, and the completed project's readiness to go live; working with project resources to provide design collateral and to configure software components so they are aligned with security policy and governance; and ensuring adherence to development and configuration standards and processes. Focuses on identifying automation opportunities across finance processes—especially those that are currently manual (e.g., cash flow statements, reconciliation, reporting). Leads and governs end-to-end project delivery within time and budget (including testing, design, and rollout readiness). Drives process redesign and software configuration aligned with security and compliance standards.
Important note: This is not a pure IT role. It requires strong finance acumen and the ability to understand financial reporting, controls, compliance, and analysis needs while embedding digital solutions.

Key Accountabilities
Develop and implement short-, medium-, and long-term digital strategies for Finance India. Identify, evaluate, and implement finance automation opportunities (internal and external). Deliver data transformation, automation, visualization, and dashboarding solutions. Manage digital finance projects, ensuring timelines, budgets, and stakeholder expectations are met. Evaluate current finance processes to identify areas for automation, controls improvement, and simplification. Implement new digital tools to improve efficiency and competitiveness. Train finance teams on emerging tools and technologies. Be the go-to digital expert within Finance for process innovation. Collaborate with global and regional stakeholders, including Global Finance Solution Owners and Business Tower leads. Translate business requirements into functional and technical specs.

Qualifications & Experience
CA or MBA from a reputed university. 8–10 years of progressive experience in finance transformation, with a strong focus on analysis, reporting, and forecasting. Demonstrated expertise in digital tools relevant to finance, including SAP (S/4HANA, Hyperion, SAP Analytics Cloud), Power BI, Tableau, Robotic Process Automation (RPA), and low-code/no-code platforms. Hands-on experience with data engineering and analytics tools such as Alteryx, Collibra, Talend, and the Microsoft platform. Exposure to finance transformation or consulting, ideally within the FMCG industry, is a strong plus.
Within Country Relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
Mondelez India Foods Private Limited (formerly Cadbury India Ltd.) has been in India for over 70 years, making sure our mouth-watering and well-loved local and global brands such as Cadbury chocolates, Bournvita and Tang powdered beverages, Oreo and Cadbury Bournvita biscuits, and Halls and Cadbury Choclairs Gold candies get safely into our customers' hands—and mouths. Headquartered in Mumbai, the company has more than 3,300 employees proudly working across sales offices in New Delhi, Mumbai, Kolkata and Chennai, and in manufacturing facilities in Maharashtra, Madhya Pradesh, Himachal Pradesh and Andhra Pradesh, at our global Research & Development Technical Centre and Global Business Hub in Maharashtra, and in a vast distribution network across the country. We are also proud to be recognised by Avatar as one of the Best Companies for Women in India in 2019 – the fourth time we've received this award. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Finance Planning & Performance Management
Finance
Posted 1 month ago
9.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Description
This is a remote position.

We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modeling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault 2.0.

Responsibilities
Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF). Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling. Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer). Develop and maintain bronze → silver → gold data layers using DBT or Coalesce. Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery. Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata. Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams). Work closely with QA teams to integrate test automation and ensure data quality. Collaborate with cross-functional teams, including data scientists and business stakeholders, to align solutions with AI/ML use cases. Document architectures, pipelines, and workflows for internal stakeholders.

Requirements
Essential Skills (Job): Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid). Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python. Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts. Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and dimensional modeling. Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF triggers. Expertise in monitoring and logging with CloudWatch, AWS Glue metrics, MS Teams alerts, and Azure Data Explorer (ADX). Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection. Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates. Experienced in data validation and exploratory data analysis with pandas profiling, AWS Glue Data Quality, and Great Expectations.
Essential Skills (Personal): Excellent communication and interpersonal skills, with the ability to engage with teams. Strong problem-solving, decision-making, and conflict-resolution abilities. Proven ability to work independently and lead cross-functional teams. Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism. Ability to maintain confidentiality and handle sensitive information with attention to detail. The candidate must have a strong work ethic and be trustworthy. Must be highly collaborative and team-oriented, with a commitment to excellence.

Preferred Skills (Job): Proficiency in SQL and at least one programming language (e.g., Python, Scala). Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services. Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica). Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs). Experience with data modeling, data structures, and database design.
Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka).
Preferred Skills (Personal): Demonstrates proactive thinking. Should have strong interpersonal relations, expert business acumen, and mentoring skills. Able to work under stringent deadlines and demanding client conditions. Ability to work under pressure to meet multiple daily deadlines for client deliverables with a mature approach.

Other Relevant Information
Bachelor's in Engineering with specialization in Computer Science, Artificial Intelligence, Information Technology, or a related field. 9+ years of experience in data engineering and data architecture.

LeewayHertz is an equal opportunity employer and does not discriminate based on race, color, religion, sex, age, disability, national origin, sexual orientation, gender identity, or any other protected status. We encourage a diverse range of applicants.
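As an illustration of the orchestration and medallion-layer responsibilities above, here is a minimal Apache Airflow sketch of a bronze-to-silver-to-gold pipeline. The DAG id, schedule, task commands, and dbt selectors are placeholders chosen for illustration, not a production design.

```python
# Minimal Airflow sketch of a bronze -> silver -> gold pipeline; job names,
# schedule, and the ingestion/dbt specifics are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def validate_bronze(**context):
    """Placeholder schema/row-count validation on freshly landed bronze data."""
    print("Validating bronze layer for run", context["ds"])

with DAG(
    dag_id="medallion_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    land_bronze = BashOperator(
        task_id="land_bronze",
        bash_command="echo 'trigger ingestion job (e.g., Airbyte or Glue) here'",
    )
    check_bronze = PythonOperator(task_id="check_bronze", python_callable=validate_bronze)
    build_silver_gold = BashOperator(
        task_id="build_silver_gold",
        bash_command="dbt build --select silver+ gold+",  # placeholder selector
    )

    land_bronze >> check_bronze >> build_silver_gold
```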
Posted 1 month ago