3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.

Your Key Responsibilities
- Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads.
- Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
- Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
- Perform ETL operations and ensure timely and accurate data movement between systems.
- Write and debug intermediate-level SQL queries for data validation and issue analysis (a validation sketch follows this posting).
- Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed.
- Support deployment activities using Azure DevOps pipelines.
- Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
- Participate in incident management and contribute to resolution and knowledge sharing.

Skills And Attributes For Success
- Strong understanding of cloud-based data workflows, especially in Azure environments.
- Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
- Comfort working with large datasets and navigating both structured and semi-structured data.
- Ability to follow runbooks and SOPs, and collaborate effectively with other technical teams.
- Willingness to learn new technologies and adapt in a dynamic environment.
- Good communication skills to interact with stakeholders, document findings, and share updates.
- Discipline to work independently, manage priorities, and escalate issues responsibly.

To qualify for the role, you must have
- 2–3 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves
- Working knowledge of Azure Data Factory, Data Lake, and Synapse
- Exposure to Azure Databricks, with the ability to understand and run existing notebooks
- Understanding of ETL processes and data flow concepts

Good to have
- Experience with Power BI or Tableau for basic reporting and data visualization
- Exposure to Informatica CDI or any other data integration platform
- Basic scripting knowledge in Python or PySpark for data processing or automation tasks
- Proficiency in writing SQL for querying and analyzing structured data
- Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
- Experience supporting DevOps deployments or familiarity with Azure DevOps concepts

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
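The data-validation responsibility above typically reduces to reconciling row counts, key aggregates, and keys between source and target. Below is a minimal, hypothetical PySpark sketch; the paths, table layout, and column names are illustrative assumptions, not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-validation").getOrCreate()

# Hypothetical locations; real runs would point at the lake/Synapse tables.
source = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/sales/")
target = spark.read.parquet("abfss://curated@account.dfs.core.windows.net/sales/")

# Row-count reconciliation: the cheapest first check after an ADF copy.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row-count mismatch: {src_count} vs {tgt_count}"

# Aggregate reconciliation on a business measure catches silent truncation
# or type-cast loss that a pure count would miss.
src_sum = source.agg(F.sum("amount").alias("s")).first()["s"]
tgt_sum = target.agg(F.sum("amount").alias("s")).first()["s"]
assert src_sum == tgt_sum, f"Amount mismatch: {src_sum} vs {tgt_sum}"

# Key-level diff: rows present in source but missing from target.
missing = source.select("order_id").subtract(target.select("order_id"))
print(f"Missing keys: {missing.count()}")
```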
Posted 1 week ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a hands-on and motivated Azure DataOps Engineer to support our cloud-based data operations and workflows. This role is ideal for someone with strong foundational knowledge of Azure data services and data pipelines who is looking to grow in a fast-paced environment. You will work closely with senior engineers and analysts to manage data pipelines, ensure data quality, and assist in deployment and monitoring activities.

Your Key Responsibilities
- Support the execution and monitoring of Azure Data Factory (ADF) pipelines and Azure Synapse workloads.
- Assist in maintaining data in Azure Data Lake and troubleshoot ingestion and access issues.
- Collaborate with the team to support Databricks notebooks and manage small transformation tasks.
- Perform ETL operations and ensure timely and accurate data movement between systems.
- Write and debug intermediate-level SQL queries for data validation and issue analysis.
- Monitor pipeline health using Azure Monitor and Log Analytics, and escalate issues as needed (a monitoring query sketch follows this posting).
- Support deployment activities using Azure DevOps pipelines.
- Maintain and update SOPs, and assist in documenting known issues and recurring tasks.
- Participate in incident management and contribute to resolution and knowledge sharing.

Skills And Attributes For Success
- Strong understanding of cloud-based data workflows, especially in Azure environments.
- Analytical mindset with the ability to troubleshoot data pipeline and transformation issues.
- Comfort working with large datasets and navigating both structured and semi-structured data.
- Ability to follow runbooks and SOPs, and collaborate effectively with other technical teams.
- Willingness to learn new technologies and adapt in a dynamic environment.
- Good communication skills to interact with stakeholders, document findings, and share updates.
- Discipline to work independently, manage priorities, and escalate issues responsibly.

To qualify for the role, you must have
- 2–3 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves
- Working knowledge of Azure Data Factory, Data Lake, and Synapse
- Exposure to Azure Databricks, with the ability to understand and run existing notebooks
- Understanding of ETL processes and data flow concepts

Good to have
- Experience with Power BI or Tableau for basic reporting and data visualization
- Exposure to Informatica CDI or any other data integration platform
- Basic scripting knowledge in Python or PySpark for data processing or automation tasks
- Proficiency in writing SQL for querying and analyzing structured data
- Familiarity with Azure Monitor and Log Analytics for pipeline monitoring
- Experience supporting DevOps deployments or familiarity with Azure DevOps concepts

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
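Pipeline-health monitoring of the kind listed above usually means querying the ADFPipelineRun table in Log Analytics. A minimal sketch using the azure-monitor-query SDK; the workspace ID is a placeholder, and the query assumes ADF diagnostic settings already stream pipeline runs to the workspace.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder; the real value comes from the Log Analytics resource.
WORKSPACE_ID = "<log-analytics-workspace-id>"

client = LogsQueryClient(DefaultAzureCredential())

# KQL over ADFPipelineRun: failed runs per pipeline, most failures first.
query = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(f"{row[0]}: {row[1]} failure(s) in the last 24h")
```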
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Engineer – Investment
Experience: 6–12 years
Location: Hyderabad
Primary Skills: ETL, Informatica, SQL, Python, and the Investment domain
Please share your resumes to jyothsna.g@technogenindia.com.

Job Description:
- 7–9 years of experience with data analytics, data modeling, and database design.
- 3+ years of coding, scripting (Python, Java, Scala), and design experience.
- 3+ years of experience with the Spark framework.
- 5+ years of experience with ELT methodologies and tools.
- 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL (a window-function sketch follows this posting).
- Knowledge of Informatica PowerCenter and Informatica IDMC.
- Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
- Strong data analysis skills for extracting insights from financial data.
- Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
- Domain knowledge of Investment Management operations, including security masters, securities trade and recon operations, reference data management, and pricing.
- Familiarity with regulatory requirements and compliance standards in the investment management industry.
- Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle PACE, and Eagle DataMart.
- Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Exceptional communication and interpersonal skills.
- Ability to influence and motivate teams without direct authority.
- Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
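SQL work on investment reference data often comes down to deduplicating a feed, for example reducing a security master to the latest record per identifier. A minimal illustration using Spark SQL; the view, identifiers, and rows are invented for the example.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("security-master-dedup").getOrCreate()

# Hypothetical security-master staging feed: one row per (isin, load_ts).
spark.createDataFrame(
    [("US0378331005", "2024-01-01", "Apple Inc"),
     ("US0378331005", "2024-06-01", "Apple Inc."),
     ("US5949181045", "2024-03-01", "Microsoft Corp")],
    ["isin", "load_ts", "issuer_name"],
).createOrReplaceTempView("security_master_stg")

# ROW_NUMBER() keeps only the most recent record per ISIN: the standard
# window-function pattern for reference-data deduplication.
latest = spark.sql("""
    SELECT isin, load_ts, issuer_name
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY isin ORDER BY load_ts DESC) AS rn
        FROM security_master_stg
    )
    WHERE rn = 1
""")
latest.show()
```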
Posted 1 week ago
10.0 years
0 Lacs
Delhi, India
On-site
Where Data Does More. Join the Snowflake team.

At the forefront of the data revolution, Snowflake is building the world’s greatest data and applications platform. Our ‘get it done’ culture fosters innovation, impact, and collaboration. We are rapidly expanding our partner go-to-market initiatives with System Integrators, Cloud Service Providers, and Data Cloud Partners, who are crucial in helping customers leverage the Snowflake AI Data Cloud. We seek a self-driven individual with excellent English verbal and written communication skills to grow these partnerships, engaging both local and global teams.

One of the unique benefits of Snowflake’s architecture is the ability to securely share data, applications and solutions with other Snowflake accounts without creating a copy of the data. The Snowflake Data Cloud builds on our secure data sharing functionality to be the ‘App Store’ for data, enabling providers and consumers to publish, discover and monetize data, applications and solutions. Providers to the Snowflake Marketplace use Data Sharing as the means to deliver their data or service, replacing traditional delivery methods such as files and APIs. Data Sharing and the Marketplace play a key strategic role in our Data Cloud vision and drive the network effect of the Data Cloud! (A minimal data-sharing sketch follows this posting.)

Success in this position requires the candidate to be a technical advisor by aligning with key programs and educating/upskilling partners on these key product features. The candidate will present to both technical and executive audiences, whether whiteboarding or using presentations and demos, to build mindshare among Snowflake Data Cloud and SI Partners in India. We are looking for a technical team member who understands the data and applications partner ecosystem as well as how to grow and manage content partnerships. In addition to technically onboarding and enabling partners, you will be an important guide in the creation of the go-to-market for new partners. This position will be based in Mumbai, and occasional travel to partner sites or industry events within India may be required.

As a Partner Solution Engineer, you will:
- Technically onboard and enable partners to re-platform their data and AI applications onto the Snowflake AI Data Cloud.
- Collaborate with partners to develop Snowflake solutions in customer engagements. You will work with our partners to create assets and demos, build hands-on POCs, and pitch Snowflake solutions.
- Help solution providers and practice leads with the technical strategies that enable them to sell their offerings on Snowflake.
- Keep partners up to date on key Snowflake product updates and future roadmaps so they can represent the latest technology solutions and benefits to their clients.
- Run technical enablement programs to provide best practices, and solution design workshops to help partners create effective solutions.
- Drive strategic engagements by quickly grasping new concepts and articulating their business value.
- Showcase the impact of Snowflake through compelling customer success stories and case studies.
- Maintain a strong understanding of how partners generate revenue, the industry priorities and complexities they face, and where Snowflake products can have the most impact on their service offerings.
- Hold conversations with other technologists and deliver presentations at the C-level.

Preferred skill sets and experiences:
- A total of 10+ years of relevant experience.
- Experience working with tech partners, ISVs and System Integrators (SIs) in India.
- Developed data domain thought leadership within the partner community.
- Provide technical product and deep architectural expertise, and share the latest product capabilities, with our Partner Solution Architect community based in India.
- Presales or hands-on experience with Data Warehouse, Data Lake or Lakehouse platforms.
- Presales or hands-on experience designing and building highly scalable data pipelines using Spark and Kafka to ingest data from various systems.
- Experience with our partner integration ecosystem, such as Alation, Fivetran, Informatica and dbt Cloud, is a plus.
- Hands-on experience and strong knowledge of Docker and how to containerize Python-based applications.
- Knowledge of container networking and Kubernetes.
- Working knowledge of, and integration with, APIs.
- Proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps.
- Presales or hands-on experience using big data or cloud integration technologies such as Azure Data Factory, AWS Glue, AWS Lambda, etc.
- Experience in the AI/ML domain is a plus.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
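Secure data sharing, described above, is plain SQL on the provider side. The sketch below runs the share DDL through the snowflake-connector-python package; the connection parameters, database, and object names are illustrative assumptions.

```python
import snowflake.connector

# Placeholder connection parameters; real values come from the
# provider account's configuration.
conn = snowflake.connector.connect(
    account="<provider_account>",
    user="<user>",
    password="<password>",
    role="ACCOUNTADMIN",
)
cur = conn.cursor()

# A share exposes read-only, zero-copy access to selected objects.
for stmt in [
    "CREATE SHARE IF NOT EXISTS sales_share",
    "GRANT USAGE ON DATABASE sales_db TO SHARE sales_share",
    "GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share",
    "GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share",
    # The consumer account mounts the share as its own database;
    # no data is copied or transmitted.
    "ALTER SHARE sales_share ADD ACCOUNTS = <consumer_account>",
]:
    cur.execute(stmt)

cur.close()
conn.close()
```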
Posted 1 week ago
9.0 years
15 Lacs
India
On-site
Experience: 9+ years
Location: Pune, Hyderabad (preferred)

JD:
- Experience in design, development and deployment using Azure services (Data Factory, Azure Data Lake Storage, Databricks, PySpark, SQL).
- Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity.
- Experience in creating Technical Specification Designs and Application Interface Designs.
- File processing: XML, CSV, Excel, ORC and Parquet file formats (a PySpark sketch follows this posting).
- Develop batch processing, streaming and integration solutions, and process structured and non-structured data.
- Good to have: experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies (Informatica preferred).
- Demonstrated in-depth skills with Azure Data Factory, Azure Databricks, PySpark and ADLS (must have), with the ability to configure and administer all aspects of Azure SQL DB.
- Collaborate and engage with BI & analytics and business teams.
- Deep understanding of the operational dependencies of applications, networks, systems, security and policy, both on-premises and in the cloud: VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.).

Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Fixed shift

Application Question(s):
- How many years of total experience do you currently have?
- How many years of experience do you have with Azure data services?
- How many years of experience do you have with Azure Databricks?
- How many years of experience do you have with PySpark?
- What is your current CTC?
- What is your expected CTC?
- What is your notice period/LWD?
- Are you comfortable attending an L2 interview face to face in our Hyderabad or Pune office?
- What is your current and preferred location?
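The file-format work above is largely a matter of pairing the right reader with a columnar writer. A minimal sketch, assuming hypothetical ADLS paths and a simple CSV layout:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("file-format-conversion").getOrCreate()

# Hypothetical landing path; CSV with a header row and inferable types.
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("abfss://landing@account.dfs.core.windows.net/orders/*.csv")
)

# Light standardization before persisting: typed date, trimmed keys.
clean = (raw.withColumn("order_date", F.to_date("order_date"))
            .withColumn("customer_id", F.trim("customer_id")))

# Parquet (or ORC via .orc(...)) gives columnar storage plus predicate
# pushdown for downstream Databricks and Synapse consumers.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@account.dfs.core.windows.net/orders/"))
```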
Posted 1 week ago
8.0 years
4 - 8 Lacs
Hyderābād
On-site
About the job:
We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people’s lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives. As part of the Digital M&S Foundations organization, the data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational and dimensional databases. These solutions support Manufacturing and Supply data and analytical products and other business interests.

What you will be doing:
- Be responsible for the development of the conceptual, logical, and physical data models in line with the architecture and platforms strategy.
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively with the M&S teams.
- Demonstrate strong expertise in one of the following functional business areas of M&S: Manufacturing, Quality or Supply Chain.

Main Responsibilities
- Design and implement business data models in line with data foundations strategy and standards (a dimensional-model sketch follows this posting).
- Work with business and application/solution teams to understand requirements, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, and analytic models.
- Perform hands-on data modeling, design, configuration, and performance tuning.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
- Bachelor’s or master’s degree in computer or data engineering, or related technical experience.
- 8+ years of hands-on relational, dimensional, and/or analytic experience, including 5+ years of hands-on experience with data from core manufacturing and supply chain systems such as SAP, Quality Management, LIMS, MES, and Planning.
- Hands-on programming experience in SQL.
- Experience with a data warehouse (Snowflake), data lake (AWS-based), and enterprise big data platforms in a pharmaceutical company.
- Good knowledge of metadata management, data modeling, and related tools: Snowflake, Informatica, dbt.
- Experience with Agile.
- Good communication and presentation skills.

Why choose us?
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks’ gender-neutral parental leave.
- Work in an international environment, collaborating with diverse business teams and vendors, in a dynamic team, fully empowered to propose and implement innovative ideas.

Pursue Progress. Discover Extraordinary.
Progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let’s pursue progress. And let’s discover extraordinary together.

At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
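Physical dimensional modeling of the kind described here typically starts with a surrogate-keyed dimension and a fact table keyed to it. A minimal Snowflake-flavored sketch issued through snowflake-connector-python; the credentials are placeholders and the manufacturing table/column names are invented for illustration.

```python
import snowflake.connector

# Placeholder credentials; the model DDL is the point of the sketch.
conn = snowflake.connector.connect(account="<acct>", user="<user>", password="<pwd>")
cur = conn.cursor()

# Conformed dimension: one row per material, surrogate-keyed so facts
# survive changes to the natural (e.g., SAP) identifier.
cur.execute("""
CREATE TABLE IF NOT EXISTS dim_material (
    material_sk   INTEGER IDENTITY PRIMARY KEY,
    material_code STRING NOT NULL,   -- natural key from the source system
    description   STRING,
    material_type STRING
)
""")

# Fact table: one row per batch release, foreign-keyed to the dimension.
cur.execute("""
CREATE TABLE IF NOT EXISTS fct_batch_release (
    material_sk  INTEGER REFERENCES dim_material (material_sk),
    release_date DATE,
    batch_id     STRING,
    quantity_kg  NUMBER(18, 3)
)
""")

cur.close()
conn.close()
```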
Posted 1 week ago
10.0 - 12.0 years
4 - 5 Lacs
Hyderābād
On-site
Job Summary
We are seeking an experienced professional for the role of TL – Media Intelligence, with a strong background in SF B2B Commerce Cloud Payment, SAP CDC CIAM for B2B, and Informatica SaaS solutions. The ideal candidate will have 10 to 12 years of experience and will work from our office with rotational shifts. This role does not require travel.

Responsibilities
- Lead the implementation and optimization of SF B2B Commerce Cloud Payment solutions to enhance business processes and customer experiences.
- Oversee the integration of SAP CDC CIAM for B2B to ensure seamless data flow and accurate customer insights.
- Provide expertise in Informatica SaaS CAI to streamline data integration and improve operational efficiency.
- Manage Informatica SaaS Business 360 to support comprehensive business data management and analytics.
- Collaborate with cross-functional teams to align media intelligence strategies with organizational goals.
- Develop and maintain documentation for system configurations and processes to ensure clarity and consistency.
- Monitor system performance and troubleshoot issues to maintain optimal functionality and user satisfaction.
- Conduct training sessions for team members to enhance their understanding and usage of implemented technologies.
- Evaluate emerging technologies and recommend improvements to current systems and processes.
- Ensure compliance with industry standards and regulations to protect company and customer data.
- Analyze data trends and provide actionable insights to drive business growth and innovation.
- Support the development of strategic plans to leverage media intelligence for competitive advantage.
- Foster a culture of continuous improvement and innovation within the team.

Qualifications
- Demonstrated expertise in SF B2B Commerce Cloud Payment, with a proven track record of successful implementations.
- Extensive experience with SAP CDC CIAM for B2B, showcasing the ability to manage complex customer data.
- Proficiency in Informatica SaaS CAI and Business 360, highlighting skills in data integration and management.
- Strong analytical skills with the ability to translate data into actionable business strategies.
- Excellent communication and collaboration abilities to work effectively with diverse teams.
- A commitment to staying updated with the latest industry trends and technologies.

Certifications Required
- Certified Salesforce B2B Commerce Cloud Specialist
- Informatica Certified Professional
Posted 1 week ago
7.0 years
0 Lacs
Hyderābād
On-site
Country/Region: IN
Requisition ID: 27524
Location: INDIA - HYDERABAD - BIRLASOFT OFFICE
Title: Azure Databricks Developer

About Us: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.

Azure Data Engineer with Databricks (7+ years)
Experience: 7+ years

Job Description:
- Experience in design, development and deployment using Azure services (Data Factory, Azure Data Lake Storage, Databricks, PySpark, SQL).
- Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity.
- Experience in creating Technical Specification Designs and Application Interface Designs.
- File processing: XML, CSV, Excel, ORC and Parquet file formats.
- Develop batch processing, streaming and integration solutions, and process structured and non-structured data (a streaming-ingestion sketch follows this posting).
- Good to have: experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies (Informatica preferred).
- Demonstrated in-depth skills with Azure Data Factory, Azure Databricks, PySpark and ADLS (must have), with the ability to configure and administer all aspects of Azure SQL DB.
- Collaborate and engage with BI & analytics and business teams.
- Deep understanding of the operational dependencies of applications, networks, systems, security and policy, both on-premises and in the cloud: VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.).
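Streaming ingestion on Databricks is commonly built with Auto Loader plus Structured Streaming. A minimal sketch, assuming it runs in a Databricks notebook (where `spark` is provided) and using hypothetical ADLS paths and table names:

```python
# Runs inside a Databricks notebook/job, where `spark` is provided.
from pyspark.sql import functions as F

# Auto Loader ("cloudFiles") incrementally discovers new files in the
# landing zone and tracks schema/progress in the metadata locations.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation",
            "abfss://meta@account.dfs.core.windows.net/schemas/events/")
    .load("abfss://landing@account.dfs.core.windows.net/events/")
)

enriched = stream.withColumn("ingested_at", F.current_timestamp())

# Each micro-batch is appended to a Delta table; exactly-once delivery
# comes from the checkpoint plus Delta's transactional commit.
(enriched.writeStream
    .option("checkpointLocation",
            "abfss://meta@account.dfs.core.windows.net/checkpoints/events/")
    .trigger(availableNow=True)          # drain new files, then stop
    .toTable("bronze.events"))
```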
Posted 1 week ago
3.0 - 7.0 years
2 - 10 Lacs
India
Remote
Job Title: ETL Automation Tester (SQL, Python, Cloud)
Location: [On-site / Remote / Hybrid – City, State or “Anywhere, USA”]
Employment Type: [Full-time / Contract / C2C / Part-time]
Note: The candidate has to work US night shifts.

Job Summary:
We are seeking a highly skilled ETL Automation Tester with expertise in SQL, Python scripting, and experience working with cloud technologies such as Azure, AWS, or GCP. The ideal candidate will be responsible for designing and implementing automated testing solutions to ensure the accuracy, performance, and reliability of ETL pipelines and data integration processes.

Key Responsibilities:
- Design and implement test strategies for ETL processes and data pipelines.
- Develop automated test scripts using Python and integrate them into CI/CD pipelines (a pytest sketch follows this posting).
- Validate data transformations and data integrity across source, staging, and target systems.
- Write complex SQL queries for test data creation, validation, and result comparison.
- Perform cloud-based testing on platforms such as Azure Data Factory, AWS Glue, or GCP Dataflow/BigQuery.
- Collaborate with data engineers, analysts, and DevOps teams to ensure seamless data flow and test coverage.
- Log, track, and manage defects through tools like JIRA, Azure DevOps, or similar.
- Participate in performance and volume testing for large-scale datasets.

Required Skills and Qualifications:
- 3–7 years of experience in ETL/data warehouse testing.
- Strong hands-on experience in SQL (joins, CTEs, window functions, aggregation).
- Proficiency in Python for automation scripting and data manipulation.
- Solid understanding of ETL tools such as Informatica, Talend, SSIS, or custom Python-based ETL.
- Experience with at least one cloud platform:
  - Azure: Data Factory, Synapse, Blob Storage
  - AWS: Glue, Redshift, S3
  - GCP: Dataflow, BigQuery, Cloud Storage
- Familiarity with data validation, data quality, and data profiling techniques.
- Experience with CI/CD tools such as Jenkins, GitHub Actions, or Azure DevOps.
- Excellent problem-solving, communication, and documentation skills.

Preferred Qualifications:
- Knowledge of Apache Airflow, PySpark, or Databricks.
- Experience with containerization (Docker) and orchestration tools (Kubernetes).
- ISTQB or similar testing certification.
- Familiarity with Agile methodologies and Scrum ceremonies.

Job Types: Part-time, Contractual / Temporary, Freelance
Contract length: 6 months
Pay: ₹18,074.09 - ₹86,457.20 per month
Expected hours: 40 per week
Benefits: Work from home
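Automated ETL tests usually assert invariants between a source query and its target. A minimal pytest sketch using SQLite stand-ins so it runs anywhere; in practice the connections would point at the real source and warehouse, and the table names here are hypothetical.

```python
import sqlite3

import pytest


@pytest.fixture
def conn():
    # In-memory stand-in for the source and target systems; a real
    # suite would open connections to the actual databases instead.
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
    """)
    yield c
    c.close()


def test_row_counts_match(conn):
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_no_missing_keys(conn):
    # Every key loaded from source must exist in the target.
    missing = conn.execute("""
        SELECT s.id FROM src_orders s
        LEFT JOIN tgt_orders t ON t.id = s.id
        WHERE t.id IS NULL
    """).fetchall()
    assert missing == []
```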
Posted 1 week ago
4.0 years
0 Lacs
Noida
Remote
Hello,

Edgesys Consulting, a US-based software consulting company, is currently seeking qualified and talented candidates for the following position. I came across your resume. Your resume seems to be a good fit; kindly review the job description below and let me know if this would be of interest.

Job Description
As a bench sales specialist, you will be primarily responsible for all activities stated below. Your main objective will be to seek clients/sub-clients for Edgesys bench candidates from job portals, your personal database, networking, and referencing. Experience must be at least 4+ years in bench sales.

Desired Candidate Profile
- Excellent knowledge of the US IT bench sales process.
- Responsible for bench sales of Edgesys candidates across various skills and expertise levels for H1/GC/US citizens.
- Will be marketing the candidates to mid-tier clients and placing them in contract positions on a corp-to-corp basis.
- Will be working independently, building a database of mid-tier clients.
- Should have a pre-set process for marketing bench candidates and also an existing database of clients/sub-clients.
- Should have excellent knowledge and understanding of the complete life cycle of bench sales/bench marketing.
- Must be able to train the team/resources to effectively market the bench consultants and get placements in the shortest time possible for senior requirements. Should be able to handle a team.
- Must be very good at marketing the bench resources, ensuring minimal or no candidates remain on the bench.
- Engage bench resources to ensure proper positioning for upcoming requirements and a better selection rate.
- Hands-on experience in the end-to-end bench sales process across various IT technologies and roles such as Project Manager, BA, QA, Java, .NET, SharePoint, PHP, Business Analyst, BI, MicroStrategy, C/C++, Informatica, Mainframes, AS/400, FileNet, Salesforce, Pega, Network/Windows Admin, Oracle Apps/DBA, MS SQL Developer/DBA, PeopleSoft, SAP and other such skills.
- Pre-screen clients, evaluating candidate compatibility with specific job requirements and ensuring the right fit before submissions to the client.
- Well versed and effective in working with various job portals, e.g. LinkedIn, Dice, Monster, Indeed, etc.
- Work efficiently with a high volume of resource requests in a competitive environment, with the ability to adapt the focus of searches as needed.
- Good oral, written, and interpersonal communication skills.
- Connecting and networking on LinkedIn, in tech groups, with employees, etc., for market intelligence and lead generation.
- Good knowledge of US geography.
- If selected by Edgesys, the candidate should not be working in any other full-time job during normal work timings.
- Should possess a personal computer with an internet connection.

Job Types: Full-time, Permanent
Pay: ₹14,506.82 - ₹80,000.00 per month
Benefits: Provident Fund, Work from home
Experience: Bench Sales: 4 years (Preferred)
Work Location: In person
Posted 1 week ago
0 years
4 - 5 Lacs
Calcutta
On-site
Ready to shape the future of work?

At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Vice President – Head of Master Data Management (MDM). In this role, you will serve as the strategic leader responsible for driving the vision, strategy, governance, and delivery of enterprise-wide Master Data Management initiatives. As the MDM Lead, you will work closely with business and technology stakeholders to ensure consistent, accurate, and trusted master data across domains such as Customer, Product, Supplier, and Finance, enabling data-driven decision-making and operational efficiency.

Responsibilities
- Define and lead the enterprise MDM strategy and roadmap in alignment with organizational goals and digital transformation initiatives.
- Establish and enforce data standards, governance frameworks, and stewardship models to ensure the quality, integrity, and security of master data.
- Oversee the design, implementation, and operations of MDM platforms across domains (e.g., Customer, Product, Supplier, Location).
- Lead the selection, deployment, and optimization of leading MDM tools such as Informatica MDM, Reltio, SAP MDG, Stibo STEP, and IBM InfoSphere MDM.
- Partner with business units and technology teams to integrate MDM into enterprise systems, including ERP, CRM, and analytics platforms.
- Drive enterprise-wide initiatives to cleanse, enrich, and harmonize master data across business functions and geographies (a matching sketch follows this posting).
- Promote a data-driven culture by embedding MDM into business processes, analytics, and digital programs.
- Lead teams of data architects, analysts, and engineers to deliver scalable and sustainable MDM solutions.
- Define KPIs and implement monitoring frameworks to track data quality, operational efficiency, and business impact.
- Provide executive-level updates on MDM performance, maturity, and contribution to business outcomes.
- Ensure MDM solutions comply with regulatory, security, and privacy standards (e.g., GDPR, HIPAA, SOX).

Qualifications We Seek in You!
Minimum Qualifications / Skills
- Bachelor's degree in Computer Science, Engineering, Information Management, or a related field (Master’s or MBA preferred).
- Experience in data management and enterprise architecture, with MDM leadership roles.
- Proven expertise in leading large-scale MDM implementations across global, complex organizations.
- Deep hands-on knowledge of top MDM platforms such as Informatica MDM, Reltio, SAP Master Data Governance, IBM InfoSphere MDM, etc.
- Experience designing and delivering multi-domain MDM architectures.
- Strong understanding of data governance, data quality management, and data lifecycle processes.
- Successful track record working across business, IT, and compliance teams.

Preferred Qualifications / Skills
- Excellent leadership, stakeholder management, and communication skills with the ability to influence senior executives.
- Familiarity with cloud-based MDM solutions and integration with platforms like AWS, Azure, Salesforce, and Snowflake.
- Knowledge of regulatory compliance and security frameworks relevant to master data.
- Experience in industries such as Banking, Insurance, Healthcare, Manufacturing, or Consumer Goods.
- Industry-recognized certifications (e.g., CDMP – Certified Data Management Professional, DAMA DMBOK).
- Understanding of Agile, DevOps, and SAFe practices for data and platform delivery.

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation.
- Make an impact – Drive change for global enterprises and solve business challenges that matter.
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Vice President
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 24, 2025, 4:28:18 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
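Harmonizing master data, as described above, hinges on matching near-duplicate records. A toy sketch using only the Python standard library; real MDM platforms such as Informatica MDM or Reltio layer survivorship rules and tuned matchers on top of this idea, and the records below are invented.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Invented customer-master records from two source systems.
records = [
    {"id": "CRM-001", "name": "Acme Corporation", "city": "Pune"},
    {"id": "ERP-778", "name": "ACME Corp.", "city": "Pune"},
    {"id": "CRM-002", "name": "Globex Ltd", "city": "Mumbai"},
]

def normalize(value: str) -> str:
    # Cheap standardization pass before comparison.
    return value.lower().replace(".", "").replace(",", "").strip()

def similarity(a: dict, b: dict) -> float:
    # Weighted blend of name and city similarity.
    name = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    city = SequenceMatcher(None, normalize(a["city"]), normalize(b["city"])).ratio()
    return 0.8 * name + 0.2 * city

# Pairs above the threshold become merge candidates for data stewards.
for a, b in combinations(records, 2):
    score = similarity(a, b)
    if score >= 0.75:
        print(f"Match candidate: {a['id']} <-> {b['id']} (score {score:.2f})")
```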
Posted 1 week ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing, for ingesting, wrangling, transforming and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks and DataProc, with strong coding skills in Python, PySpark and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes
- Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance and performance through design patterns and reusing proven solutions.
- Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
- Interpret requirements and create optimal architecture and design solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code using best standards, and debug and test solutions to ensure best-in-class quality.
- Tune the performance of code and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
- Create data schemas and models effectively.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes and data lakes.
- Validate results with user representatives, integrating the overall solution.
- Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
Code
- Develop data processing code with guidance, ensuring performance and scalability requirements are met.
- Define coding standards, templates and checklists.
- Review code for team and peers.

Documentation
- Create/review templates, checklists, guidelines and standards for design/process/development.
- Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases and results.

Configure
- Define and govern the configuration management plan. Ensure compliance from the team.

Test
- Review/create unit test cases, scenarios and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.

Domain Relevance
- Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs.
- Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.

Manage Project
- Support the Project Manager with project inputs.
- Provide inputs on project plans or sprints as needed.
- Manage the delivery of modules.

Manage Defects
- Perform defect root cause analysis (RCA) and mitigation.
- Identify defect trends and implement proactive measures to improve quality.

Estimate
- Create and provide input for effort and size estimation, and plan resources for projects.

Manage Knowledge
- Consume and contribute to project-related documents, SharePoint libraries and client universities.
- Review reusable documents created by the team.

Release
- Execute and monitor the release process.

Design
- Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components and data models.

Interface with Customer
- Clarify requirements and provide guidance to the Development Team.
- Present design options to customers. Conduct product demos.
- Collaborate closely with customer architects to finalize designs.

Manage Team
- Set FAST goals and provide feedback.
- Understand team members' aspirations and provide guidance and opportunities.
- Ensure team members are upskilled. Engage the team in projects.
- Proactively identify attrition risks and collaborate with BSE on retention measures.

Certifications
- Obtain relevant domain and technology certifications.

Skill Examples
- Proficiency in SQL, Python or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage, retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF and ADLF.
- Proficiency in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks and automation practices.

Additional Comments
We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, and experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role (a dimension-load sketch follows this posting).

Must-Have Skills
- 8+ years of hands-on experience in data engineering or big data development.
- Strong proficiency in PySpark and SQL for data transformation and pipeline development.
- Experience working in Azure Databricks or equivalent Spark-based cloud platforms.
- Practical knowledge of cloud data environments – Azure, AWS, or GCP.
- Solid understanding of data warehousing concepts, including the Kimball methodology and star/snowflake schema design.
- Proven experience designing and maintaining ETL/ELT pipelines in production.
- Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).

Skills: Azure Data Factory, Azure Databricks, PySpark, SQL
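Kimball-style dimension maintenance on Databricks is typically a Delta MERGE keyed on the natural key. A minimal sketch for a type-1 (overwrite-in-place) dimension; the table and column names are hypothetical, and `spark` is assumed to be the Databricks-provided session with the target Delta table already created.

```python
# Type-1 dimension upsert with Delta Lake's MERGE.
from delta.tables import DeltaTable

# Hypothetical staged updates arriving from the ELT pipeline.
updates = spark.createDataFrame(
    [("C001", "Acme Corp", "Pune"), ("C002", "Globex Ltd", "Mumbai")],
    ["customer_nk", "customer_name", "city"],
)

dim = DeltaTable.forName(spark, "gold.dim_customer")

# Match on the natural key: update attributes in place (type 1),
# insert rows for customers not yet in the dimension.
(dim.alias("d")
    .merge(updates.alias("u"), "d.customer_nk = u.customer_nk")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```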
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
We are seeking a talented individual to join our Metrics, Analytics & Reporting team at Marsh. This role will be based in Mumbai. This is a hybrid role that requires working at least three days a week in the office.

Senior Manager - Metrics, Analytics & Reporting (Scrum Master)

We will count on you to:
- Promote Agile principles and practices across teams, ensure Agile/Scrum concepts and principles are adhered to, and, where necessary, coach the teams in implementing and practicing Agile principles.
- Act as a bridge between development teams and stakeholders. Foster a culture of trust, collaboration, and accountability.
- Organize and facilitate Scrum ceremonies for Scrum teams.
- Track Scrum metrics, including team velocity and sprint/release progress, and communicate these internally and externally, improving transparency.
- Help and coach the product owner to establish and enforce sprint priorities and release delivery deadlines.
- Ensure business objectives are understood and achieved as per sprint commitments.
- Identify and remove obstacles to team progress. Prevent distractions that interfere with the team's ability to deliver the sprint goals, through mediation, arbitration, and mitigation, addressing impediments with the team members and the organizational hierarchy.
- Enable self-organizing, cross-functional teams.
- Ensure the DoR is met for all prioritized requirements, and encourage adherence to the DoD.
- Drive a collaborative and supportive team culture through team building and engagement practices.
- Drive continuous improvement through team retrospectives and facilitating process enhancements.
- Identify and resolve conflicts, promote constructive dialogue, and encourage innovation.
- Work closely with other Scrum Masters to align cross-team dependencies and best practices.

What you need to have:
- 6+ years of experience as a Scrum Master in a distributed Agile team, with CSM or equivalent certification.
- Solid understanding of Agile frameworks (Scrum, Kanban, SAFe, etc.).
- Proficiency in Jira/Confluence and Azure DevOps, and familiarity with different Agile practices such as Kanban/Lean.
- Proven track record of being a servant-leader in a Scrum team, driving teams, removing blockers, and improving processes through retrospectives.
- Strong facilitation, conflict resolution, and mentoring skills.
- Ability to assist technical team members and senior non-technical product owners in making appropriate decisions (stakeholder management).
- Comfort with responsibility for delivering results, and resilience to handle pressure in balancing time, quality, and scope.
- Proven ability to coach and mentor others, a positive approach to complex problems, and a can-do attitude.
- Assertive and fact-based communication, with the ability to explain technical issues to a business audience and vice versa.
- Experience as a self-starter in a rapidly evolving and ambiguous environment, continuously learning and problem-solving quickly.
- Ability to identify and articulate risks and constructively challenge assumptions.
- Strong team player with influencing and negotiation skills in a virtual/remote environment, working with customers/developers across the globe.
- Excellent communication and interpersonal skills.
- Experience working with distributed or hybrid teams.

What makes you stand out?
- Understanding of the Data Quality domain and experience in delivering KPI dashboards.
- Track record of successful Agile transformations or scaling initiatives.
- Strong analytical mindset with a data-driven approach to problem-solving.
- Exposure to solutions such as SQL, QlikView, Qlik Sense, Informatica DQ, and Power BI.
- Strong insurance and/or insurance broking business domain knowledge.
- SAFe 6 certification would be a big plus.

Why join our team:
- We help you be your best through professional development opportunities, interesting work and supportive leaders.
- We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities.
- Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Marsh, a business of Marsh McLennan (NYSE: MMC), is the world’s top insurance broker and risk advisor. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marsh.com, or follow on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one “anchor day” per week on which their full team will be together in person.

R_308504
Posted 1 week ago
3.0 - 7.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Roles and Responsibilities
- Design and develop interactive dashboards using Power BI to provide data-driven insights.
- Collaborate with stakeholders to understand business requirements and develop solutions.
- Develop and maintain databases, data models, and ETL processes to support reporting needs.
- Create reports, visualizations, and analytics to drive business decisions.
- Troubleshoot issues and optimize performance for improved efficiency.
- Work closely with cross-functional teams to ensure seamless integration of Power BI solutions.

Job Requirements
- Strong understanding of data modeling, database design, and ETL concepts.
- Proficiency in developing complex queries and writing efficient code.
- Excellent communication skills to effectively collaborate with stakeholders.
- Ability to work independently and as part of a team to deliver high-quality results.
- Strong problem-solving skills to analyze complex issues and develop creative solutions.
- Experience working with large datasets and performing data analysis to drive business insights.
Posted 1 week ago
7.0 - 12.0 years
15 - 20 Lacs
Bengaluru
Hybrid
Role & responsibilities
- 7 years of experience in modeling and business system designs.
- 5 years of hands-on experience in SQL and Informatica ETL development is a must.
- 3 years of Redshift or Oracle (or comparable database) experience with BI/DW deployments.
- Must have proven experience with STAR and SNOWFLAKE schema techniques (an aggregate-query sketch follows this posting).
- A minimum of 1 year of development experience in Python scripting is mandatory; Unix scripting is an added advantage.
- Proven track record as an ETL developer in delivering successful business intelligence developments with complex data sources.
- Strong analytical skills and enjoyment of solving complex technical problems.
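Star-schema work of the kind listed above ends up as fact-to-dimension joins with grouped aggregates. A small self-contained illustration using SQLite so the pattern runs anywhere; in a real deployment the same SQL would target Redshift or Oracle, and all the names here are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Tiny star schema: one dimension, one fact keyed to it.
conn.executescript("""
    CREATE TABLE dim_product (product_sk INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fct_sales (product_sk INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fct_sales VALUES (1, 100.0), (1, 250.0), (2, 80.0);
""")

# The canonical star-schema query shape: join facts to the dimension,
# group by a dimension attribute, aggregate the measure.
for category, total in conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fct_sales f
    JOIN dim_product d ON d.product_sk = f.product_sk
    GROUP BY d.category
"""):
    print(category, total)

conn.close()
```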
Posted 1 week ago
15.0 - 20.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Your future role Take on a new challenge and apply your engineering and project management expertise in a cutting-edge field. Youll work alongside a collaborative and dynamic team of professionals. You'll play a pivotal role in driving operational governance, supporting tender management, and executing workload conversion strategies. Day-to-day, youll work closely with teams across the business (such as site engineering leaders, sourcing, industrial, and installation representatives), develop governance methodologies, and oversee the execution of services development strategies. Youll specifically take care of managing technical scopes end-to-end (ISR-ISC) and ensuring the delivery of technically robust and cost-competitive proposals, but also developing clear and concise presentations for senior management. Well look to you for: Ensuring smooth operation of service domains by introducing and driving operational governance methodologies Services development strategy support, Tender Management Developing and maintaining systems for tracking action items and deadlines Overseeing internal dashboards and ensuring alignment of workload conversion strategies with business goals Managing a team to deliver tenders end-to-end, ensuring quality, cost, and delivery commitments Analyzing gaps between customer specifications and standard solutions to define compliance strategies Coordinating technical stakeholders and consolidating technical documentation for bid preparation Providing engineering effort estimates for bid preparations Engaging with domain teams to ensure our efforts align with the overall business goals and service delivery standards. Focusing on initiatives that yield tangible business benefits, ensuring that our services not only meet operational requirements but also enhance overall business performance. LCC governance, BCC,E2E ISR-ISC Analyze, with the support of System Application Architect(s), the gaps between the customer's specification and standard solutions and products and support BTM and Tender Leader in defining the most suitable compliance strategy. Consolidate the technical assumptions applicable to the bid Involved in performance measurements Services development strategy governance -Drive E2E governance ISR to ISC Simple /Clarity in communication/presentation to depict Services development strategy progress Data driven governance Clear digital dashboard, Action trackers, MoM OTIF, workload target, forecast, actual hours, right first time, budget adherence, diversity index, CFB score, PO and Invoices on-time. All about you We value passion and attitude over experience. Thats why we dont expect you to have every single skill. Instead, weve listed some that we think will help you succeed and grow in this role: University degree in Railway Engineering, Electronics, or Electrical and Mechanical Engineering 15+ years of experience in the engineering domain 3+ years of project management or relevant experience Experience leading cross-functional teams Knowledge of rolling stock equipment and tender/project management Proficiency in MS Office tools and BI applications Strong presentation and communication skills Capacity for managing technical risks and problem-solving Attention to detail and ability to work independently Things youll enjoy Join us on a life-long transformative journey the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. 
You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with new security standards for rail signalling
- Collaborate with transverse teams and helpful colleagues
- Contribute to innovative projects
- Utilise our flexible and inclusive working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development, through award-winning learning
- Progress towards leadership or specialized roles
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)
Posted 1 week ago
15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Title: Cloud Solutions Practice Head
Location: Hyderabad, India (Travel as Needed)
Reports To: CEO / Executive Leadership Team
Employment Type: Full-Time | Senior Leadership Role
Industry: Information Technology & Services | Cloud Solutions | AI & Digital Transformation
Join the Future of Enterprise Cloud
At BPMLinks, we are building a cloud-first future for enterprise clients across the globe. As our Cloud Solutions Practice Head, you won’t just lead a team, you’ll shape a legacy.
Position Overview: BPMLinks LLC is seeking an accomplished and visionary Cloud Solutions Practice Head to establish and lead our newly launched Cloud Solutions Practice, aligning cloud innovation with business value creation. This is a pivotal leadership role that will oversee the full spectrum of cloud consulting, engineering, cost optimization, migration, and AI/ML-enabled services across our global delivery portfolio. The ideal candidate is a cloud thought leader with deep expertise across AWS, Azure, GCP, and modern data platforms (e.g., Snowflake, Databricks, Azure Data Factory, Oracle). You will play a key role in scaling multi-cloud capabilities, building high-performing teams, and partnering with clients to drive cost efficiency, performance, security, and digital innovation.
Key Responsibilities:
🔹 Practice Strategy & Leadership
- Define and execute the vision, roadmap, and service catalog for the Cloud Solutions Practice.
- Build a world-class delivery team of cloud architects, engineers, DevOps professionals, and data specialists.
- Align the practice’s capabilities with BPMLinks’ broader business transformation initiatives.
🔹 Cloud & Data Architecture Oversight
- Lead the design and deployment of scalable, secure, cost-optimized cloud solutions on AWS, Azure, and GCP.
- Direct complex cloud and data migration programs, including: transitioning from legacy systems to Snowflake, Databricks, and BigQuery; data pipeline orchestration using Azure Data Factory, Airflow, and Informatica; and modernization of Oracle and SQL Server environments.
- Guide hybrid cloud and multi-cloud strategies across IaaS, PaaS, SaaS, and serverless architectures.
🔹 Cloud Cost Optimization & FinOps Leadership
- Architect and institutionalize cloud cost governance frameworks and FinOps best practices.
- Leverage tools like AWS Cost Explorer, Azure Cost Management, and third-party FinOps platforms.
- Drive resource rightsizing, workload scheduling, RIs/SPs adoption, and continuous spend monitoring.
🔹 Client Engagement & Solution Delivery
- Act as executive sponsor for strategic accounts, engaging CXOs and technology leaders.
- Lead cloud readiness assessments, transformation workshops, and solution design sessions.
- Ensure delivery excellence through agile governance, quality frameworks, and continuous improvement.
🔹 Cross-Functional Collaboration & Talent Development
- Partner with sales, marketing, and pre-sales teams to define go-to-market strategies and win pursuits.
- Foster a culture of knowledge sharing, upskilling, certification, and technical excellence.
- Mentor emerging cloud leaders and architects across geographies.
Cloud Services Portfolio You Will Lead:
- Cloud Consulting & Advisory: cloud readiness assessments, cloud strategy and TCO analysis; multi-cloud and hybrid cloud governance, regulatory advisory (HIPAA, PCI, SOC2)
- Infrastructure, Platform & Application Services: virtual machines, networking, containers, Kubernetes, serverless computing; app hosting, API gateways, orchestration, cloud-native replatforming
- Cloud Migration & Modernization: lift-and-shift, refactoring, legacy app migration; zero-downtime migrations and DR strategies
- Data Engineering & Modern Data Platforms: Snowflake, Databricks, BigQuery, Redshift; Azure Data Factory, Oracle Cloud, Informatica, ETL/ELT pipelines
- DevOps & Automation: CI/CD, Infrastructure-as-Code (Terraform, CloudFormation, ARM); release orchestration and intelligent environment management
- Cloud Security & Compliance: IAM, encryption, CSPM, SIEM/SOAR, compliance audits and policies
- Cost Optimization & FinOps: reserved instances, spot instances, scheduling automation; multi-cloud FinOps dashboards, showback/chargeback enablement
- AI/ML & Analytics on Cloud: model hosting (SageMaker, Vertex AI, Azure ML), RAG systems, semantic vector search; real-time analytics with Power BI, Looker, Kinesis
- Managed Cloud Services: 24/7 monitoring (NOC/SOC), SLA-driven support, patching, DR management
- Training & Enablement: certification workshops, cloud engineering training, CoE development
Required Qualifications:
- 15+ years of experience in enterprise IT and cloud solutions, with 5+ years in senior leadership roles
- Expertise in AWS, Azure, GCP (certifications preferred)
- Proven success in scaling cloud practices or large delivery units
- Hands-on experience with data platforms: Snowflake, Databricks, Azure Data Factory, Oracle
- In-depth understanding of FinOps principles, cost governance, and cloud performance tuning
- Excellent executive-level communication, strategic thinking, and client-facing presence
Preferred Qualifications:
- Experience serving clients in regulated industries (healthcare, finance, public sector)
- Strong commercial acumen with experience in pre-sales, solutioning, and deal structuring
- MBA or advanced degree in Computer Science, Engineering, or Technology Management
What We Offer:
- Opportunity to define and scale a global Cloud Practice from the ground up
- Direct influence on innovation, customer impact, and company growth
- Collaboration with a forward-thinking executive team and top-tier AI engineers
- Competitive compensation, performance-linked incentives, and potential equity
- Culture of ownership, agility, and continuous learning
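The FinOps and showback/chargeback services above usually start with visibility into spend. As a hedged illustration only: if AWS Cost and Usage Report (CUR) data has been exported to a queryable table (a common Athena setup), a monthly showback per account and service might look like the sketch below; the table name is hypothetical and the column names follow the usual CUR conventions.

```sql
-- Hypothetical monthly showback over a CUR-style billing table.
SELECT
    line_item_usage_account_id                       AS account_id,
    product_servicecode                              AS service,
    DATE_TRUNC('month', line_item_usage_start_date)  AS billing_month,
    SUM(line_item_unblended_cost)                    AS monthly_cost
FROM cur_billing_data           -- hypothetical table holding the CUR export
GROUP BY 1, 2, 3
ORDER BY monthly_cost DESC;
```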
Posted 1 week ago
6.0 years
0 Lacs
Delhi, India
Remote
Job Title: Senior Data Modeler
Experience Required: 6+ Years
Location: Remote
Employment Type: Full-time / Contract (Remote)
Domain: Data Engineering / Analytics / Data Warehousing
Job Summary: We are seeking an experienced and detail-oriented Data Modeler with a strong background in conceptual, logical, and physical data modeling. The ideal candidate will have in-depth knowledge of Snowflake architecture, data modeling best practices (Star/Snowflake schema), and advanced SQL scripting. You will be responsible for designing robust, scalable data models and working closely with data engineers, analysts, and business stakeholders.
Key Responsibilities:
1. Data Modeling:
- Design conceptual, logical, and physical data models.
- Create and maintain Star and Snowflake schemas for analytical reporting.
- Perform normalization and denormalization based on performance and reporting requirements.
- Work closely with business stakeholders to translate requirements into optimized data structures.
- Maintain data model documentation and the data dictionary.
2. Snowflake Expertise:
- Design and implement Snowflake schemas with optimal partitioning and clustering strategies.
- Perform performance tuning for complex queries and storage optimization.
- Implement Time Travel, Streams, and Tasks for data recovery and pipeline automation.
- Manage and secure data using Secure Views and Materialized Views.
- Optimize usage of Virtual Warehouses and storage costs.
3. SQL & Scripting:
- Write and maintain advanced SQL queries, including Common Table Expressions (CTEs), window functions, and recursive queries.
- Build automation scripts for data loading, transformation, and validation.
- Troubleshoot and optimize SQL queries for performance and accuracy.
- Support data migration and integration projects.
Required Skills & Qualifications:
- 6+ years of experience in Data Modeling and Data Warehouse design.
- Proven experience with the Snowflake platform (min. 2 years).
- Strong hands-on experience in Dimensional Modeling (Star/Snowflake schemas).
- Expert in SQL and scripting for automation and performance optimization.
- Familiarity with tools like Erwin, PowerDesigner, or similar data modeling tools.
- Experience working in Agile/Scrum environments.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement skills.
Preferred Skills (Nice to Have):
- Experience with ETL/ELT tools like dbt, Informatica, Talend, etc.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Familiarity with Data Governance and Data Quality frameworks.
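The Snowflake responsibilities above (Time Travel, Streams, and Tasks) map to concrete SQL features. A minimal sketch, assuming a hypothetical orders table and warehouse; all names are illustrative:

```sql
-- Time Travel: query the table as of one hour ago, or restore a copy
-- via zero-copy clone.
SELECT * FROM orders AT (OFFSET => -3600);
CREATE TABLE orders_restored CLONE orders AT (OFFSET => -3600);

-- Stream: capture row-level changes on the source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Task: run on a schedule, but only when the stream has new rows.
CREATE OR REPLACE TASK load_orders_delta
  WAREHOUSE = transform_wh          -- hypothetical virtual warehouse
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO orders_history        -- hypothetical target table
  SELECT order_id, amount, updated_at
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT'; -- consuming the stream advances its offset

ALTER TASK load_orders_delta RESUME;  -- tasks are created suspended
```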
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Impact: Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset. Execute and provide feedback on data modeling policies, procedures, processes, and standards. Assist with capturing and documenting system flow and other pertinent technical information about data, database design, and systems. Develop comprehensive data quality standards and implement effective tools to ensure data accuracy and reliability. Collaborate with various Investment Management departments to gain a better understanding of new data patterns. Collaborate with Data Analysts, Data Architects, and BI developers to ensure design and development of scalable data solutions aligning with business goals. Translate high-level business requirements into detailed technical specs.
The Minimum Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
Experience:
- 7-9 years of experience with data analytics, data modeling, and database design.
- 3+ years of coding, scripting (Python, Java, Scala), and design experience.
- 3+ years of experience with the Spark framework.
- 5+ years of experience with ELT methodologies and tools.
- 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
- Knowledge of Informatica PowerCenter and Informatica IDMC.
- Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
- Strong data analysis skills for extracting insights from financial data.
- Proficiency in reporting tools (e.g., Power BI, Tableau).
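Forming a "high integrity, high quality, clean data asset" from disparate sources typically includes deduplication in the ELT flow. A hedged sketch using a window function to keep only the latest version of each record; the schema and column names are hypothetical, chosen to fit the investment-management context:

```sql
-- Keep only the most recent row per business key when promoting
-- staged data into the curated layer (names are illustrative).
INSERT INTO curated_trades
SELECT trade_id, ticker, quantity, price, updated_at
FROM (
    SELECT s.*,
           ROW_NUMBER() OVER (
               PARTITION BY trade_id
               ORDER BY updated_at DESC
           ) AS rn
    FROM staging_trades s
) ranked
WHERE rn = 1;
```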
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job description
Schneider Electric is looking for an AWS data cloud engineer with a minimum of 5 years of experience in AWS data lake implementation. You will be responsible for creating and managing data ingestion and transformation, making data ready for consumption in the analytical layer of the data lake; managing and monitoring the data quality of the data lake using Informatica PowerCenter; and creating dashboards from the analytical layer of the data lake using Tableau or Power BI.
Your Role
We are looking for strong AWS Data Engineers who are passionate about Cloud technology. Your responsibilities are:
- Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.
- Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lake Houses using open-source and AWS tools.
- Implement data quality rules, perform data profiling to assess source data quality, identify data anomalies, and create data quality scorecards using Informatica PowerCenter.
- Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making. Interact with product owners to understand the needs for data ingestion and data quality rules.
- Adopt DevOps Practices: Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes (optional skill).
Qualifications
Your Skills and Experience
- Minimum of 3 to 5 years of experience in AWS data lake implementation.
- Minimum of 2 to 3 years of knowledge of Informatica PowerCenter.
- Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS, and AWS Step Functions.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
- Understanding of relational databases like Oracle, SQL Server, and MySQL.
- Programming Skills: Strong experience with modern programming languages such as Python and Java.
- Expertise in Data Storage Technologies: In-depth knowledge of Data Warehouse, database technologies, and Big Data ecosystem technologies such as AWS Redshift, AWS RDS, and Hadoop.
- Experience with AWS Data Lakes: Proven experience working with AWS data lakes on AWS S3 to store and process both structured and unstructured data sets.
- Expertise in developing Business Intelligence dashboards in Tableau or Power BI is a plus.
- Good knowledge of a project and portfolio management suite of tools is a plus.
- Should be well versed in Agile principles of implementation; knowledge of SAFe (Scaled Agile) principles is a plus.
About Us
Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Help us deliver solutions that ensure Life Is On everywhere, for everyone and at every moment: https://youtu.be/NlLJMv1Y7Hk. Great people make Schneider Electric a great company. 
We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success — on the job and beyond. See what our people have to say about working for Schneider Electric: https://youtu.be/6D2Av1uUrzY Our EEO statement : Schneider Electric aspires to be the most inclusive and caring company in the world, by providing equitable opportunities to everyone, everywhere, and ensuring all employees feel uniquely valued and safe to contribute their best. We mirror the diversity of the communities in which we operate and we ‘embrace different’ as one of our core values. We believe our differences make us stronger as a company and as individuals and we are committed to championing inclusivity in everything we do. This extends to our Candidates and is embedded in our Hiring Practices. You can find out more about our commitment to Diversity, Equity and Inclusion here and our DEI Policy here Schneider Electric is an Equal Opportunity Employer. It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct Primary Location : IN-Karnataka-Bangalore Schedule : Full-time Unposting Date : Ongoing
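The Schneider Electric posting above asks for data profiling and data quality scorecards. In Informatica PowerCenter these would be formal rules; as a hedged, tool-agnostic illustration, the completeness and uniqueness checks behind such a scorecard can be expressed in plain SQL (the table and column names are invented):

```sql
-- Minimal profiling of a hypothetical landed table: completeness per
-- column and duplicate-key count, the raw inputs of a quality scorecard.
-- COUNT(col) counts only non-NULL values, which drives the ratios.
SELECT
    COUNT(*)                                            AS total_rows,
    100.0 * COUNT(customer_id) / NULLIF(COUNT(*), 0)    AS customer_id_pct_filled,
    100.0 * COUNT(email)       / NULLIF(COUNT(*), 0)    AS email_pct_filled,
    COUNT(*) - COUNT(DISTINCT customer_id)              AS duplicate_key_rows
FROM raw_customers;
```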
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Panchkula, Haryana
On-site
We are seeking a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our dynamic data engineering team. As a Lead/Sr. ETL Engineer, you will play a crucial role in designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. Your expertise in ETL tools, cloud platforms, scripting, and data modeling principles will be pivotal in building efficient, scalable, and reliable data solutions for enterprise-level implementations.
Key Skills:
- Proficiency in ETL tools such as SSIS, DataStage, Informatica, or Talend.
- In-depth understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, and Fact & Dimension tables.
- Strong experience with relational databases like SQL Server, Oracle, Teradata, DB2, or MySQL.
- Solid scripting/programming skills in Python.
- Hands-on experience with cloud platforms like AWS or Azure.
- Knowledge of middleware architecture and enterprise data integration strategies.
- Familiarity with reporting/BI tools such as Tableau and Power BI.
- Ability to write and review high- and low-level design documents.
- Excellent communication skills and the ability to work effectively with cross-cultural, distributed teams.
Roles and Responsibilities:
- Design and develop ETL workflows and data integration strategies.
- Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions.
- Coach and mentor junior engineers to support skill development and performance.
- Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
- Participate in planning, estimations, and recruitment activities.
- Work on multiple projects simultaneously, ensuring quality and consistency in delivery.
- Experience in Sales and Marketing data domains.
- Strong problem-solving abilities with a data-driven mindset.
- Ability to work independently and collaboratively in a fast-paced environment.
- Prior experience in global implementations and managing multi-location teams is a plus.
If you are a passionate Lead/Sr. ETL Engineer looking to make a significant impact in a dynamic environment, we encourage you to apply for this exciting opportunity. Thank you for considering a career with us. We look forward to receiving your application! For further inquiries, please contact us at careers@grazitti.com. Location: Panchkula, India.
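The fact-and-dimension work described above usually centers on incremental loads. A hedged sketch of an upsert from staging into a fact table using ANSI MERGE (supported by SQL Server, Oracle, and most warehouses); all table and column names are hypothetical:

```sql
-- Incremental fact load: update changed rows, insert new ones.
MERGE INTO fact_orders tgt
USING stg_orders src
    ON (tgt.order_id = src.order_id)
WHEN MATCHED THEN UPDATE SET
    tgt.amount     = src.amount,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, customer_key, amount, updated_at)
    VALUES (src.order_id, src.customer_key, src.amount, src.updated_at);
```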
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
Kolkata, West Bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY-Consulting Oracle Analytics Cloud Senior Consultant
The opportunity
We're looking for a Staff Consultant with expertise in Oracle Analytics Cloud to join the EA group of our consulting team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of a new service offering. We are seeking an experienced and motivated engineer with a strong track record in Oracle Analytics Cloud, business analytics, and data warehousing to join our team, providing deep technical expertise in Analytics, Business Intelligence, Data Warehouse, ETL, and the power & utility sector. Our engineers work closely with external clients, presales, other architects, and internal teams to design, build, and enable solutions on different analytics platforms. This role demands a highly technical, extremely hands-on cloud engineer who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on Oracle Analytics, ETL, and Data Warehouse. The ideal candidate must have a good understanding of the value of data and analytics and proven experience in delivering solutions to different lines of business and technical leadership. Taking a top-to-bottom approach, the candidate will engage with a customer to discover business problems and goals, and develop solutions using different cloud services.
Your Key Responsibilities
- Expertise in Oracle's analytics offerings: Oracle Analytics Cloud, Data Visualization, OBIEE, Fusion Analytics for Warehouse
- Solution design skills to provide expertise and guide customers for specific needs
- Being extremely hands-on with Analytics and Data Warehousing report/solution development
- Delivering PoCs tailored to customers' needs
- Running and delivering customer hands-on workshops
- Interacting with all roles at the customer, including executives, architects, technical staff, and business representatives
- Building effective relationships, customer focus, effective communication, and coaching
Skills And Attributes For Success
- Primary focus on developing customer solutions using Oracle's analytics offerings: Oracle Analytics Cloud, Data Visualization, Fusion Analytics for Warehouse, OBIEE, OBIA, etc.
- Must have extensive hands-on/end-to-end implementation experience using OAC/OBIEE and BI Publisher.
- Knowledge of the development of the Oracle BI Repository (RPD).
- Experience in configuring OBIEE/OAC security (authentication and authorization; object-level and data-level security) as well as tuning reports.
- Experience working with session and repository variables and initialization blocks to streamline administrative tasks and modify metadata content dynamically.
- Experience with report performance optimization.
- Experience in developing dimensional hierarchies and adding multiple sources to business model objects.
- Solid knowledge of data extraction using SQL.
- Good knowledge of Oracle Applications (Oracle E-Business Suite or Oracle ERP, Oracle HCM - the Oracle cloud SaaS offering) is preferable.
- Deep knowledge of Database, Cloud Concepts, Autonomous Data Warehouse (ADW), and Data Integration tools such as ODI, Informatica, etc. is an added advantage.
To qualify for the role, you must have
- 2-7 years of Data Warehousing and Business Intelligence project experience
- 2-7 years of project experience with OBIEE
- At least 2-7 years of OAC implementation experience
- Worked on Financial, SCM, or HR Analytics
Ideally, you'll also have
- Experience engaging with business partners and IT to understand requirements from various parts of an organization to drive the design, programming execution, and UAT for future state capabilities within the platform
- Experience working in a fast-paced and dynamic environment while managing multiple projects and strict deadlines
- A good understanding of outsourcing and offshoring, and building win/win strategies and contracts with suppliers
- Experience with other data visualization tools like Power BI or Tableau would be a plus
- Experience or good knowledge of Oracle Applications (Oracle CC&B, Oracle MDM, etc.)
- Integration development to/from other systems
What We Look For
- Consulting experience, including assessments and implementations
- Documenting requirements and processes (e.g., process flows)
- Working collaboratively in a team environment
- Excellent oral and written communication skills
- Strong analytical and problem-solving skills
- B.E. / B.Tech / Master's degree required
What Working At EY Offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
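The session-variable and data-level-security items above are typically driven by initialization blocks whose data source is a simple SQL query. A hedged sketch: the mapping table is hypothetical, while ':USER' is the placeholder OBIEE/OAC substitutes with the logged-in user in init-block SQL:

```sql
-- Init-block query for a row-wise session variable (e.g. ALLOWED_REGIONS)
-- used to filter data at query time for the logged-in user.
SELECT region_code
FROM   user_region_map            -- hypothetical security mapping table
WHERE  UPPER(user_login) = UPPER(':USER');
```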
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
At PwC, we specialize in providing consulting services for a variety of business applications to help clients optimize their operational efficiency. As a Salesforce consulting generalist, you will have a broad range of consulting skills and experience across various Salesforce applications. Your role will involve analyzing client needs, implementing software solutions, and offering training and support for effective utilization of Salesforce applications, enabling clients to achieve their strategic objectives. You will be a reliable and contributing member of a team, adapting to working with a variety of clients and team members in a fast-paced environment. Your curiosity and willingness to learn will be key, as every experience presents an opportunity for growth. You will be expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. Building a brand for yourself within the Firm will open doors to more opportunities.
Key Responsibilities:
- Analyze client needs and implement software solutions for Salesforce applications
- Provide training and support for effective utilization of Salesforce applications
- Assist clients in optimizing operational efficiency and achieving their strategic objectives
- Adapt to working with different clients and team members in a fast-paced environment
- Take ownership of tasks and consistently deliver high-quality work
- Build a personal brand within the Firm to create more opportunities for growth
Skills and Qualifications:
- 4+ years of total IT experience, with 4+ years of SFDC experience
- Extensive experience on the Force.com platform using Apex and Visualforce
- Solid implementation experience using Sales / Service / Custom cloud
- Proficiency in HTML, CSS, Ajax, JavaScript, and jQuery
- Experience with Field Service Lightning tool configuration
- Hands-on customization experience in Apex, Visualforce, Workflow/Process Builder, Triggers, Batch, and Schedule Apex
- Additional desired skills include knowledge of object-oriented programming, Bootstrap, AngularJS, Lightning design components, marketing tools like Marketing Cloud, and products like Apttus, Veeva, nCino, and Adobe Flex
- Ability to handle data management, understand technical processes strategically, and recommend appropriate solutions
- Enthusiastic about code honesty, modularity, cleanliness, and version control
- Familiarity with integration platforms and ability to translate customer requirements into functional Salesforce configurations
Educational Qualifications: BE / B Tech / MCA / M.Sc / M.E / M.Tech
Job Title: Salesforce Lightning, LWC Developer
Job Level: Sr. Associate
Posted 1 week ago
6.0 years
12 - 18 Lacs
Delhi, India
Remote
Skills: Data Modeling, Snowflake, Schemas, Star Schema Design, SQL, Data Integration
Job Title: Senior Data Modeler
Experience Required: 6+ Years
Location: Remote
Employment Type: Full-time / Contract (Remote)
Domain: Data Engineering / Analytics / Data Warehousing
Job Summary
We are seeking an experienced and detail-oriented Data Modeler with a strong background in conceptual, logical, and physical data modeling. The ideal candidate will have in-depth knowledge of Snowflake architecture, data modeling best practices (Star/Snowflake schema), and advanced SQL scripting. You will be responsible for designing robust, scalable data models and working closely with data engineers, analysts, and business stakeholders.
Key Responsibilities
Data Modeling:
- Design conceptual, logical, and physical data models.
- Create and maintain Star and Snowflake schemas for analytical reporting.
- Perform normalization and denormalization based on performance and reporting requirements.
- Work closely with business stakeholders to translate requirements into optimized data structures.
- Maintain data model documentation and the data dictionary.
Snowflake Expertise:
- Design and implement Snowflake schemas with optimal partitioning and clustering strategies.
- Perform performance tuning for complex queries and storage optimization.
- Implement Time Travel, Streams, and Tasks for data recovery and pipeline automation.
- Manage and secure data using Secure Views and Materialized Views.
- Optimize usage of Virtual Warehouses and storage costs.
SQL & Scripting:
- Write and maintain advanced SQL queries, including Common Table Expressions (CTEs), window functions, and recursive queries.
- Build automation scripts for data loading, transformation, and validation.
- Troubleshoot and optimize SQL queries for performance and accuracy.
- Support data migration and integration projects.
Required Skills & Qualifications
- 6+ years of experience in Data Modeling and Data Warehouse design.
- Proven experience with the Snowflake platform (min. 2 years).
- Strong hands-on experience in Dimensional Modeling (Star/Snowflake schemas).
- Expert in SQL and scripting for automation and performance optimization.
- Familiarity with tools like Erwin, PowerDesigner, or similar data modeling tools.
- Experience working in Agile/Scrum environments.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement skills.
Preferred Skills (Nice To Have)
- Experience with ETL/ELT tools like dbt, Informatica, Talend, etc.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Familiarity with Data Governance and Data Quality frameworks.
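Since the advanced-SQL list above names CTEs, window functions, and recursive queries, here is one hedged sketch combining all three over a hypothetical employee hierarchy (all names invented for illustration; Snowflake and PostgreSQL both accept this form):

```sql
-- Recursive CTE: walk a manager/employee hierarchy from the top,
-- then rank each manager's reports with a window function.
WITH RECURSIVE org_chart AS (
    SELECT employee_id, manager_id, 1 AS depth
    FROM employees
    WHERE manager_id IS NULL              -- anchor: top of the hierarchy
    UNION ALL
    SELECT e.employee_id, e.manager_id, oc.depth + 1
    FROM employees e
    JOIN org_chart oc ON e.manager_id = oc.employee_id
)
SELECT employee_id, manager_id, depth,
       ROW_NUMBER() OVER (
           PARTITION BY manager_id
           ORDER BY employee_id
       ) AS sibling_rank
FROM org_chart;
```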
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
Role Title: Middleware Integration Associate
Job Level: Associate
Department: Middleware Integration
Location: As applicable
Reports to: Delivery Manager / Project Lead
Role Summary: The Associate role in Middleware Integration is responsible for supporting the design, development, and maintenance of integration solutions using SAP PI/PO, Informatica, SLT, and BODS. The Associate will assist in building and monitoring interfaces across SAP and non-SAP systems, under the guidance of senior team members. This role is ideal for individuals with a foundational understanding of integration tools who are eager to grow their technical expertise in enterprise data integration.
Key Responsibilities:
- Support the development of middleware interfaces using SAP PI/PO and Informatica.
- Assist in ETL operations using BODS and SLT for real-time and batch data movement.
- Monitor interface performance and raise alerts for anomalies or failures.
- Document technical specifications, test cases, and support procedures.
- Participate in unit testing and coordinate with QA for defect resolution.
- Collaborate with application teams to understand integration requirements.
- Support incident management and troubleshooting efforts.
- Work under the guidance of senior developers and managers to learn and grow.
Qualifications And Skills:
- 1-3 years of experience in middleware or data integration roles.
- Basic understanding of SAP PI/PO and Informatica.
- Exposure to ETL tools such as SAP SLT and BODS.
- Familiarity with XML, IDoc, and REST/SOAP-based services.
- Strong analytical and problem-solving skills.
- Good communication and willingness to learn enterprise systems.
Organization Fit: The ideal candidate is detail-oriented, proactive, and collaborative. They thrive in a dynamic team environment and are passionate about building scalable and reliable integration systems. Adaptability, continuous learning, and effective communication are key traits for success in this role.
Posted 1 week ago