
5614 Informatica Jobs - Page 47


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineering Subject Matter Expert (SME)
Location: Dubai, UAE (Hybrid/Onsite)
Experience: 10+ years in Data Engineering and ETL with proven leadership and solution delivery experience

Job Summary
We are seeking a seasoned Data Engineering SME with strong experience in data platforms, ETL tools, and cloud technologies. The ideal candidate will lead the design and implementation of enterprise-scale data solutions, provide strategic guidance on data architecture, and play a key role in data migration, data quality, and performance tuning initiatives. This role demands a mix of deep technical expertise, project management, and stakeholder communication.

Key Responsibilities
- Lead the design, development, and deployment of robust, scalable ETL pipelines and data solutions.
- Provide technical leadership and SME support for data engineering teams across multiple projects.
- Collaborate with cross-functional teams including Data Analysts, BI Developers, Product Owners, and IT to gather requirements and deliver data products.
- Design and optimize data workflows using tools such as IBM DataStage, Talend, Informatica, and Databricks.
- Implement data integration solutions for structured and unstructured data across on-premise and cloud platforms.
- Conduct performance tuning and optimization of ETL jobs and SQL queries.
- Oversee data quality checks, data governance compliance, and PII data protection strategies.
- Support and mentor team members on data engineering best practices and agile methodologies.
- Analyze and resolve production issues in a timely manner.
- Contribute to enterprise-wide data transformation strategies, including legacy-to-digital migration using Spark, Hadoop, and cloud platforms.
- Manage stakeholder communications and provide regular status reports.

Required Skills and Qualifications
- Bachelor's degree in Engineering, Computer Science, or a related field (MTech in Data Science is a plus).
- 10+ years of hands-on experience in ETL development and data engineering.
- Strong proficiency with tools: IBM DataStage, Talend, Informatica, Databricks, Power BI, Tableau.
- Strong SQL, PL/I, Python, and Unix Shell scripting skills.
- Experience with cloud platforms like AWS and modern big data tools like Hadoop and Spark.
- Solid understanding of data warehousing, data modeling, and data migration practices.
- Experience working in Agile/Scrum environments.
- Excellent problem-solving, communication, and team collaboration skills.
- Scrum Master or Product Owner certifications (CSM, CSPO) are a plus.

(ref:hirist.tech)

Posted 3 weeks ago


10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Full Stack Developer with a focus on Database (DB) and ETL experience, you will be joining an ambitious and ground-breaking project at DWS in Pune, India. Project Proteus aims to transform DWS into a standalone Asset Management operating platform by moving infrastructure and corporate functions to the cloud. This role offers a unique opportunity to contribute to the strategic future-state technology landscape for DWS Corporate Functions globally.

Your responsibilities will include designing, developing, and implementing custom solutions, extensions, and integrations in a cloud-first environment. You will work closely with functional teams and business stakeholders, supporting US business stakeholders and regulatory reporting processes during US hours. The role is largely technical but also includes team handling, mentoring, and status reporting.

To excel in this role, you should have a Bachelor's degree with a concentration in Science or an IT-related discipline, along with a minimum of 10 years of IT industry experience. Proficiency in Informatica or any ETL tool, hands-on experience with Oracle SQL/PL SQL, and exposure to PostgreSQL, cloud/Big Data technology, CI/CD tools like TeamCity or Jenkins, GitHub, UNIX commands, and the Control-M scheduling tool are essential. Experience working in an Agile/Scrum software development environment is preferred.

Your key responsibilities will encompass creating software designs, hands-on code development, testing, mentoring junior team members, code reviews, managing daily stand-up meetings, articulating issues to management, analyzing and fixing software defects, and collaborating with stakeholders and other teams. Excellent analytical capabilities, communication skills, problem-solving abilities, and attention to detail will be crucial for success in this role.

Nice-to-have skills include exposure to PySpark, React JS, or Angular JS, as well as experience in automating the production build process using scripts. Training, development, coaching, and support from experts in your team will be provided to help you excel in your career within a culture of continuous learning and progression at Deutsche Bank Group.

For more information about Deutsche Bank and its teams, please visit our company website at https://www.db.com/company/company.htm. We promote a positive, fair, and inclusive work environment and welcome applications from all individuals who share our values and vision.

Posted 3 weeks ago


2.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. We're counting on your unique voice and perspective to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.

As a Staff Consultant specializing in Oracle Analytics Cloud, you will be part of the EA group of our consulting team at EY. This role offers a chance to work with a leading firm and play a key role in the growth of a new service offering. We are looking for an experienced and motivated engineer with a strong background in Oracle Analytics Cloud, business analytics, and data warehousing. Your responsibilities will include providing technical expertise in Analytics, Business Intelligence, Data Warehouse, ETL, and the power & utility sector. You will collaborate closely with external clients, presales, architects, and internal teams to design, build, and implement solutions on various analytics platforms.

The ideal candidate for this role is a highly technical and hands-on cloud engineer who will collaborate with EY Partners and external clients to develop new business opportunities and drive initiatives related to Oracle Analytics, ETL, and Data Warehouse. You must have a deep understanding of the value of data and analytics, along with a proven track record of delivering solutions to different lines of business and technical leadership. Your role will involve engaging with customers to identify business problems and goals, and developing solutions using a range of cloud services.

Key Responsibilities:
- Expertise in Oracle's analytics offerings, including Oracle Analytics Cloud, Data Visualization, OBIEE, and Fusion Analytics for Warehouse
- Solution design skills to guide customers for their specific needs
- Hands-on experience in Analytics and Data Warehousing report/solution development
- Delivering PoCs tailored to customers' requirements
- Conducting customer hands-on workshops
- Building effective relationships with customers at all levels

Skills and Attributes for Success:
- Focus on developing customer solutions using Oracle's analytics offerings
- Exposure to other BI tools like Power BI or Tableau
- Familiarity with cloud environments like Azure or AWS, or experience with ETL tools, is advantageous
- Extensive hands-on experience with OAC/OBIEE and BI Publisher
- Knowledge of developing the Oracle BI Repository (RPD) and configuring OBIEE/OAC security
- Experience in report performance optimization, Dimensional Hierarchies, and data extraction using SQL
- Good understanding of Oracle Applications, such as Oracle E-Business Suite or Oracle ERP
- Knowledge of databases, cloud concepts, and data integration tools like ODI and Informatica

Qualifications:
- 2-5 years of experience in Data Warehousing and Business Intelligence projects
- 2-5 years of project experience with OBIEE
- At least 2 years of OAC implementation experience
- Experience working on Financial, SCM, or HR Analytics

Preferred Qualifications:
- Experience in engaging with business partners and IT for design and programming execution
- Ability to work in a fast-paced environment with multiple projects and strict deadlines
- Understanding of outsourcing and offshoring, with experience in building strategies with suppliers
- Familiarity with data visualization tools like Power BI or Tableau
- Knowledge of Oracle Applications like Oracle CC&B and Oracle MDM
- Experience in integration development with other systems

What We Offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- Freedom and flexibility to shape your role according to your preferences

At EY, we are dedicated to building a better working world by creating long-term value for clients, people, and society. Our diverse teams across 150 countries provide trust through assurance and help clients grow, transform, and operate. Join us in our mission to ask better questions and find new answers for the complex issues facing our world today.

Posted 3 weeks ago


12.0 - 16.0 years

0 Lacs

Hyderabad, Telangana

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are currently seeking Snowflake professionals with at least 12+ years of experience in the following areas:
- Strong communication and proactive skills, with the ability to lead conversations
- Experience architecting and delivering solutions on AWS
- Hands-on experience with cloud warehouses like Snowflake
- Strong knowledge of data integrations, data modeling (Dimensional & Data Vault), and visualization practices
- Good understanding of data management (Data Quality, Data Governance, etc.)
- Zeal to pick up new technologies, conduct PoCs, and present PoVs

Technical Skills (strong experience in at least one item in each category):
- Cloud: AWS
- Data Integration: Qlik Replicate, SnapLogic, Matillion & Informatica
- Visualization: Power BI & ThoughtSpot
- Storage & Databases: Snowflake, AWS

Certifications in Snowflake and SnapLogic would be considered a plus.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 3 weeks ago


3.0 - 8.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a candidate for this role, you should have 3-8 years of total experience and a strong understanding of Salesforce development. Your responsibilities will include:
- Demonstrating proficient knowledge of Apex, SOQL, and SOSL to develop custom solutions.
- Utilizing asynchronous Apex for integrations, employing the Batchable, Queueable, and Schedulable interfaces.
- Adhering to Salesforce Governor Limits and implementing best practices in Apex programming.
- Writing custom REST web services and SOAP services to enhance system functionality.
- Familiarity with various Salesforce standard APIs and their utilization in integration tools like MuleSoft and Informatica.
- Experience in developing Lightning Web Components (LWC) using Lightning base components and SLDS for styling.
- Comprehensive understanding of Salesforce object-level and field-level security, and the record-sharing security model.
- Creation of custom user interfaces, including Visualforce pages, Aura components, and Lightning Web Components.
- Integration of Salesforce with other systems using Salesforce APIs.
- Developing Apex classes, triggers, and Visualforce pages based on specific business requirements.
- Collaborating with Salesforce Administrators to validate business requirements and ensure considerations such as security, scalability, and limits are met.
- Adhering to Salesforce best practices, maintaining code documentation, and writing/maintaining test classes for all custom development.

Your expertise in Salesforce development and integration will play a crucial role in extending Salesforce to meet the organization's business requirements effectively.
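The integration work via Salesforce standard APIs mentioned above can be illustrated with a minimal Python sketch that builds the URL for Salesforce's standard REST query resource; the instance URL, API version, and SOQL string below are hypothetical placeholders, and a real request would also need an OAuth bearer token:

```python
from urllib.parse import urlencode


def build_soql_query_url(instance_url: str, api_version: str, soql: str) -> str:
    """Build the URL for Salesforce's REST query resource.

    The /services/data/vXX.X/query endpoint takes a SOQL statement in the
    `q` query parameter; an actual request would carry an OAuth bearer token
    in the Authorization header.
    """
    return f"{instance_url}/services/data/v{api_version}/query?{urlencode({'q': soql})}"


# Hypothetical instance and query, for illustration only.
url = build_soql_query_url(
    "https://example.my.salesforce.com",
    "58.0",
    "SELECT Id, Name FROM Account LIMIT 5",
)
print(url)
# https://example.my.salesforce.com/services/data/v58.0/query?q=SELECT+Id%2C+Name+FROM+Account+LIMIT+5
```

Sending the request (for example with urllib.request plus an `Authorization: Bearer <token>` header) and following the response's `nextRecordsUrl` for pagination would use the same resource conventions.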

Posted 3 weeks ago


3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office

Summary
We are seeking a highly skilled and experienced Marketing Cloud tester to join our Marketing Automation team, which works closely with brand teams, understands various data sources, is adept at building data ingestion pipelines, and is skilled in testing end-to-end data ingestion layers, data models, and visualization dashboards based on previously built test scripts.

About the Role

Key Responsibilities:
- Build end-to-end test scripts for each release based on user epics across the data value chain: ingestion, data model, and visualization
- Post development, run the test scripts using testing platforms such as Proton
- Document results, highlight any bugs/errors to the development team, and work closely with the development team to resolve the issues
- Audit technical developments and solutions, and validate matching of source data with MCI
- Create and update knowledge documents in the repository as needed
- Work closely with the Technical Lead and Business Analysts to help design the testing strategy and testing design as part of pre-build activities
- Participate in data exploration and data mapping activities, along with the technical lead, business, and DDIT architects, for any new data ingestion needs from the business and the development team
- Build and maintain standard SOPs to run smooth operations that enable proper upkeep of visualization data and insights

Qualifications:
- Minimum of 3-4 years of hands-on development experience in Dataroma/MCI
- Prior experience as a core developer in visualization platforms such as Tableau, Qlik, or Power BI is a plus
- Experience working on Data Cloud and other data platforms is a plus
- Hands-on experience using ETL tools such as Informatica, Alteryx, or Dataiku preferred
- Prior experience with test automation platforms preferred
- Excellent written and verbal skills; strong interpersonal and analytical skills
- Ability to provide efficient, timely, reliable, and courteous service to customers, and to effectively present information
- Demonstrated knowledge of the Data Engineering & Business Intelligence ecosystem
- Salesforce MCI certification
- Familiarity with AppExchange deployment, Flow, Aura components, and Lightning Web Components will be a plus

Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally.
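The source-versus-MCI validation described in this listing amounts to a reconciliation check between two datasets. A minimal Python sketch, with invented field names and sample data, might look like this:

```python
def reconcile(source_rows, target_rows, key):
    """Compare two datasets by a key field and report mismatches.

    Returns keys present only in the source, only in the target, and in
    both (the matched set is a starting point for field-level checks).
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "unexpected_in_target": sorted(set(tgt) - set(src)),
        "matched": sorted(set(src) & set(tgt)),
    }


# Hypothetical sample data: contacts in a source system vs. the target.
source = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@x.com"}]
target = [{"id": 2, "email": "b@x.com"}, {"id": 3, "email": "c@x.com"}]
report = reconcile(source, target, "id")
print(report)
# {'missing_in_target': [1], 'unexpected_in_target': [3], 'matched': [2]}
```

In practice a test script would log this report and raise a defect when either mismatch list is non-empty.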

Posted 3 weeks ago


3.0 - 7.0 years

5 - 9 Lacs

Kochi, Chennai, Thiruvananthapuram

Work from Office

" Sql,Data Analysis,Ms Excel,Dashboards ","description":" Experience Range : 3 to 7 years Hiring Locations : Chennai, Trivandrum, Kochi Role Description The role demands proficiency in designing and developing robust data pipelines using ETL tools and programming languages such as Python, PySpark, and SQL. The candidate will be responsible for coding, testing, and implementing data ingestion and transformation pipelines. Additionally, the role includes L1 Data Operations responsibilities, such as monitoring dashboards, identifying anomalies, and escalating issues as per standard operating procedures. Key Responsibilities Data Pipeline Development : Independently develop, test, and implement data processing pipelines. Use tools like Informatica, Glue, Databricks, and DataProc . Write clean, scalable, and optimized code in Python, PySpark, and SQL . Conduct thorough unit testing to ensure data accuracy and pipeline stability. Create clear documentation and maintain project artifacts. Data Operations (L1 Monitoring) : Monitor data pipelines, dashboards, and databases on a shift basis (24x7 support including night shifts). Identify, log, and escalate anomalies or failures using SOPs and runbooks. Execute basic SQL queries for data validation and issue resolution. Collaborate with L2\/L3 teams for escalation and root cause analysis. Maintain logs of incidents and escalations. Additional Responsibilities : Adhere to project timelines, SLAs, and compliance standards. Participate in estimation of effort and timelines for assigned work. Obtain foundational certifications in cloud platforms (Azure, AWS, or GCP). Contribute to knowledge management, documentation repositories, and release management processes. 
Mandatory Skills Proficiency in Python, PySpark, and SQL Experience with ETL tools like Informatica, AWS Glue, Databricks, or DataProc Strong understanding of data pipeline design and data wrangling Hands-on experience in cloud platforms \u2013 AWS, Azure, or GCP (especially with data services) Knowledge of data schemas, transformations, and models Strong ability to debug and test data processes and troubleshoot issues Good to Have Skills Familiarity with Apache Airflow, Talend, Azure ADF, or GCP DataFlow Certification in Azure\/AWS\/GCP data services Experience in production monitoring , L1 data ops support, and incident escalation Exposure to windowing functions in SQL and advanced Excel analysis Knowledge of Agile\/Scrum development processes Soft Skills Strong written and verbal communication skills Excellent analytical and problem-solving ability Ability to work independently with minimal supervision Keen attention to detail and precision in monitoring tasks Collaboration and coordination skills for working with cross-functional support teams Ability to multitask and remain calm in high-pressure, fast-paced environments Willingness to work in 24x7 shift schedules , including night shifts as required ","
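The combination of L1 monitoring, SQL validation, and windowing functions described in this listing can be illustrated with a small self-contained sketch. SQLite stands in here for whichever warehouse is actually used, and the table, values, and 50% drop threshold are invented for illustration:

```python
import sqlite3

# In-memory database with a toy table of daily pipeline run statistics.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (pipeline TEXT, run_ts TEXT, row_count INTEGER)")
conn.executemany(
    "INSERT INTO runs VALUES (?, ?, ?)",
    [
        ("orders", "2024-01-01", 1000),
        ("orders", "2024-01-02", 1010),
        ("orders", "2024-01-03", 120),  # sudden drop worth escalating
    ],
)

# Use the LAG() window function to compare each run with the previous one.
rows = conn.execute(
    """
    SELECT run_ts, row_count,
           LAG(row_count) OVER (PARTITION BY pipeline ORDER BY run_ts) AS prev_count
    FROM runs
    ORDER BY run_ts
    """
).fetchall()

# Flag runs whose volume fell below half of the previous run's volume.
for run_ts, count, prev in rows:
    if prev is not None and count < prev * 0.5:
        print(f"{run_ts}: row count dropped from {prev} to {count}")
```

An L1 operator would log such an anomaly and escalate it per the runbook rather than attempt a fix.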

Posted 3 weeks ago


4.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

" Jubilant Bhartia Group Jubilant Bhartia Group is a global conglomerate founded by Mr. Shyam S Bhartia and Mr. Hari S Bhartia with strong presence in diverse sectors like Pharmaceuticals, Contract Research and Development Services, Proprietary Novel Drugs, Life Science Ingredients, Agri Products, Performance Polymers, Food Service (QSR), Food, Auto, Consulting in Aerospace and Oilfield Services. Jubilant Bhartia Group has four flagships Companies- Jubilant Pharmova Limited, Jubilant Ingrevia Limited, Jubilant FoodWorks Limited and Jubilant Industries Limited. Currently the group has a global workforce of around 43,000 employees. Jubilant Pharmova Limited Jubilant Pharmova Limited (formerly Jubilant Life Sciences Limited) is a company with global presence that is involved in Radiopharma, Allergy Immunotherapy, CDMO Sterile Injectables, Contract Research Development and Manufacturing Organisation (CRDMO), Generics and Proprietary Novel Drugs businesses. In the Radiopharma business, the Company is involved in manufacturing and supply of Radiopharmaceuticals with a network of 46 radio-pharmacies in the US. The Company s Allergy Immunotherapy business is involved in the manufacturing and supply of allergic extracts and venom products in the US and in some other markets such as Canada, Europe and Australia. Jubilant through its CDMO Sterile Injectables business offers manufacturing services including sterile fill and finish injectables (both liquid and lyophilization), full-service ophthalmic offer (liquids, ointments & creams) and ampoules. The CRDMO business of the Company includes the Drug Discovery Services business that provides contract research and development services through two world-class research centres in Bangalore and Noida in India and the CDMO-API business that is involved in the manufacturing of Active Pharmaceutical Ingredients. 
Jubilant Therapeutics is involved in Proprietary Novel Drugs business and is an innovative biopharmaceutical company developing breakthrough therapies in the area of oncology and autoimmune disorders. The company operates six manufacturing facilities that cater to all the regulated market including USA, Europe and other geographies. The Position Organization- Jubilant Pharmova Limited Designation - Data Analyst Location- Noida Job Summary: We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal incumbent will be responsible for collecting, processing, and analysing large datasets to uncover insights that drive strategic decision-making. You will work closely with cross-functional teams to identify trends, create visualizations, and deliver actionable recommendations that support business goals Key Responsibilities. Drive business excellence by identifying opportunities for process optimization, automation, and standardization through data insights. Design, develop, and maintain robust ETL pipelines and SQL queries to ingest, transform, and load data from diverse sources. Build and maintain Excel-based dashboards, models, and reports; automate repetitive tasks using Excel macros, Power Query, or scripting tools. Ensure data quality, integrity, and consistency through profiling, cleansing, validation, and regular monitoring. Translate business questions into analytical problems and deliver actionable insights using statistical techniques and data visualization tools. Collaborate with cross-functional teams (e.g., marketing, finance, operations) to define data requirements and address business challenges. Develop and implement efficient data collection strategies and systems to optimize accuracy and performance. Monitor and troubleshoot data workflows, resolving issues and ensuring compliance with data privacy and security regulations. 
Document data processes, definitions, and business rules to support transparency, reuse, and continuous improvement. Support continuous improvement initiatives by providing data-driven recommendations that enhance operational efficiency and decision-making. Contribute to the development and implementation of best practices in data management, reporting, and analytics aligned with business goals . Person Profile . Qualification - Bachelor s / Master s degree in Computer Science, Information Systems, Statistics, or a related field. Experience 4-6-Years. Desired Certification & Must Have- 4 6 years of experience in data analysis, preferably in the pharmaceutical industry. Advanced proficiency in SQL (joins, CTEs, window functions, optimization) and expert-level Excel skills (pivot tables, advanced formulas, VBA/macros). Strong understanding of data warehousing, relational databases, and ETL tools (e.g., SSIS, Talend, Informatica). Proficiency in data visualization tools (e.g., Power BI, Tableau) and statistical analysis techniques. Solid analytical and problem-solving skills with attention to detail and the ability to manage complex data sets and multiple priorities. Excellent communication and documentation skills to convey insights to technical and non-technical stakeholders. Familiarity with data modelling, database management, and large-scale data manipulation and cleansing. Demonstrated ability to work collaboratively in Agile/Scrum environments and adapt to evolving business needs. Strong focus on process optimization, continuous improvement, and operational efficiency. Experience in implementing best practices for data governance, quality assurance, and compliance. Ability to identify and drive initiatives that enhance business performance through data-driven decision-making. Exposure to business domains such as finance, operations, or marketing analytics with a strategic mindset Jubilant is an equal opportunity employer. . ",
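The profiling and validation responsibilities listed above can be sketched in a few lines of plain Python; the record layout, field names, and checks below are invented for illustration:

```python
def profile(rows, required_fields):
    """Profile a list of records: count nulls per field and find duplicate ids.

    A missing value is None or an empty string; duplicates are detected on
    the hypothetical "id" key field.
    """
    null_counts = {f: 0 for f in required_fields}
    seen, duplicates = set(), []
    for row in rows:
        for f in required_fields:
            if row.get(f) in (None, ""):
                null_counts[f] += 1
        key = row.get("id")
        if key in seen:
            duplicates.append(key)
        seen.add(key)
    return {"null_counts": null_counts, "duplicate_ids": duplicates}


# Hypothetical batch records with one blank field, one null, and one duplicate id.
records = [
    {"id": 1, "batch": "B-01", "qty": 10},
    {"id": 2, "batch": "", "qty": 5},
    {"id": 2, "batch": "B-02", "qty": None},
]
print(profile(records, ["batch", "qty"]))
# {'null_counts': {'batch': 1, 'qty': 1}, 'duplicate_ids': [2]}
```

A real pipeline would run such checks after each load and route the summary to a monitoring dashboard or alert.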

Posted 3 weeks ago


4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Job Description

Position: Senior Software Engineer / Principal Software Engineer - ETL
Experience: 4-7 years (only)

Job Description:
Designing, developing, and deploying data transformation using the SQL portion of the data warehousing solution. Definition and implementation of database development standards and procedures.

Skills / Competencies:
- Ability to develop and debug complex and advanced SQL queries and stored procedures (must have)
- Hands-on experience in Snowflake (must have)
- Hands-on experience in one or more ETL tools like Talend or Informatica (good to have)
- Hands-on experience with any one streaming tool like DMS, Qlik, GoldenGate, IICS, or Openflow
- Hands-on experience using Snowflake and Postgres databases
- Database optimization experience would be an added advantage (good to have)
- Excellent design, coding, testing, and debugging skills
- Experience in Agile methodologies; customer-facing experience will be an added advantage (good to have)
- Automation using Python, Java, or any other tool will be an added advantage (good to have)

Posted 3 weeks ago


5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Location: Bengaluru
Designation: Consultant
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
Enterprise technology has to do much more than keep the wheels turning; it is the engine that drives functional excellence and the enabler of innovation and long-term growth. Learn more about ET&P.

Your work profile
As a Consultant in our Oracle team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well-versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing.
- Implement and manage data storage solutions, including Delta Tables for structured storage and seamless data versioning.
- Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations.
- Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta Table performance tuning.
- Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks.
- Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments.

Required Experience:
- 5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery.
- 5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics.
- Strong expertise in Data Structures and Algorithms (DSA) and problem-solving, enabling efficient design and optimization of data workflows.
- Experience with CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks.
- Experience in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams.
- Experience in data streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks.
- Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks.

Qualifications:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
- Over 5 years of hands-on experience in data engineering or a closely related field.
- Proven expertise in AWS and Databricks platforms.
- Advanced skills in data modeling and designing optimized data structures.
- Knowledge of Azure DevOps and proficiency in Scrum methodologies.
- Exceptional problem-solving abilities paired with a keen eye for detail.
- Strong interpersonal and communication skills for seamless collaboration.
- A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs.

Location and way of working
Base location: Bengaluru. This profile involves occasional travelling to client locations OR this profile does not involve extensive travel for work. Hybrid is our default way of working; each domain has customised the hybrid approach to its unique needs.

Your role as a Consultant
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Analysts across our organization must strive to be:
- Inspiring - Leading with integrity to build inclusion and motivation
- Committed to creating purpose - Creating a sense of vision and purpose
- Agile - Achieving high-quality results through collaboration and team unity
- Skilled at building diverse capability - Developing diverse capabilities for the future
- Persuasive / Influencing - Persuading and influencing stakeholders
- Collaborating - Partnering to build new solutions
- Delivering value - Showing commercial acumen
- Committed to expanding business - Leveraging new business opportunities
- Analytical acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization
- Effective communication - Holding well-structured and well-articulated conversations to achieve win-win possibilities
- Engagement management / Delivery excellence - Effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of each engagement
- Managing change - Responding to a changing environment with resilience
- Managing quality & risk - Delivering high-quality results and mitigating risks with utmost integrity and precision
- Strategic thinking & problem solving - Applying a strategic mindset to solve business issues and complex problems
- Tech savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte
- Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive

How you'll grow
Connect for impact - Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead - You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all - At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career - At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome - entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research: know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Location: Bengaluru Designation: Senior Consultant Entity: Deloitte Touche Tohmatsu India LLP Your potential, unleashed. India's impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realise your potential amongst cutting edge leaders, and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The team Enterprise technology has to do much more than keep the wheels turning; it is the engine that drives functional excellence and the enabler of innovation and long-term growth. Learn more about ET&P Your work profile As a Senior Consultant in our Oracle team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations: We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well-versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs. Key Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing. 
Implement and manage data storage solutions, including Delta Tables for structured storage and seamless data versioning. 5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery. Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations. Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta Table performance tuning. Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks. Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments. 5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics. Should have strong expertise in Data Structures and Algorithms (DSA) and problem-solving, enabling efficient design and optimization of data workflows. Experienced in CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks. Experienced in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams. Experience in Data Streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks. Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks. Qualifications: A Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline. Over 5 years of hands-on experience in data engineering or a closely related field. Proven expertise in AWS and Databricks platforms. 
Advanced skills in data modeling and designing optimized data structures. Knowledge of Azure DevOps and proficiency in Scrum methodologies. Exceptional problem-solving abilities paired with a keen eye for detail. Strong interpersonal and communication skills for seamless collaboration. A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs. Location and way of working Base location: Bengaluru This profile involves occasional travelling to client locations OR this profile does not involve extensive travel for work. Hybrid is our default way of working. Each domain has customised the hybrid approach to their unique needs. Your role as a Senior Consultant We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Analysts across our organization must strive to be: Inspiring - Leading with integrity to build inclusion and motivation Committed to creating purpose - Creating a sense of vision and purpose Agile - Achieving high-quality results through collaboration and team unity Skilled at building diverse capability - Developing diverse capabilities for the future Persuasive / Influencing - Persuading and influencing stakeholders Collaborating - Partnering to build new solutions Delivering value - Showing commercial acumen Committed to expanding business - Leveraging new business opportunities Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization Effective communication - Must be able to hold well-structured and well-articulated conversations to achieve win-win possibilities Engagement Management / Delivery Excellence - Effectively managing engagement(s) to ensure timely and proactive execution as well as course correction for the 
success of engagement(s) Managing change - Responding to a changing environment with resilience Managing Quality & Risk - Delivering high quality results and mitigating risks with utmost integrity and precision Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte Empathetic leadership and inclusivity - creating a safe and thriving environment where everyone's valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive. How you'll grow Connect for impact Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. 
Everyone's welcome - entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
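The Databricks role above centres on ETL pipelines that ingest, transform, and load data, with quality checks along the way. As an illustrative sketch only (plain Python with sqlite3 standing in for Databricks/Delta storage; all table and column names are invented for the example), the extract-transform-load pattern looks like this:

```python
import sqlite3

def run_etl(conn):
    """Minimal extract-transform-load pass: stage raw orders,
    clean them, and load the result into a target table."""
    cur = conn.cursor()
    # Extract: raw landing data, possibly dirty (null amounts, mixed-case regions).
    cur.execute("CREATE TABLE staging_orders (order_id TEXT, region TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO staging_orders VALUES (?, ?, ?)",
        [("o1", "emea", 120.0), ("o2", "APAC", None), ("o3", "Emea", 75.5)],
    )
    # Transform + Load: drop rows with null amounts, normalise region casing.
    cur.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, region TEXT, amount REAL)")
    cur.execute(
        """INSERT INTO orders
           SELECT order_id, UPPER(region), amount
           FROM staging_orders
           WHERE amount IS NOT NULL"""
    )
    conn.commit()
    return cur.execute(
        "SELECT order_id, region, amount FROM orders ORDER BY order_id"
    ).fetchall()

rows = run_etl(sqlite3.connect(":memory:"))
```

In a real Databricks pipeline the same shape appears as Spark DataFrame reads, transformations, and Delta table writes; the sketch only illustrates the staging/validation/load structure.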

Posted 3 weeks ago

Apply


6.0 - 9.0 years

8 - 11 Lacs

Chennai

Work from Office

Req ID: 333121 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java, Spring, Spring Boot, Kafka, Rest API, Microservices, Azure, CD - Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN). Java, Spring, Spring Boot, Kafka, Rest API, Microservices, Azure, CD - Developer Java Full Stack Engineer 3 (6-9 Years) Mandatory Skills Hands on experience in Java, Spring, Spring Boot, Event/Listener messaging frameworks like Kafka Hands on experience in designing and developing robust RESTful APIs and microservices. Hands on experience in HashiCorp Vault, Terraform and Packer Hands on experience in Kubernetes tools and services, including managed Kubernetes platforms, service meshes, monitoring solutions, and security tools In-depth understanding of API Management (Stratum/Apigee) Proven experience in designing, deploying, and maintaining cloud infrastructure across platforms like AWS, Azure, or Google Cloud, preferably Azure Namespace, AKS, ASB, Data Factory, API Management, Storage Account, and Redis. Knowledge of CD processes and tools, testing frameworks and practices (GitHub, Jenkins, uDeploy, Stash) Good to have skills for this Role Knowledge in Control M, DB2 to CICS, Cloud to CICS and MAUI Detailed JD: The Expertise You Have Bachelor's degree in Computer Science, Engineering or equivalent. You have hands-on experience in building the interconnected systems that enable a business to operate, including hardware, software, network and database. Very strong expertise in updating and maintaining legacy systems to leverage modern technologies and architectures. You have the expertise and experience in designing and developing microservices which can handle high transactions-per-second traffic. Strong understanding of data governance principles and best practices. 
You are experienced with a variety of modern programming languages and frameworks. 8+ years of experience working with Java, Spring Boot, Oracle, Kubernetes, Kafka, Azure/AWS cloud technologies. You have a passion for technology and can stay on top of the latest technology trends. Good working knowledge of ITIL processes like Incident management, Change management, etc. You have hands-on experience leading or mentoring scrum teams focused on building software solutions for business critical, architecturally distributed experiences. The teams you have worked with have multi-functional responsibilities such as engineering, quality, devops and release implementation. You care about cycle time and use CI/CD practices and tools to rapidly deploy changes to production while minimizing risk. Have strong communication skills and technical expertise to drive and participate in meaningful discussions with partners across different roles and different skillsets. The Skills that are Key to This Role Hands on experience in Java, Spring, Spring Boot, Event/Listener messaging frameworks Hands on experience in designing and developing robust RESTful APIs Hands on experience in HashiCorp Vault, Terraform and Packer Hands on experience in Kubernetes tools and services, including managed Kubernetes platforms, service meshes, monitoring solutions, and security tools In-depth understanding of API Management (Stratum/Apigee) Proven experience in designing, deploying, and maintaining cloud infrastructure across platforms like AWS, Azure, or Google Cloud, preferably Azure Namespace, AKS, ASB, Data Factory, API Management, Storage Account, and Redis. 
Hands on experience in container-based development (Docker) Hands on experience working with EDA solutions such as Kafka/MQ Hands on experience working with database and data concepts, tools and technologies (Oracle, PL/SQL, Informatica) Familiarity working with the OAuth 2.0 framework and scopes Experience in implementing microservices architecture & building/deploying highly automated, scalable and maintainable infrastructure. Experience in designing and developing apps with high throughput and low latency utilizing load balancing, caching, threading, etc. Knowledge of CD processes and tools, testing frameworks and practices (GitHub, Jenkins, uDeploy, Stash) Strategic thinking and critical problem-solving skills General Expectation 1) Must have good communication 2) Must be ready to work in a 10:30 AM to 8:30 PM shift 3) Flexible to work at the client location (Ramanujam IT Park, Taramani, Chennai) 4) Must be ready to work from office in a Hybrid work environment. Full remote work is not an option 5) Expect full return to office in 2025 #LI-INPAS
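The posting above centres on event/listener messaging with Kafka. A recurring design concern in such systems is at-least-once delivery: a listener can receive the same event twice after retries or consumer rebalances, so handlers are commonly made idempotent by tracking processed event IDs. A minimal sketch in Python (an in-memory stand-in, not actual Kafka or Spring code; names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class IdempotentConsumer:
    """Handles events delivered at-least-once (as a Kafka listener would)
    while applying each event's effect exactly once, keyed by event id."""
    processed_ids: set = field(default_factory=set)
    balances: dict = field(default_factory=dict)

    def handle(self, event: dict) -> bool:
        # Duplicate deliveries (retries, rebalances) are detected and skipped.
        if event["id"] in self.processed_ids:
            return False
        self.processed_ids.add(event["id"])
        acct = event["account"]
        self.balances[acct] = self.balances.get(acct, 0) + event["amount"]
        return True

consumer = IdempotentConsumer()
events = [
    {"id": "e1", "account": "A", "amount": 100},
    {"id": "e2", "account": "A", "amount": -30},
    {"id": "e1", "account": "A", "amount": 100},  # redelivered duplicate
]
applied = [consumer.handle(e) for e in events]
```

In production the processed-ID set would live in a durable store (or be replaced by transactional offsets), but the dedup-before-apply shape is the same.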

Posted 3 weeks ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Chennai

Work from Office

The Senior Technology Consultant team will be responsible for understanding Kinaxis customers' most pressing business performance challenges and will be committed to helping our customers solve complex issues in their supply chain management practice. The incumbent will work with new and existing customers and provide expert guidance in integrating the Kinaxis Maestro solution with existing client enterprise systems so that our customers can start to experience immediate value from the product. What you will do Perform integration configuration mapping, loading, transforming and validating data required to support our customer's unique system landscape on moderate to complex projects. Design customized technology solutions to address specific business challenges or opportunities, considering the customer's technological ecosystem and based on the integration approach (Kinaxis-led vs. customer-led). Assist with the implementation and deployment of technology solutions, including project management, system integration, configuration, testing, and training. Demonstrate knowledge and deep proficiency in the Kinaxis Integration Platform Suite, Maestro data model, and REST-based API integration capabilities, and support the client in identifying and implementing solutions best suited to individual data flows. Collaborate with Kinaxis Support and/or Cloud Services teams to address client queries around security risks or security incidents. Participate in deep-dive customer business requirements discovery sessions and develop integration requirements specifications. Drive data management and integration related activities including validation and testing of the solutions. Support deployment workshops to help customers achieve immediate value from their investment. 
Act as the point person for Kinaxis-led integrations and coach and guide more junior and/or offshore consultants through the tactical deliverables for data integration requirements, ensuring a smooth delivery of the end solution. Liaise directly with customers and internal SMEs such as the Technology Architect through the project lifecycle. Skills and Qualifications we need Strong integration knowledge, especially in extracting and transforming data from enterprise class ERP systems like SAP, Oracle, etc. Experience with ERP solutions such as SAP, Oracle, Infor, MS Dynamics etc. Hands on experience and expertise with ETL tools such as Talend, Informatica, SAP CPI / SAP BTP, OIC, MuleSoft, Apache Hop etc. Technical skills such as SQL, Java, JavaScript, Python, etc. Strong understanding of data modelling. Knowledge of Cloud Service Providers like GCP, Azure, AWS and their offerings is an advantage. Experience with configuration of data integration from / to SAP through BAPI / RFC, ABAP Programs, CDS Views, or OData is an advantage. What we are looking for Bachelor's degree in Computer Science, Information Technology, AI/ML or a related field. 8-12 years of relevant experience in business software consulting, ideally in supply chain. Minimum 6 years of experience in data integration across complex enterprise systems. Passion for working in customer-facing roles and able to demonstrate strong interpersonal, communication, and presentation skills. Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment. Excellent communication, presentation, facilitation, time management, and customer relationship skills. Excellent problem solving and critical thinking skills. Ability to work virtually and plan for up to 50% travel.
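Integration configuration of the kind described above (mapping, transforming and validating ERP data) typically reduces to a field-mapping step with validation before load. A toy sketch only: the SAP-style source field names (MATNR, WERKS, LABST) and the target schema are illustrative assumptions, not a Kinaxis or Maestro data model:

```python
# Hypothetical mapping from SAP-like source fields to a planning-side schema.
FIELD_MAP = {"MATNR": "item_code", "WERKS": "site", "LABST": "on_hand_qty"}

def transform_record(src: dict) -> dict:
    """Map source fields to target names, validate completeness,
    and coerce quantities to numeric form."""
    out = {dst: src[s] for s, dst in FIELD_MAP.items() if s in src}
    missing = [dst for dst in FIELD_MAP.values() if dst not in out]
    if missing:
        # Reject the record rather than load incomplete data downstream.
        raise ValueError(f"missing required fields: {missing}")
    out["on_hand_qty"] = float(out["on_hand_qty"])
    return out

record = transform_record({"MATNR": "M-100", "WERKS": "P01", "LABST": "42"})
```

Real ETL tools (Talend, Informatica, SAP CPI) express this same map/validate/coerce flow through configuration rather than code.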

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 7+ years of experience having expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. 
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen, analytical and strong problem-solving attitude with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills both written and oral, ability to make impactful presentations & expertise at using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy. Good to have knowledge of data security and controls to address customer’s data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
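The BI developer role above leans on dimensional data modelling: facts joined to descriptive dimensions, which is what dashboard queries aggregate over. A minimal star-schema sketch using sqlite3 (table and column names are invented for illustration, not from any EY or Oracle schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Dimension table: one row per product, carrying descriptive attributes.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Hardware"), (2, "Software")])
# Fact table: one row per sale, with a foreign key into the dimension.
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, revenue REAL)")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 500.0), (2, 300.0), (1, 200.0)])
# Typical dashboard query: total revenue sliced by a dimension attribute.
rows = cur.execute(
    """SELECT d.category, SUM(f.revenue)
       FROM fact_sales f JOIN dim_product d USING (product_id)
       GROUP BY d.category ORDER BY d.category"""
).fetchall()
```

OAC and PowerBI dashboards generate essentially this join-and-aggregate pattern against the modelled star schema; good dimension design is what keeps such queries fast and the slicing attributes meaningful.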

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual accountable for their career development and growth at EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration. Good written and oral communication skills, the ability to make impactful presentations, and expertise in Excel and PowerPoint. Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
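The star-schema data modelling this role centres on can be shown with a toy fact/dimension rollup. This is an illustrative sketch only — all table names, columns, and values are invented, using Python's built-in SQLite:

```python
import sqlite3

# Toy star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
INSERT INTO dim_date VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 200.0);
""")

# A typical BI rollup: revenue by category and year.
rows = conn.execute("""
    SELECT p.category, d.year, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id    = f.date_id
    GROUP BY p.category, d.year
    ORDER BY p.category, d.year
""").fetchall()
print(rows)  # [('Hardware', 2023, 100.0), ('Hardware', 2024, 150.0), ('Software', 2024, 200.0)]
```

The same shape scales up in OAC or PowerBI: dimensions carry the descriptive attributes, the fact table carries the measures, and dashboards aggregate across the joins.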

Posted 3 weeks ago

Apply

5.0 - 8.0 years

5 - 7 Lacs

Cochin

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are looking for a seasoned and strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure. Your key responsibilities Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3. Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools. Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies. Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence. Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases. Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures. Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation. Own the creation and governance of SOPs, runbooks, and technical documentation for data operations. Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture. 
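The job-orchestration responsibilities above (Airflow, AWS Data Pipeline) ultimately come down to running tasks in dependency order. A minimal, tool-agnostic sketch using only Python's standard library — the task names are invented for illustration:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Invented pipeline: transform depends on extract; load waits on both
# transform and validate.
deps = {
    "transform": {"extract"},
    "validate":  {"transform"},
    "load":      {"transform", "validate"},
}

def run(task: str) -> str:
    # A real orchestrator would trigger Glue jobs, Databricks notebooks, etc.
    return f"ran {task}"

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
results = [run(t) for t in order]
print(order)  # a dependency-respecting order, e.g. ['extract', 'transform', 'validate', 'load']
```

Airflow's DAGs add scheduling, retries, and SLA monitoring on top, but the dependency resolution is the same idea.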
Skills and attributes for success Expertise in AWS data services and ability to lead architectural discussions. Analytical thinker with the ability to design and optimize end-to-end data workflows. Excellent debugging and incident resolution skills in large-scale data environments. Strong leadership and mentoring capabilities, with clear communication across business and technical teams. A growth mindset with a passion for building reliable, scalable data systems. Proven ability to manage priorities and navigate ambiguity in a fast-paced environment. To qualify for the role, you must have 5–8 years of experience in DataOps, Data Engineering, or related roles. Strong hands-on expertise in Databricks. Deep understanding of ETL pipelines and modern data integration patterns. Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments. Experience in Airflow or AWS Data Pipeline for orchestration and scheduling. Advanced knowledge of IICS or similar ETL tools for data transformation and automation. SQL skills with emphasis on performance tuning, complex joins, and window functions. Technologies and Tools Must-haves: Proficient in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda. Expert in Databricks – ability to develop, optimize, and troubleshoot advanced notebooks. Strong experience with Amazon Redshift for scalable data warehousing and analytics. Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline. Hands-on with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms. Good to have: Exposure to Power BI or Tableau for data visualization. Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms. Understanding of DevOps and CI/CD automation tools for data engineering workflows. SQL familiarity across large datasets and distributed databases. What we look for Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Thiruvananthapuram

On-site

5 - 7 Years, 2 Openings, Trivandrum. Role description Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.
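The ingest/wrangle/transform/join pipeline work described above can be sketched in miniature. This is illustrative only — the sources, columns, and values are invented and simulated in memory rather than read from real systems:

```python
import csv
import io

# Two invented "source extracts", simulated in-memory; a real pipeline
# would pull these from S3, a database, or an API.
orders_csv    = "order_id,customer_id,amount\n1,10,99.5\n2,11,20.0\n3,10,5.5\n"
customers_csv = "customer_id,region\n10,EMEA\n11,APAC\n"

# Ingest both sources.
orders    = list(csv.DictReader(io.StringIO(orders_csv)))
customers = {r["customer_id"]: r["region"]
             for r in csv.DictReader(io.StringIO(customers_csv))}

# Wrangle + join + aggregate: cast types, attach region, total per region.
totals: dict[str, float] = {}
for row in orders:
    region = customers[row["customer_id"]]                            # join
    totals[region] = totals.get(region, 0.0) + float(row["amount"])   # transform + aggregate

print(totals)  # {'EMEA': 105.0, 'APAC': 20.0}
```

In PySpark or SQL the same steps become a `join` followed by a `groupBy`/`GROUP BY`; the logic is identical, only the engine scales.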
Measures of Outcomes: Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches. Outputs Expected: Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results. Configuration: Define and govern the configuration management plan. Ensure compliance within the team. Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management: Manage the delivery of modules effectively. Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management: Execute and monitor the release process to ensure smooth transitions. Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications: Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLS. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments: Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability. ________________________________________ Key Responsibilities: • Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments. • Develop and optimize data workflows using PySpark, SQL, and Airflow. • Work closely with AI/ML teams to support training pipelines and GenAI solution deployments. • Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines. • Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions. • Support data quality, validation, and profiling processes.
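The SQL windowing functions called out above can be illustrated with a running total per partition. A toy example using Python's built-in SQLite (which supports window functions from SQLite 3.25); the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (rep TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('ann', '2024-01', 10), ('ann', '2024-02', 30),
  ('bob', '2024-01', 20), ('bob', '2024-02', 5);
""")

# Running total per rep: SUM() OVER a partition ordered by month.
rows = conn.execute("""
    SELECT rep, month, amount,
           SUM(amount) OVER (PARTITION BY rep ORDER BY month) AS running_total
    FROM sales
    ORDER BY rep, month
""").fetchall()
for r in rows:
    print(r)
# ('ann', '2024-01', 10.0, 10.0) ... ('bob', '2024-02', 5.0, 25.0)
```

Unlike a `GROUP BY`, the window keeps every input row while adding the aggregate alongside it — the usual building block for running totals, rankings, and period-over-period deltas.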
________________________________________ Key Skills & Technology Areas: • Programming & Data Processing: Python (4–6 years), PySpark, Pandas, NumPy • Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks • Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional) • Databases: SQL/NoSQL, Postgres, DynamoDB, Vector databases (ChromaDB, Pinecone) – preferred • ML/GenAI Exposure (basic): Hands-on with Pandas, scikit-learn, knowledge of RAG pipelines and GenAI concepts • Data Modeling: Star/Snowflake schema, data normalization, dimensional modeling • Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment ________________________________________ Other Requirements: • Strong problem-solving and analytical skills • Flexible to work on fast-paced and cross-functional priorities • Experience collaborating with AI/ML or GenAI teams is a plus • Good communication and a collaborative, team-first mindset • Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus. Skills: ETL, Big Data, PySpark, SQL About UST UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
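The RAG-style vector retrieval mentioned in the key skills can be shown in miniature: a hand-rolled cosine-similarity lookup standing in for a real vector store such as ChromaDB or Pinecone. The documents and embeddings are invented toy values, not output of a real embedding model:

```python
import math

# Toy 3-dimensional "embeddings"; a real pipeline would generate these
# with an embedding model and store them in a vector database.
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api reference":  [0.0, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query embedding, return top-k ids.
    return sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)[:k]

print(retrieve([0.85, 0.15, 0.05]))  # ['refund policy']
```

In a RAG pipeline the retrieved documents are then stuffed into the LLM prompt as grounding context; the data-engineering work is keeping those embeddings fresh and queryable.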

Posted 3 weeks ago

Apply

3.0 - 7.0 years

2 - 5 Lacs

Cochin

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica MDM Specialist - Senior Job Summary: We are looking for a skilled Informatica MDM Specialist. The candidate will have hands-on experience implementing and maintaining Master Data Management solutions using Informatica MDM (Customer 360, Supplier 360, Product 360 and MDM Hub). This role involves architecting and developing MDM solutions, managing data quality, and ensuring data governance practices across the enterprise. Key Responsibilities: Design, develop, and implement end-to-end MDM solutions using the Informatica MDM platform. Configure Data Models, Match & Merge rules, Hierarchies, Trust Framework, and workflows. Collaborate with business stakeholders, data architects, and developers to gather and analyse requirements. Perform data profiling, cleansing, standardization, and validation for master data domains. Implement data governance and stewardship workflows for maintaining data quality. Monitor MDM performance, manage error handling and system tuning. Prepare and maintain technical documentation, deployment guides, and support materials. Provide technical support and troubleshooting during and post-deployment. Stay up to date with Informatica MDM product updates, industry trends, and best practices. Required Qualifications: 3-7 years of experience in Informatica MDM development and implementation. Strong understanding of MDM architecture, data modelling, and metadata management. Hands-on experience with Informatica MDM Hub, e360, IDD, SIF, MDM Provisioning Tool, and ETL/ELT. Experience with data quality tools (Informatica DQ or others) and MDM integration patterns.
Understanding of data governance principles and master data domains (customer, product, vendor, etc.). Strong analytical and problem-solving skills. Excellent communication and stakeholder engagement skills. Preferred Qualifications: Informatica MDM certification(s). Experience with IDMC MDM – MDM SaaS Familiarity with data governance platforms (e.g., Collibra, Informatica Axon). Exposure to Agile/Scrum delivery methodologies. Experience in large-scale MDM implementations in domains like Retail, Manufacturing, Healthcare, or BFSI. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
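The match & merge idea at the heart of MDM can be sketched in a few lines. Note this is a toy stand-in, not Informatica MDM's configuration-driven match engine or trust framework — the records, match rule, and survivorship rule here are all invented for illustration:

```python
# Invented source records for the same real-world parties.
records = [
    {"source": "CRM", "email": "A@X.COM", "name": "Ann Lee", "phone": ""},
    {"source": "ERP", "email": "a@x.com", "name": "",        "phone": "555-0100"},
    {"source": "CRM", "email": "b@y.com", "name": "Bob Roy", "phone": "555-0101"},
]

def match_key(rec):
    # Match rule: a normalized email identifies the same party.
    return rec["email"].strip().lower()

golden: dict[str, dict] = {}
for rec in records:
    merged = golden.setdefault(match_key(rec), {})
    for field, value in rec.items():
        # Survivorship: first non-empty value wins (a crude stand-in for
        # trust-based survivorship rules).
        if value and not merged.get(field):
            merged[field] = value

print(golden["a@x.com"]["name"], golden["a@x.com"]["phone"])  # Ann Lee 555-0100
```

A real MDM hub adds fuzzy matching, trust scores per source and field, hierarchies, and steward review queues on top of this basic match-then-merge loop.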

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Telangana

On-site

Bachelor’s degree in computer science or a similar field, or equivalent work experience. 4+ years of hands-on software/application development experience with Informatica PowerCenter and PowerBI (visualization and modeling). Extensive working knowledge of PowerBI with large data sets and modeling. Extensive knowledge of DAX coding. Experience in performance analysis and tuning, and knowledge of troubleshooting tools like Tabular Editor and DAX Studio. Experience with incremental and hybrid data refresh methods. Knowledge of the PowerBI service and capacity management. In-depth knowledge of data warehousing, with experience in star schema concepts like fact/dimension tables. Strong PowerBI modeling and data visualization experience in delivering projects. Strong in SQL and PL/SQL. Strong data warehousing and database fundamentals in MS SQL Server. Strong in performance testing and troubleshooting of application issues using Informatica logs.
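PowerBI's incremental refresh is configured in the service rather than hand-coded, but the underlying high-water-mark idea it shares with most incremental ETL is easy to sketch. The rows and dates below are invented for illustration:

```python
from datetime import date

# Invented source rows; an incremental refresh queries only rows newer
# than the stored watermark instead of reloading everything.
source = [
    {"id": 1, "modified": date(2024, 1, 5)},
    {"id": 2, "modified": date(2024, 2, 1)},
    {"id": 3, "modified": date(2024, 3, 9)},
]

watermark = date(2024, 1, 31)  # high-water mark saved by the previous load

delta = [r for r in source if r["modified"] > watermark]  # incremental slice
watermark = max(r["modified"] for r in delta)             # advance the mark

print([r["id"] for r in delta], watermark)  # [2, 3] 2024-03-09
```

Hybrid refresh combines this with a real-time partition: historical partitions load incrementally, while the most recent data is queried live.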

Posted 3 weeks ago

Apply

4.0 years

12 Lacs

Hyderābād

On-site

Overall experience of 4 to 8 years in Information Technology. 3+ years’ experience working on Hyperion Essbase and Planning Cloud, along with data loads and metadata load automation. Should have experience with Calc Scripts, Report Scripts, Rules files, and application optimization. 3+ years’ experience developing data warehousing applications using ODI or enterprise ETL tools like DataStage or Informatica, extracting, ingesting, and processing large data sets by building data pipelines. 2+ years’ experience with the migration process, Lifecycle Management, and Hyperion Financial Reporting. 3+ years’ working experience of data warehousing, data modelling, governance, and data architecture in designing database objects. Good to have knowledge of General Ledger and forecast & budgeting cycles. Good to have exposure to other similar tools like Anaplan. Exposure to and experience with cloud-based data management practices, especially OCI/Azure. Experience in building modern dimensional models to improve accessibility, efficiency, and quality of data. Experience with Hyperion security management (HSS) and migration. Exposure to automation efforts using Configuration Management and Continuous Integration (CI) / Continuous Delivery (CD) tools such as Jenkins, Codefresh, etc. Experience working in Agile and Scrum development processes. Job Type: Full-time Pay: ₹1,200,000.00 per year Schedule: Day shift Work Location: In person

Posted 3 weeks ago

Apply

4.0 years

2 - 6 Lacs

Gurgaon

On-site

At Moody's, we unite the brightest minds to turn today’s risks into tomorrow’s opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are—with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity. Job Title: Software Engineer – Salesforce Location: Gurgaon, Haryana, India Department: Customer, Operations & Risk (COR), Moody’s Analytics Reporting Manager: Manav Vatsyayana Employment Type: Full-Time About the Role We are looking to bring on board a skilled and motivated Software Engineer to join our Production Support team within Moody’s Analytics. This role is critical to ensuring the stability, performance, and continuous improvement of our Salesforce platform and its integrations with enterprise systems. You will be part of a dynamic team that supports global users and collaborates closely with cross-functional stakeholders, vendors, and agile teams. If you are passionate about Salesforce technologies, thrive in high-availability environments, and enjoy solving complex problems, I’d love to hear from you. Key Responsibilities Provide daily production support for Salesforce applications, ensuring timely resolution of incidents and service requests. Lead and manage ticket inflow, task assignments, and daily reporting. Collaborate with L1 business leads to prioritize tasks and ensure alignment with business needs. Drive root cause analysis and resolution of integrated data issues across platforms. Oversee release management and operational support activities. 
Design and implement automation for build, release, and deployment processes. Support deployment of new features and configuration changes using DevOps tools. Communicate incident and request statuses to stakeholders, including senior leadership. Participate in project transitions, UAT, and knowledge transfer activities. Act as Duty Manager on a rotational basis, including weekends, for major incident management (if required). Participate in team meetings, document procedures, and ensure service level targets are met. Required Skills and Competencies Salesforce certifications: Administrator, Platform App Builder, and Platform Developer I. Apttus CPQ and FinancialForce certifications are a plus. Strong understanding of ITIL disciplines: Event, Incident, Request, Problem, Release, and Knowledge Management. Experience with data quality tools and techniques (e.g., SQL/SOQL for profiling, validation, cleansing). Proficiency in DevOps tools: GitHub and Jira, or other similar tools like Bitbucket, AutoRabit, SVN, Aldon, TFS, Jenkins, UrbanCode, Nolio, and Puppet. Experience supporting Salesforce applications and ERP/data integration tools (e.g., SAP, MuleSoft, Informatica, IBM SPM). Strong analytical and problem-solving skills with attention to detail. Ability to manage competing priorities in a fast-paced, Agile environment. Excellent communication and interpersonal skills. Proficiency in reporting and analysis tools (e.g., Excel, PowerPoint). Familiarity with workload automation and monitoring tools such as BMC Remedy, Control-M, Tivoli, Nagios, and Splunk is advantageous. Education and Experience Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field. Minimum 4 years of experience in software development, DevOps, and production support, preferably within the financial services sector. About the Team You’ll be joining the Production Support Team under the Business Systems group in the Customer, Operations & Risk business unit.
Our team supports Moody’s Analytics employees globally who rely on the Salesforce CRM platform. This is an exciting opportunity to work on cutting-edge Salesforce technologies and contribute to a high-impact support function. Moody’s is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law. Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody’s Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary.
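The SQL/SOQL-style data profiling this role mentions (null counts, distinct counts per column) can be sketched generically. The rows and columns below are invented for illustration:

```python
# Invented CRM-style rows; real profiling would run against Salesforce
# objects via SOQL or against a warehouse table via SQL.
rows = [
    {"account": "Acme",  "tier": "gold", "owner": None},
    {"account": "Beta",  "tier": "gold", "owner": "jd"},
    {"account": "Gamma", "tier": None,   "owner": "jd"},
]

# Per-column profile: how many nulls, how many distinct non-null values.
profile = {}
for col in rows[0]:
    values = [r[col] for r in rows]
    profile[col] = {
        "nulls": sum(v is None for v in values),
        "distinct": len({v for v in values if v is not None}),
    }

print(profile["tier"])  # {'nulls': 1, 'distinct': 1}
```

The same counts come from `COUNT(*) - COUNT(col)` and `COUNT(DISTINCT col)` in SQL; the value of profiling is spotting the columns whose null rate or cardinality looks wrong before cleansing.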

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

India

On-site

About Kinaxis: Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it’s really our people who give us the passion to always seek ways to do things better. As such, we’re serious about your career growth and professional development, because People matter at Kinaxis. In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown to become a global organization with over 2000 employees around the world, and support 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers. Our journey in India began in 2020 and we have been growing steadily since then! Building a high-trust and high-performance culture is important to us and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration. About the Team: Location: Chennai, India. The Senior Technology Consultant team will be responsible for understanding Kinaxis customers’ most pressing business performance challenges and will be committed to helping our customers solve complex issues in their supply chain management practice. The incumbent will work with new and existing customers and provide expert guidance in integrating Kinaxis’ Maestro solution with existing client enterprise systems so that our customers can start to experience immediate value from the product. About the role: What you will do Perform integration configuration – mapping, loading, transforming and validating data required to support our customer’s unique system landscape on moderate to complex projects.
Design customized technology solutions to address specific business challenges or opportunities, considering the customer’s technological ecosystem and based on the integration approach (Kinaxis-led vs. customer-led). Assist with the implementation and deployment of technology solutions, including project management, system integration, configuration, testing, and training. Demonstrate knowledge and deep proficiency in both the Kinaxis Integration Platform Suite, Maestro data model, REST based API Integration capabilities, and support the client in identifying and implementing solutions best suited to individual data flows. Collaborate with Kinaxis Support and/or Cloud Services teams to address client queries around security risks or security incidents. Participate in deep-dive customer business requirements discovery sessions and develop integration requirements specifications. Drive data management and integration related activities including validation and testing of the solutions. Support deployment workshops to help customers achieve immediate value from their investment. Act as the point person for Kinaxis-led integrations and coach and guide more junior and/or offshore consultants through the tactical deliverables for data integration requirements, ensuring a smooth delivery of the end solution. Liaise directly with customers and internal SMEs such as the Technology Architect through the project lifecycle. Skills and Qualifications we need Strong integration knowledge especially in extracting and transforming data from enterprise class ERP systems like SAP, Oracle, etc. Experience with ERP solutions such as SAP, Oracle, Infor, MS Dynamics etc. Hands on experience and expertise with ETL tools such as Talend, Informatica, SAP CPI / SAP BTP, OIC, MuleSoft, Apache Hop etc. Technical skills such as SQL, JAVA, JavaScript, Python, etc. Strong understanding of data modelling. 
Knowledge of cloud service providers such as GCP, Azure, and AWS and their offerings is an advantage.
Experience configuring data integration from/to SAP through BAPI / RFC, ABAP programs, CDS Views, or OData is an advantage.

What we are looking for
Bachelor's degree in Computer Science, Information Technology, AI/ML, or a related field.
8-12 years of relevant experience in business software consulting, ideally in supply chain.
Minimum 6 years of experience in data integration across complex enterprise systems.
Passion for working in customer-facing roles, with demonstrated strong interpersonal, communication, and presentation skills.
Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment.
Excellent communication, presentation, facilitation, time management, and customer relationship skills.
Excellent problem-solving and critical-thinking skills.
Ability to work virtually and plan for up to 50% travel.

Why join Kinaxis?
Work With Impact: Our platform directly helps companies power the world's supply chains. We see the results of what we do out in the world every day – when we see store shelves stocked, when medications are available for our loved ones, and so much more.
Work with Fortune 500 Brands: Companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed Martin, and more.
Social Responsibility at Kinaxis: Our Diversity, Equity, and Inclusion Committee weighs in on hiring practices, talent assessment training materials, and mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do, and we are committed to a net-zero operations strategy for the long term.
We are involved in our communities and support causes where we can make the most impact.
People matter at Kinaxis, and these are some of the perks and benefits we created for our team:
Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
Flexible work options
Physical and mental well-being programs
Regularly scheduled virtual fitness classes
Mentorship programs and training and career development
Recognition programs and referral rewards
Hackathons
For more information, visit the Kinaxis web site at www.kinaxis.com or the company's blog at http://blog.kinaxis.com.
Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses.
Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
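The mapping, transformation, and validation work this role describes can be sketched in miniature. This is an illustrative example only, not Kinaxis code: the SAP-style field names ("MATNR", "WERKS", "LABST"), the field mapping, and the validation rules are hypothetical.

```python
# Illustrative sketch (not Kinaxis code): map ERP-style extract fields to
# a target data model and validate rows before loading. Field names and
# mapping are hypothetical.

FIELD_MAP = {"MATNR": "item_id", "WERKS": "site_id", "LABST": "on_hand_qty"}

def transform(record):
    """Rename ERP fields to target-model fields and coerce the quantity."""
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    out["on_hand_qty"] = float(out["on_hand_qty"])
    return out

def validate(record):
    """Reject rows missing keys or carrying negative on-hand quantities."""
    return bool(record.get("item_id")) and bool(record.get("site_id")) \
        and record["on_hand_qty"] >= 0

def run_pipeline(rows):
    """Split transformed rows into loadable and rejected sets."""
    transformed = [transform(r) for r in rows]
    good = [r for r in transformed if validate(r)]
    bad = [r for r in transformed if not validate(r)]
    return good, bad

rows = [
    {"MATNR": "FG-100", "WERKS": "1000", "LABST": "250"},
    {"MATNR": "", "WERKS": "1000", "LABST": "10"},  # fails validation
]
good, bad = run_pipeline(rows)
print(len(good), len(bad))  # one loadable row, one rejected row
```

In a real engagement this shape of logic would live inside an ETL tool such as Talend or Informatica rather than hand-rolled Python, but the mapping/validation split is the same.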

Posted 3 weeks ago


3.0 years

3 - 4 Lacs

Noida

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Ignition Application Administrator
Position: We are seeking a highly motivated Ignition Application Administrator to join the Enterprise Services – Data team. Working very closely with peer platform administrators, developers, Product/Project Seniors and customers, you will play an active role in administering the existing analytics platforms. You will join a team of platform administrators who are each specialized in one tool but cross-trained on the others. While you will focus on Ignition, administration knowledge of these other platforms is beneficial: Qlik Sense, Tableau, Power BI, SAP BusinessObjects, Matillion, Snowflake, Informatica (EDC, IDQ, Axon), Alteryx, HVR, or Databricks.
This role requires a willingness to dive into complex problems to help the team find elegant solutions. How you communicate and approach problems is important to us. We are looking for team players who are willing to bring people across the disciplines together.
This position provides the unique opportunity to operate in a start-up-like environment within a Fortune 50 company. Our digital focus is geared towards releasing the insights inherent to our best-in-class products and services. Together we aim to achieve new levels of productivity by changing the way we work and identifying new sources of growth for our customers.
Responsibilities include, but are not limited to, the following:
Install and configure Ignition.
Monitor the Ignition platform, including integration with observability and alerting solutions, and recommend platform improvements.
Troubleshoot and resolve Ignition platform issues.
Configure data source connections and manage asset libraries.
Identify and raise system capacity related issues (storage, licenses, performance thresholds).
Define best practices for Ignition deployment.
Integrate Ignition with other ES Data platforms and Business Unit installations of Ignition.
Participate in overall data platform architecture and strategy.
Research and recommend alternative actions for problem resolution based on best practices and application functionality, with minimal direction.
Knowledge and Skills:
3+ years working in customer success or in a customer-facing engineering capacity is required.
Large-scale implementation experience with complex solutions environments.
Experience in customer-facing positions, preferably industry experience in technology-based solutions.
Experience navigating, escalating, and leading efforts on complex customer/partner requests or projects.
Experience with the Linux command line.
An aptitude both for analysing technical concepts and translating them into business terms, and for mapping business requirements onto technical features.
Knowledge of the software development process and of software design methodologies is helpful.
3+ years' experience in a cloud ops / Kubernetes application deployment and management role, working with an enterprise software or data product.
Experience with Attribute-based Access Control (ABAC), Virtual Directory Services (VDS), PingFederate, or Azure Active Directory (AAD) is helpful.
Cloud platform architecture, administration, and programming experience desired.
Experience with Helm, Argo CD, Docker, and cloud networking.
Excellent communication skills: interpersonal, written, and verbal.
Education and Work Experience:
This position requires a minimum of a BA/BS degree (or equivalent) in technology, computing, or another related field of study. Experience in lieu of education may be considered if the individual has three (3) or more years of relevant experience.
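As an illustration of the monitoring and capacity duties above, here is a minimal sketch in Python. It does not use Ignition's actual API: the status payload shape, metric names, and thresholds are all assumptions for the example.

```python
# Minimal monitoring sketch, not Ignition's real API: evaluate a
# hypothetical gateway status payload against capacity thresholds and
# return alert strings, as in the "identify and raise system capacity
# related issues" responsibility. Metric names and limits are assumed.

THRESHOLDS = {"disk_used_pct": 85.0, "memory_used_pct": 90.0}

def capacity_alerts(status):
    """Return a list of alert strings for any metric over its threshold."""
    alerts = []
    for metric, limit in THRESHOLDS.items():
        value = status.get(metric)
        if value is not None and value > limit:
            alerts.append(f"{metric}={value} exceeds {limit}")
    # Also flag any non-running gateway state.
    if status.get("state") != "RUNNING":
        alerts.append(f"gateway state is {status.get('state')!r}")
    return alerts

status = {"state": "RUNNING", "disk_used_pct": 91.2, "memory_used_pct": 40.0}
print(capacity_alerts(status))  # one disk-usage alert
```

In practice a check like this would be wired into the observability and alerting stack the posting mentions, rather than run as a standalone script.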
Hours: Normal work schedule hours may vary, Monday through Friday. May be required to work flexible hours and/or weekends, as needed, to meet deadlines or to fulfil application administration obligations.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago


7.0 years

4 - 6 Lacs

Noida

On-site

Job Description – Business Intelligence Developer (OAC, Power BI, ETL, Data Modelling)
Competency: Oracle ERP Analytics
We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), Power BI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.
Responsibilities:
Collaborate with stakeholders to understand data requirements and translate business needs into data models.
Design and implement effective data models to support business intelligence activities.
Develop and maintain ETL processes to ensure data accuracy and availability.
Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and Power BI.
Work with stakeholders to gather requirements and translate business needs into technical specifications.
Optimize data retrieval and develop dashboard visualizations for performance efficiency.
Ensure data integrity and compliance with data governance and security policies.
Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
Provide training and support to end-users on BI tools and dashboards.
Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
Stay up to date with the latest BI technologies and best practices to drive continuous improvement.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field.
Proven experience with Oracle Analytics Cloud (OAC), Power BI, and other BI tools.
Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
Proficiency in data modelling techniques and best practices.
Solid understanding of SQL and experience with relational databases.
Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
Excellent analytical, problem-solving, and project management skills.
Ability to communicate complex data concepts to non-technical stakeholders.
Detail-oriented with a strong focus on quality and accuracy.
Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints.
Strong consulting skills with proven experience in client and stakeholder management and collaboration.
Good communication skills, both written and oral, with the ability to make impactful presentations and expertise in using Excel and PowerPoint.
Good to have: knowledge of data security and controls to address customers' data privacy needs in line with regional regulations such as GDPR, CCPA, etc.
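The data modelling and SQL skills this posting asks for can be illustrated with a tiny star-schema example. This is a generic sketch using SQLite from Python's standard library; the table names, columns, and data are hypothetical, not from any EY or Oracle system.

```python
# Generic star-schema sketch: one dimension table, one fact table, and a
# join-plus-aggregate query of the kind a BI developer writes daily.
# All names and figures are invented for illustration.

import sqlite3

def build_star_schema(conn):
    """Create and populate a minimal dimension and fact table."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
    cur.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")
    cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                    [(1, "Widget"), (2, "Gadget")])
    cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                    [(1, 10.0), (1, 15.0), (2, 7.5)])
    conn.commit()

def sales_by_product(conn):
    """Aggregate fact rows by dimension attribute."""
    cur = conn.execute(
        """SELECT p.name, SUM(f.amount)
           FROM fact_sales f JOIN dim_product p USING (product_id)
           GROUP BY p.name ORDER BY p.name""")
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
build_star_schema(conn)
print(sales_by_product(conn))  # [('Gadget', 7.5), ('Widget', 25.0)]
```

The same dimension/fact split scales up to the OAC and Power BI semantic models the role describes; only the engine and tooling change.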

Posted 3 weeks ago
