0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description:
We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description:
This data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines; constructing data storage (NoSQL, SQL); working with big data tools (Hadoop, Kafka); and using integration tools to connect sources and other databases.

Role Responsibilities:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related code
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure development tools
- Communicate project status to all project stakeholders
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team

Role Requirements:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell/PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering, documentation processes, and unit testing
- Understanding and implementation of QA and various testing processes in the project
- Knowledge of any BI tool is an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirements:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
- Create and execute efficient, scalable, and dependable data pipelines using Azure Data Factory
- Use Azure Databricks for data transformation and processing
- Effectively oversee and enhance data storage solutions, with an emphasis on Azure Data Lake and other Azure storage services
- Construct and maintain workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
- Proficient in programming languages such as Python and SQL, and conversant with pertinent scripting languages
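The requirements above mention slowly changing dimensions. Purely as an illustrative sketch (not part of the posting, with hypothetical field names), a Type 2 SCD merge can be expressed in plain Python: changed attributes expire the current row and append a new current version.

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Apply a Type 2 slowly-changing-dimension merge:
    expire changed current rows and append new versions."""
    result = []
    incoming_by_key = {row["key"]: row for row in incoming}
    for row in dimension:
        new = incoming_by_key.get(row["key"])
        if row["current"] and new and new["value"] != row["value"]:
            # Attribute changed: close out the old version with an end date...
            result.append({**row, "current": False, "end_date": today})
            # ...and insert the new version as the current row.
            result.append({"key": row["key"], "value": new["value"],
                           "current": True, "start_date": today, "end_date": None})
        else:
            result.append(row)
    return result

dim = [{"key": "C1", "value": "Chennai", "current": True,
        "start_date": date(2024, 1, 1), "end_date": None}]
updated = scd2_merge(dim, [{"key": "C1", "value": "Bengaluru"}], date(2025, 1, 1))
```

A production warehouse would typically do this with a MERGE statement or a DBT snapshot rather than row-by-row Python; the sketch only shows the versioning logic itself.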
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Exciting Opportunity: Compliance Officer - KEB Hana Bank (New Devanahalli Branch!)

KEB Hana Bank is a leading global financial institution with a rich history and a strong presence across Asia and worldwide. We are committed to providing innovative financial solutions and superior customer service. As part of our continued growth and commitment to the Indian market, we are thrilled to announce the upcoming launch of our new branch in Devanahalli, Bangalore, set to open its doors in October 2025!

We are seeking a diligent and experienced Compliance Officer to establish and oversee the compliance framework for our new Devanahalli team. This is a pivotal role in ensuring our operations adhere to all regulatory requirements and internal policies from day one. If you have 5+ years of robust compliance experience within the banking sector, a deep understanding of RBI regulations, and are willing to commute to Devanahalli, we want to hear from you!

Your Key Responsibilities will include:

Regulatory Adherence & Reporting:
- Manage and maintain the RBI ADF/reporting system for timely and accurate submission of regulatory returns.
- Act as the Money Laundering Reporting Officer (MLRO), performing all associated duties to ensure adherence to RBI rules and the requirements of regulatory bodies.
- Liaise effectively with the RBI, FIU, and other regulatory bodies to ensure compliance.
- Authorize and release payment orders filtered by the OFAC Filtering System.
- Monitor internal control processes and submit Monthly Compliance Reports to Head Office (H.O.).

Internal Audit & Controls:
- Act in the capacity of Internal Auditor, ensuring regular audits are performed across all branch departments.
- Manage daily and monthly internal audits.
- Coordinate and manage audits initiated by Head Office.
- Manage responses and follow-ups related to external bank audits.

Policy, Procedure & Advisory:
- Work closely with the Chief Executive Officer (CEO) in overseeing compliance procedures and advising on risk management.
- Assist the CEO with developing the entity-wide budget for compliance efforts, identifying resource gaps.
- Create, review, and update internal processes and manuals according to KEB Hana Bank policy and regulatory changes.
- Review and assess new and renewed contracts, proposals for new banking products/services, and submissions of bank data to external parties.

Training & Planning:
- Develop and deliver training for all staff on internal controls and Anti-Money Laundering (AML) procedures, reporting completion to H.O.
- Establish, execute, and report results of the yearly Compliance Plan to H.O.

Stakeholder Management:
- Provide managers of other teams with appropriate and up-to-date compliance information or data promptly.
- Manage and maintain strong, cooperative relationships with regulators.

What We're Looking For:
- Minimum 5+ years of progressive experience in a banking compliance role.
- In-depth knowledge of RBI regulations, AML/KYC laws, OFAC, and other relevant Indian banking compliance frameworks.
- Proven experience as an MLRO or in a similar capacity.
- Strong experience with regulatory reporting systems (e.g., RBI ADF).
- Experience in developing and implementing compliance policies, procedures, and training programs.
- Demonstrable experience in conducting and managing internal audits.
- Excellent analytical, problem-solving, and decision-making skills.
- Strong communication, interpersonal, and liaison skills for effective interaction with regulators and internal teams.
- Meticulous attention to detail.
- Crucially, a willingness and ability to commute to our new Devanahalli branch.
- Relevant professional certifications (e.g., CAMS, IIBF certification in Compliance/AML) would be a significant advantage.

About KEB Hana Bank:
KEB Hana Bank is a premier global financial group headquartered in South Korea, with an extensive network spanning numerous countries. We pride ourselves on our customer-centric approach, commitment to innovation, and a legacy of trust built over decades. Our expansion into Devanahalli signifies our dedication to serving the Indian market and contributing to its vibrant economy. Join us as we embark on this exciting new chapter!

Why Join KEB Hana Bank?
- Be a pioneering member of a new branch for a globally recognized bank.
- Opportunity to establish and shape the compliance culture from the ground up.
- Competitive salary and benefits package.
- A dynamic and supportive work environment with opportunities for growth.

Ready to make your mark? If you are a proactive and experienced compliance professional ready for a challenging and rewarding role, we encourage you to apply! You can apply via the "Apply" button on LinkedIn or send your resume directly to job.hanabank@gmail.com with the subject line "Application for Compliance Officer - Devanahalli." We look forward to reviewing your application!

#ComplianceOfficer #BankingCompliance #AML #KYC #RBI #RegulatoryAffairs #RiskManagement #FinanceJobs #BankingJobs #KEBHanaBank #Devanahalli #BangaloreJobs #JobOpening #Hiring #NewBranch #KarnatakaJobs
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
Teamified works with top enterprises and digital-native businesses in Australia, helping them build their remote teams in India, the Philippines, and Sri Lanka. We pride ourselves on hiring great teams to work on exciting, game-changing technology. Teamified currently has 200+ engineers, testers, product managers, and other specialists working across 20+ partners. We focus on uplifting the way organisations build and manage their remote teams through great working relationships, trust, integrity, culture, and hiring processes. In addition, we are building our own technology product offerings. We strive to deliver the best outcomes for our customers, our partners, and our people.

Key Responsibilities:
- Data Collection and Interpretation: Gather and interpret data from various sources to identify relevant information and insights.
- Pattern and Trend Identification: Analyze data sets to identify patterns and trends, enabling informed decision-making and forecasting.
- Insight Derivation: Derive actionable insights and forecasts from data analysis, supporting business objectives and strategies.
- Data Quality Assessment: Assess data quality and implement remediation measures to ensure accuracy and reliability.
- Data Mapping: Map data from different sources to facilitate integration and compatibility.
- Report Creation and Data Visualization: Create comprehensive reports and visualize data using reporting tools to effectively communicate findings and insights.
- Data Pipeline Management: Build and maintain data pipelines to streamline the flow of information and automate processes.
- Process Enhancement: Identify opportunities for process enhancements and contribute to their implementation to improve efficiency and effectiveness.

Key Requirements:
- Proficiency in analyzing large data sets and writing comprehensive reports.
- Excellent analytical skills with the ability to identify trends, patterns, and insights from data.
- Strong attention to detail and a methodical mindset.
- Strong problem-solving skills.
- Intermediate understanding of databases and data models.
- Hands-on experience with SQL database design.
- Experience designing, developing, and publishing ETL data pipelines.
- Exposure to Azure Data Factory (ADF) and Azure Storage.
- Teamwork and collaboration skills.

Desirable Skills:
- Understanding of reporting and data visualization tools such as Power BI.
- Excellent communication skills.
- Education in Mathematics, Computer Science, or Statistics.
- Knowledge of programming languages such as Python or C#.

Benefits:
- Flexibility in work hours, with a focus on managing energy rather than time.
- Access to online learning platforms and a budget for professional development.
- A collaborative, no-silos environment, encouraging learning and growth across teams.
- A dynamic social culture with team lunches, social events, and opportunities for creative input.
- Private health insurance.

If you possess these skills and are passionate about leveraging data to drive insights and business outcomes, we encourage you to apply for the role of Data Engineer. Join us in our mission to unlock the power of data for informed decision-making and business success. Apply now!
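The posting above lists data quality assessment and data profiling among the responsibilities. As a hedged illustration only (the column and row names are hypothetical, not from the posting), a minimal per-column profile in plain Python might look like this:

```python
def profile_column(rows, column):
    """Basic data-quality profile for one column: null count,
    distinct count, and min/max over non-null values."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }

# Hypothetical sample rows with one missing value.
rows = [{"amount": 10}, {"amount": None}, {"amount": 25}, {"amount": 10}]
stats = profile_column(rows, "amount")
```

In practice a tool such as Power BI, SQL aggregate queries, or a profiling library would produce these statistics at scale; the sketch only shows what "profiling" a column means.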
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
As an Azure Data Engineer, your mission will be to lead the design and development of data ingestion processes into our Azure Databricks environment, supporting business workflows and data-sharing initiatives through modern APIs.

Responsibilities:
- Design, build, and maintain scalable data pipelines in Azure.
- Develop and manage ETL processes to ingest, transform, and serve clean data.
- Ensure data integrity through validation, cleansing, and quality controls.
- Monitor and troubleshoot data pipeline performance and errors.
- Collaborate with cross-functional teams (analysts, scientists, app developers).
- Write and optimize SQL and Python/Spark code for large-scale data handling.
- Drive data governance practices, including cataloging, lineage, and metadata.
- Implement data security and privacy in alignment with policy and compliance.
- Stay up to date with Azure data engineering tools and industry best practices.

Requirements:
- 5+ years of hands-on experience with Azure data engineering.
- Deep expertise in Azure Data Factory, Databricks, Synapse, Azure SQL, Logic Apps, and Azure Functions.
- Strong programming skills in Python and PySpark.
- Experience with Dremio, Kafka, or Snowflake is a big plus.
- Bonus: experience extracting data from SharePoint Online.
- Hands-on experience with structured and unstructured data processing.
- DP-203 certification or equivalent Azure training is highly valued.
- Strong communication skills, able to work with technical and business teams alike.

Apply now or share your profile with us at info@papigen.com

#AzureDataEngineer #Databricks #DataPipelines #AzureJobs #ETL #ADF #Synapse #DataEngineerJobs #RemoteJobs
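The responsibilities above include ensuring data integrity through validation, cleansing, and quality controls. Purely as an illustrative sketch, and not the employer's actual pipeline code (field names and rules are invented for the example), a cleanse step that routes bad records to a reject list can be written in plain Python:

```python
def cleanse(records):
    """Validate and cleanse raw records: trim identifiers, coerce amounts
    to float, and route rows that fail validation to a reject list."""
    clean, rejected = [], []
    for rec in records:
        try:
            row = {
                "id": str(rec["id"]).strip(),
                "amount": float(rec["amount"]),
            }
            if not row["id"]:
                raise ValueError("empty id")
            clean.append(row)
        except (KeyError, TypeError, ValueError):
            # Missing keys, bad types, or unparseable amounts are rejected,
            # not silently dropped, so they can be inspected later.
            rejected.append(rec)
    return clean, rejected

clean, rejected = cleanse([
    {"id": " A1 ", "amount": "12.5"},      # valid after trimming/coercion
    {"id": "A2", "amount": "not-a-number"},  # fails float()
    {"amount": 3.0},                         # missing id
])
```

In a Databricks pipeline the same pattern is usually expressed with PySpark column expressions and a quarantine table; the control flow (validate, coerce, reject) is the same.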
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary:
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Title: Data Engineer

About The Role:
As a Junior/Senior Data Engineer, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization.

Responsibilities:
- Architecting and designing complex data systems and pipelines.
- Leading and mentoring junior data engineers and team members.
- Collaborating with cross-functional teams to define data requirements.
- Implementing advanced data quality checks and ensuring data integrity.
- Optimizing data processes for efficiency and scalability.
- Overseeing data security and compliance measures.
- Evaluating and recommending new technologies to enhance data infrastructure.
- Providing technical expertise and guidance for critical data projects.

Required Skills & Experience:
- Proficiency in designing and building complex data pipelines and data processing systems.
- Leadership and mentorship capabilities to guide junior data engineers and foster skill development.
- Strong expertise in data modeling and database design for optimal performance.
- Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness.
- Knowledge of data governance principles, ensuring data quality, security, and compliance.
- Familiarity with big data technologies like Hadoop, Spark, or NoSQL.
- Expertise in implementing robust data security measures and access controls.
- Effective communication and collaboration skills for cross-functional teamwork and defining data requirements.

Mandatory Skill Sets:
- Cloud: Azure/GCP/AWS
- DE Technologies: ADF, BigQuery, AWS Glue, etc.
- Data Lake: Snowflake, Databricks, etc.

Preferred Skill Sets:
- Cloud: Azure/GCP/AWS
- DE Technologies: ADF, BigQuery, AWS Glue, etc.
- Data Lake: Snowflake, Databricks, etc.

Years of Experience Required: 2-4 years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study Required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: AWS Glue, Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation, Data Warehouse, Data Warehouse Indexing {+ 13 more}
Desired Languages: (not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:
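The posting above repeatedly stresses data quality checks and data integrity. As a hedged, purely illustrative sketch (not PwC material; the key field and tolerance are invented for the example), a source-to-target reconciliation gate after a pipeline load can be expressed in plain Python:

```python
def reconcile(source_rows, target_rows, key="id", tolerance=0):
    """Compare source and target row sets after a load:
    report missing/unexpected keys and row-count drift."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    report = {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "count_drift": abs(len(src_keys) - len(tgt_keys)),
    }
    # The load passes only if drift is within tolerance and nothing was lost.
    report["passed"] = (report["count_drift"] <= tolerance
                        and not report["missing_in_target"])
    return report

report = reconcile([{"id": 1}, {"id": 2}, {"id": 3}],
                   [{"id": 1}, {"id": 3}])
```

A failed report would typically fail the pipeline run or raise an alert, rather than let incomplete data flow downstream.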
Posted 1 month ago
13.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Director

Job Description & Summary:
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within PwC.

About The Role:
As a Director, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization.

Responsibilities:
- Architecting and designing complex data systems and pipelines.
- Leading and mentoring junior data engineers and team members.
- Collaborating with cross-functional teams to define data requirements.
- Implementing advanced data quality checks and ensuring data integrity.
- Optimizing data processes for efficiency and scalability.
- Overseeing data security and compliance measures.
- Evaluating and recommending new technologies to enhance data infrastructure.
- Providing technical expertise and guidance for critical data projects.

Required Skills & Experience:
- Proficiency in designing and building complex data pipelines and data processing systems.
- Leadership and mentorship capabilities to guide junior data engineers and foster skill development.
- Strong expertise in data modeling and database design for optimal performance.
- Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness.
- Knowledge of data governance principles, ensuring data quality, security, and compliance.
- Familiarity with big data technologies like Hadoop, Spark, or NoSQL.
- Expertise in implementing robust data security measures and access controls.
- Effective communication and collaboration skills for cross-functional teamwork and defining data requirements.

Mandatory Skill Sets:
- Cloud: Azure/GCP/AWS
- DE Technologies: ADF, BigQuery, AWS Glue, etc.
- Data Lake: Snowflake, Databricks, etc.

Preferred Skill Sets:
- Cloud: Azure/GCP/AWS
- DE Technologies: ADF, BigQuery, AWS Glue, etc.
- Data Lake: Snowflake, Databricks, etc.

Years of Experience Required: 13+ years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study Required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation {+ 30 more}
Desired Languages: (not specified)
Travel Requirements: Up to 60%
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 1 month ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
" Job Description & Summary: A career within PWC Responsibilities About the role: As a Junior/Senior Data Engineer, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required Skills & Experience Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements. Skills Cloud: Azure/GCP/AWS DE Technologies: ADF, Big Query, AWS Glue etc., Data Lake: Snowflake, Data Bricks etc., Mandatory Skill Sets Cloud: Azure/GCP/AWS DE Technologies: ADF, Big Query, AWS Glue etc., Data Lake: Snowflake, Data Bricks etc. Preferred Skill Sets Cloud: Azure/GCP/AWS DE Technologies: ADF, Big Query, AWS Glue etc., Data Lake: Snowflake, Data Bricks etc. 
Years Of Experience Required: 4-7 years. Education Qualification: BE/BTECH, ME/MTECH, MBA, MCA. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering. Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills: Microsoft Azure. Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Artificial Intelligence, Big Data, C++ Programming Language, Coaching and Feedback, Communication, Complex Data Analysis, Creativity, Data-Driven Decision Making (DIDM), Data Engineering, Data Lake, Data Mining, Data Modeling, Data Pipeline, Data Quality, Data Science, Data Science Algorithms, Data Science Troubleshooting, Data Science Workflows, Deep Learning, Embracing Change, Emotional Regulation {+ 27 more}. Desired Languages (If blank, desired languages not specified). Travel Requirements: Not Specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No. Job Posting End Date
Posted 1 month ago
9.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Markovate. At Markovate, we don't just follow trends; we drive them. We transform businesses through innovative AI and digital solutions that turn vision into reality. Our team harnesses breakthrough technologies to craft bespoke strategies that align seamlessly with our clients' ambitions. From AI consulting and Gen AI development to pioneering AI agents and agentic AI, we empower our partners to lead their industries with forward-thinking precision. Overview: We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modelling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault. Requirements: 9+ years of experience in data engineering and data architecture. Excellent communication and interpersonal skills, with the ability to engage with teams. Strong problem-solving, decision-making, and conflict-resolution abilities. Proven ability to work independently and lead cross-functional teams. Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism. Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion. The candidate must have strong work ethics and trustworthiness. Must be highly collaborative and team-oriented. Responsibilities: Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF). Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling. Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer). Develop and maintain bronze → silver → gold data layers using DBT or Coalesce.
Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery. Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata. Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams). Work closely with QA teams to integrate test automation and ensure data quality. Collaborate with cross-functional teams including data scientists and business stakeholders to align solutions with AI/ML use cases. Document architectures, pipelines, and workflows for internal stakeholders. Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid). Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python. Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts. Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and Dimensional Modelling. Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF Triggers. Expertise in monitoring and logging with CloudWatch, AWS Glue Metrics, MS Teams Alerts, and Azure Data Explorer (ADX). Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection. Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates. Experienced in data validation and exploratory data analysis with pandas profiling and AWS Glue Data Quality. Great to have: Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services. Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica). Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs). Experience with data modeling, data structures, and database design.
Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in SQL and at least one programming language (e.g., Python). What it's like to be at Markovate: At Markovate, we thrive on collaboration and embrace every innovative idea. We invest in continuous learning to keep our team ahead in the AI/ML landscape. Transparent communication is key; every voice at Markovate is valued. Our agile, data-driven approach transforms challenges into opportunities. We offer flexible work arrangements that empower creativity and balance. Recognition is part of our DNA; your achievements drive our success. Markovate is committed to sustainable practices and positive community impact. Our people-first culture means your growth and well-being are central to our mission. Location: hybrid model, 2 days onsite. (ref:hirist.tech)
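The bronze → silver → gold layering this role describes can be sketched in a few lines. This is an illustrative stand-in only: in practice each layer would be a DBT or Coalesce model backed by warehouse tables, whereas here plain Python lists stand in for tables, and all table and field names are hypothetical.

```python
# Hypothetical medallion (bronze -> silver -> gold) sketch; not real
# DBT/Coalesce code. Each function plays the role of one model layer.

def bronze_ingest(raw_records):
    """Bronze: land records as-is, tagging lineage metadata."""
    return [{**r, "_source": "landing/orders"} for r in raw_records]

def silver_clean(bronze_rows):
    """Silver: enforce schema, drop invalid rows, deduplicate on order_id."""
    seen, clean = set(), []
    for r in bronze_rows:
        amount = float(r.get("amount") or 0)
        if r.get("order_id") is None or amount < 0:
            continue                      # failed a data-quality check
        if r["order_id"] in seen:
            continue                      # duplicate record
        seen.add(r["order_id"])
        clean.append({"order_id": r["order_id"],
                      "customer": r["customer"],
                      "amount": amount})
    return clean

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate, e.g. revenue per customer."""
    totals = {}
    for r in silver_rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

raw = [{"order_id": 1, "customer": "acme", "amount": "100"},
       {"order_id": 1, "customer": "acme", "amount": "100"},   # duplicate
       {"order_id": 2, "customer": "acme", "amount": "50"},
       {"order_id": None, "customer": "bad", "amount": "10"}]  # invalid
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'acme': 150.0}
```

The point of the pattern is that quality rules live in one layer (silver), so downstream business aggregates (gold) never see malformed or duplicate rows.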
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
What You'll Do Data Analytics & Modeling : Apply strong Data Analytics and Analytical Skills to understand complex business requirements and translate them into effective Data Modeling solutions. Data Pipeline Development : Design, develop, and maintain robust ETL (Extract, Transform, Load) pipelines using Azure Data Engineering services to ingest, process, and transform large datasets. Data Warehousing : Build and optimize fact and dimension tables within analytical databases, contributing to scalable data warehousing solutions. Azure Data Engineering : Hands-on development using Azure Data Engineering tools such as Azure Data Factory (ADF), Databricks, and Fabric. Programming for Data : Utilize expertise in PySpark and Python to develop and maintain efficient data processing solutions, ensuring data integrity, performance, and scalability. BI Dashboarding : Develop and maintain compelling Business Intelligence (BI) dashboards using tools like Power BI and/or Tableau, turning raw data into actionable insights. Analytical Databases : Work with analytical databases such as Snowflake, Azure Synapse, and others to store and process large volumes of data. SQL & Programming : Demonstrate proficiency in SQL for data manipulation and querying, and possess skills in other relevant programming languages. Performance & Integrity : Ensure data integrity, quality, and optimal performance of data pipelines and BI solutions. Collaboration : Collaborate effectively with data scientists, business analysts, product managers, and other engineering teams to understand data needs and deliver comprehensive solutions. Skills & Qualifications : Experience : Minimum 2 years of core, hands-on experience in Azure Data Engineering and Business Intelligence (Power BI and/or Tableau). Data Fundamentals : Strong understanding of Data Analytics, Analytical Skills, Data Analysis, Data Management, and Data Modeling concepts.
Azure Data Engineering : Mandatory hands-on experience with Azure Data Engineering services including ADF, Databricks, and Fabric. Programming for Data : Proficiency in PySpark and Python, with a proven ability to develop and maintain robust data processing solutions. SQL Expertise : Strong proficiency in SQL for complex data querying and manipulation. BI Tools : Practical experience building BI dashboards using Power BI and/or Tableau. Analytical Databases : Experience with analytical databases like Snowflake, Azure Synapse, etc. Problem Solving : Strong problem-solving and critical thinking abilities to tackle complex data challenges. Education : Bachelor's degree in Computer Science, Information Systems, or a related field. Preferred Qualifications (Nice-to-Have) : Relevant Microsoft Azure certifications (e.g., Azure Data Engineer Associate, Azure Data Analyst Associate). Experience with real-time data streaming technologies (e.g., Kafka, Azure Event Hubs). Familiarity with Data Governance and Data Quality frameworks. Exposure to MLOps concepts and tools. (ref:hirist.tech)
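The "fact and dimension tables" work this posting mentions can be illustrated with a tiny star-schema split. This is a hedged sketch, not anyone's production code: the field names, surrogate-key scheme, and source layout are all made up for illustration.

```python
# Hypothetical star-schema sketch: derive a customer dimension (with
# surrogate keys) and a sales fact table from denormalized source rows.

def build_star(rows):
    """Split denormalized rows into a dimension and a referencing fact table."""
    dim_customer, fact_sales = {}, []
    for r in rows:
        key = r["customer_name"]
        if key not in dim_customer:     # assign surrogate key on first sight
            dim_customer[key] = {"customer_sk": len(dim_customer) + 1,
                                 "customer_name": key,
                                 "region": r["region"]}
        fact_sales.append({"customer_sk": dim_customer[key]["customer_sk"],
                           "order_date": r["order_date"],
                           "amount": r["amount"]})
    return list(dim_customer.values()), fact_sales

rows = [
    {"customer_name": "acme", "region": "south",
     "order_date": "2024-01-01", "amount": 100},
    {"customer_name": "acme", "region": "south",
     "order_date": "2024-01-02", "amount": 50},
]
dims, facts = build_star(rows)
print(len(dims), len(facts))  # 1 2
```

The same idea scales up in Databricks or Synapse: descriptive attributes land once in the dimension, and the fact table carries only keys and measures.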
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary We are looking for a Data Engineer with solid hands-on experience in Azure-based data pipelines and Snowflake to help build and scale data ingestion, transformation, and integration processes in a cloud-native environment. Responsibilities : Develop and maintain data pipelines using ADF, Snowflake, and Azure Storage. Perform data integration from various sources including APIs, flat files, and databases. Write clean, optimized SQL and support data modeling efforts in Snowflake. Monitor and troubleshoot pipeline issues and data quality concerns. Contribute to documentation and promote best practices across the team. Requirements : 3-5 years of experience in data engineering or a related role. Strong hands-on knowledge of Snowflake, Azure Data Factory, SQL, and Azure Data Lake. Proficient in scripting (Python preferred) for data manipulation and automation. Understanding of data warehousing concepts and ETL/ELT patterns. Experience with Git, JIRA, and agile delivery environments is a plus. Strong attention to detail and eagerness to learn in a collaborative team setting. (ref:hirist.tech)
Posted 1 month ago
6.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Power BI professionals in the following areas: Experience: 6-9 Years. Job Description: Power BI Developer with Azure ADF. Work on Power BI reports - develop new reports or fix data issues in existing reports, and support users with data validation. Support the data team in understanding functional requirements. Strong experience in SQL and writing complex DAX queries. Understand existing report requirements and capture new report specifications. Coordinate among various groups to understand report KPIs. Participate in data requirement sessions and develop Power BI reports. Provide solutioning and design prototypes for use-case reports. Specialized in different reporting tools. Responsible for report feature assessment and building a report matrix. Certifications: Mandatory. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description Azure Cloud – Technology Assurance As a Risk Assurance Senior, you’ll contribute technically to Risk Assurance client engagements and internal projects. An important part of your role will be to assist fellow Seniors and Managers while actively participating within the client engagement. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. In line with EY's commitment to quality, you’ll confirm that work is of high quality and is reviewed by the next-level reviewer. As a member of the team, you’ll help to create a positive learning culture and assist fellow team members while delivering an assignment. The opportunity We’re looking for professionals with at least 3 years of experience. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of a new service offering. Skills and Summary of Accountabilities: Designing, architecting, and developing solutions leveraging Azure cloud to ingest, process, and analyse large, disparate data sets to exceed business requirements.
Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage for data storage and processing; designed data pipelines using these technologies. Working knowledge of data warehousing/modelling, ETL/ELT pipelines, and data democratization using cloud services. Design, build, and maintain efficient, reusable, and reliable code ensuring the best possible performance, quality, and responsiveness of applications using reliable Python code. Automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse Studio, ADF, etc. Exposure working in client-facing roles; collaborate with cross-functional teams including internal audit, IT security, and business stakeholders to assess control effectiveness and facilitate remediation activities. Preferred knowledge/understanding of IT Controls, Risk and Compliance. Design IT risk controls frameworks such as IT SOX. Testing of internal controls such as IT general controls, IT application controls, IPE-related controls, interface controls, etc. To qualify for the role, you must have: 3 years of experience in building end-to-end business solutions using big data and data engineering. Expertise in core Microsoft Azure Data Services (e.g., Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, Data Lake services, etc.). Familiarity with integration services: Azure Logic Apps, Function Apps, Stream Analytics, Triggers, Event Hubs, etc. Expertise in cloud-related big data integration and infrastructure tech stack using Azure Databricks and the Apache Spark framework. Must have the following: Python, SQL; preferred: R, Scala. Experience developing software tools using utilities, pandas, NumPy, and other libraries/components. Hands-on expertise in using Python frameworks (like Django, Pyramid, Flask).
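The Python-scripted automation and control-testing work described here usually starts with validating a batch of records before it is loaded. A minimal sketch follows; the schema, rule set, and function names are hypothetical, invented purely for illustration.

```python
# Hypothetical batch-validation helper: type-check records against an
# expected schema before load, collecting per-row error details.

EXPECTED_SCHEMA = {"id": int, "name": str, "amount": float}

def validate_batch(records):
    """Return (valid_rows, errors); errors pair row index with problems."""
    valid, errors = [], []
    for i, rec in enumerate(records):
        problems = [f"missing {col}" for col in EXPECTED_SCHEMA if col not in rec]
        problems += [f"{col} is not {t.__name__}"
                     for col, t in EXPECTED_SCHEMA.items()
                     if col in rec and not isinstance(rec[col], t)]
        if problems:
            errors.append((i, problems))   # quarantine the bad row
        else:
            valid.append(rec)
    return valid, errors

batch = [{"id": 1, "name": "ok", "amount": 9.5},
         {"id": "2", "name": "bad type", "amount": 1.0},
         {"name": "missing id", "amount": 2.0}]
valid, errors = validate_batch(batch)
print(len(valid), len(errors))  # 1 2
```

Keeping the rejected rows (with reasons) rather than silently dropping them is what makes checks like this auditable, which matters in the controls-testing context this role describes.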
Preferred: substantial background in data extraction and transformation, developing data pipelines using MS SSIS, Informatica, Talend, or other on-premises tools. Preferred: knowledge of Power BI or other BI tools. Should have a good understanding of version control with Git, JIRA, change/release management, build/deploy, and CI/CD with Azure DevOps. Ideally, you'll also have a Bachelor's degree or above in mathematics, information systems, statistics, computer science, data analytics, or related disciplines. Experience with AI/ML is a plus. Preferred: certification in DP-203 Azure Data Engineer or equivalent. Ability to communicate clearly and concisely and use strong writing and verbal skills to communicate facts, figures, and ideas to others. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Oracle Global Services Center (GSC) is a fast-growing cloud consulting team passionate about our customers' rapid and successful adoption of Oracle Cloud Solutions. Our flexible and innovative “Optimum Shore” approach helps our clients implement, maintain, and integrate their Oracle Cloud Applications and Technology environments while reducing overall total cost of ownership. We assemble an efficient team for each client by blending resources from onshore, near-shore, and offshore global delivery centers to match the right expertise, to the right solution, for the right cost. To support our rapid growth, we are seeking versatile consultants who bring a passion for providing an excellent client experience, enabling client success by developing innovative solutions. Our cloud solutions are redefining the world of business, empowering governments, and helping society evolve with the pace of change. Join the team of top-class consultants and help our customers achieve more than ever before. This is a senior consulting position operating independently, with some assistance and mentorship, on a project team or customer engagement, aligned with Oracle methodologies and practices. Performs standard duties and tasks with some variation to implement Oracle products and technology to meet customer specifications. Life at Oracle: We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veteran status, or any other characteristic protected by law. At Oracle, we don’t just value differences—we celebrate them!
Committed to crafting a workplace where all kinds of people work together. We believe innovation starts with diversity. https://www.oracle.com/corporate/careers/culture/diversity.html Career Level - IC2
Detailed Description: Operates independently to provide quality work products to an engagement. Performs multifaceted and complex tasks that need independent judgment. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver solutions on complex engagements. May act as the functional team lead on projects. Efficiently collaborates with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for complex projects. Detail Requirements: The candidate is expected to have sound domain knowledge in HCM covering the hire-to-retire cycle, with 7 to 12 years of experience. They must have been part of at least 3 end-to-end HCM Cloud implementations, including experience as a lead on at least 1 project. FUNCTIONAL - The candidate must have knowledge of at least one of the following modules in addition to the Core HR module: Time and Labor, Absence Management, Payroll, Benefits, Compensation, Recruiting. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. Engineering graduates with an MBA (HR) will be preferred. TECHNICAL - In-depth understanding of the data model and business process functionality (and its data flow) in the HCM Cloud application and Oracle EBS / PeopleSoft AU (HRMS). Experienced knowledge of Cloud HCM conversions, integrations (HCM Extracts & BIP), reporting (OTBI & BIP), Fast Formula, and personalization. Engineering graduation in any field, an MCA degree, or equivalent experience. Proven experience with Fusion technologies including HDL, HCM Extracts, Fast Formulas, BI Publisher Reports, and Design Studio. Apart from the above experience, advanced knowledge of OIC, ADF, Java, PaaS, DBCS, etc. would be an added advantage.
Good functional or technical leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Coordinator. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem solving, influencing and negotiating skills, and organizational awareness and sensitivity; engagement delivery; continuous improvement and knowledge sharing; and client management. Assist in the identification, assessment, and resolution of complex technical issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Title: Data Engineer - Azure Location: Bengaluru, India - 12 days onsite per month Duration: 6-month contract with likely extensions, potential to convert to FTE Required Skills & Experience At least 2 years of development/support experience in ADF V2. At least 1 year of experience with Spark (PySpark and other Python data libraries; Spark Scala is also fine). At least 1 year of experience in T-SQL, creating complex stored procedures, triggers, etc. Should have used Azure DevOps extensively as part of projects. Should have worked on at least 1 project involving a hybrid cloud implementation. Should have worked on SQL Database, Azure SQL Database, and Azure data warehousing, and should be comfortable with data warehousing concepts. Should have worked on Azure Logic Apps and Azure Functions for not less than 1 year. 3+ years of professional experience in a data engineer/analyst role or similar. Knowledge of creating Power BI reports and dashboards. Job Description Insight Global is seeking a Data Engineer for a large company based in India. You will join a team in the midst of a data cleansing initiative (product data and sales orders data). You will build compelling and clear visualizations of data and design, architect, and implement Azure Data Factory V2 pipelines. You will help the team develop solutions to implement an error-logging and alerting mechanism using various available Azure services. Compensation: $25 to $35 LPA. Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Posted 1 month ago
5.0 - 10.0 years
10 - 16 Lacs
Gurugram
Hybrid
Strong experience in SQL development, along with experience in AWS cloud and good experience in ADF
Posted 1 month ago
5.0 years
0 Lacs
Gurgaon
On-site
Minimum 5 years of experience in SQL development with strong query optimization skills. Hands-on experience in designing, developing, and maintaining SQL queries, stored procedures, and database structures. Proven expertise in building and managing ETL pipelines using Azure Data Factory (ADF). Experience in integrating data from on-premise and cloud sources using AWS services. Knowledge of data warehousing concepts, data modeling, and transformation logic. Ability to ensure data accuracy, consistency, and performance across multiple systems. Strong troubleshooting skills for resolving data and performance-related issues. Familiarity with cloud-based data architecture and modern data engineering best practices. Effective communication and collaboration with cross-functional teams. Ability to document technical processes and data workflows clearly. Location: Gurugram. Job type: Hybrid. Job Type: Full-time. Work Location: In person
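The "advanced SQL" work these postings keep referencing often centers on window (analytical) functions. A small runnable illustration using SQLite follows; the syntax carries over to most warehouses, and the table and values are made up for the example.

```python
# Illustrative analytical-SQL pattern: keep only the latest version of
# each order using ROW_NUMBER() over a partition (a common dedup step).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders(order_id INT, customer TEXT, amount REAL, updated_at TEXT);
INSERT INTO orders VALUES
 (1,'acme',100,'2024-01-01'),
 (1,'acme',120,'2024-01-05'),   -- later correction of order 1
 (2,'beta', 80,'2024-01-02');
""")
latest = con.execute("""
SELECT order_id, amount FROM (
  SELECT order_id, amount,
         ROW_NUMBER() OVER (PARTITION BY order_id
                            ORDER BY updated_at DESC) AS rn
  FROM orders)
WHERE rn = 1
ORDER BY order_id
""").fetchall()
print(latest)  # [(1, 120.0), (2, 80.0)]
```

The window-function form is usually preferable to a self-join with MAX(updated_at), since it needs only one pass over the table and handles ties deterministically if a tiebreaker column is added to the ORDER BY.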
Posted 1 month ago
30.0 years
5 - 9 Lacs
Chennai
On-site
Senior Data Engineer – DBT & Snowflake About the Company Systech is a modern Data and Analytics consulting firm, helping clients embed data-driven capabilities into their business operations. We offer end-to-end data engineering services and outcomes-based analytics solutions to drive your business forward in the digital age. Systech has over 30 years of experience, delivering 1500+ projects to industry-leading brands across the world. Job ID: JD20250527-1123 Job Name: Senior Data Engineer – DBT & Snowflake Years of Experience: 5 No of Openings: 2 Job Description: We are looking for a skilled and experienced DBT-Snowflake Developer to join our team! You will be involved in implementing ongoing and new initiatives. If you love learning, thinking strategically, innovating, and helping others, this role is for you. Primary Skills: DBT, Snowflake Secondary Skills: ADF, Databricks, Python, Airflow, Fivetran, Glue Role Description: This data engineering role includes: Creating and managing the technological infrastructure of a data platform. Architecting, building, and managing data flows/pipelines. Constructing data storage systems (NoSQL, SQL). Utilizing big data tools (Hadoop, Kafka) and integration tools for connecting data sources. Role Responsibilities: Translate functional specs and change requests into technical specs. Convert BRDs and specs into code. Develop efficient, testable, well-documented code. Ensure data/application accuracy and integrity via analysis, coding, and problem-solving. Set up development environments and configure dev tools. Communicate project status to stakeholders. Manage and secure data to meet business needs. Automate processes where needed. Communicate proficiently in English (written, verbal, presentation). Coordinate with the UAT team. Role Requirements: Proficient in basic and advanced SQL (procedures, analytical functions). Strong understanding of data warehousing (dimensional modeling, CDC, SCDs).
Knowledge of Shell/PowerShell scripting. Familiar with relational and non-relational databases, data streams, and file stores. Skilled in performance tuning and optimization. Experience in data profiling and validation. Familiar with requirements gathering, documentation, and unit testing. Understanding of QA/testing processes. Knowledge of BI tools is a plus. Logical reasoning and analytical skills. Willingness to learn and take initiative. Comfort in fast-paced Agile environments. Additional Requirements: Design, develop, and maintain scalable data models and transformations using DBT with Snowflake . Effectively transform/load data from diverse sources into data warehouses or lakes. Manage DBT models ensuring accurate, aligned data transformation. Convert raw/unstructured data into structured datasets via DBT. Optimize SQL queries within DBT to boost performance. Establish DBT best practices for performance, scalability, and reliability. Strong SQL skills and a deep understanding of data warehousing and modern data architectures. Familiarity with cloud platforms (AWS, Azure, GCP). Migrate legacy transformation code into modular DBT models. How to apply: Qualified applicants please send resumes to: Systech Solutions, Inc.,
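The slowly changing dimension (SCD) handling this posting asks for can be sketched compactly. This is a hedged, plain-Python illustration of the Type 2 pattern only; in a DBT/Snowflake project this logic would normally live in a DBT snapshot or a MERGE statement, and the field names here are invented.

```python
# Hypothetical SCD Type 2 merge: expire the current row when an
# attribute changes, and append a new current version with validity dates.

def scd2_merge(dim_rows, incoming, today):
    """Close out changed rows and append new current versions."""
    current = {r["nk"]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["nk"])
        if cur and cur["attr"] == rec["attr"]:
            continue                          # no change: keep current row
        if cur:                               # expire the old version
            cur["valid_to"], cur["is_current"] = today, False
        dim_rows.append({"nk": rec["nk"], "attr": rec["attr"],
                         "valid_from": today, "valid_to": None,
                         "is_current": True})
    return dim_rows

dim = [{"nk": "C1", "attr": "gold", "valid_from": "2023-01-01",
        "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"nk": "C1", "attr": "platinum"}], "2024-06-01")
print([(r["attr"], r["is_current"]) for r in dim])
# [('gold', False), ('platinum', True)]
```

Type 2 preserves full history (both rows survive, with validity ranges), which is what distinguishes it from Type 1, where the old attribute value would simply be overwritten.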
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai
On-site
Job openings for Operations Engineer in Chennai

The Operations Engineer will work in collaboration with and under the direction of the Manager of Data Engineering, Advanced Analytics to provide operational services, governance, and incident management solutions for the Analytics team. This includes modifying existing data ingestion workflows, managing releases to QA and Prod, working closely with cross-functional teams, and providing production support for daily issues.

Essential Job Functions:
- Takes ownership of reported customer issues and sees problems through to resolution
- Researches, diagnoses, troubleshoots, and identifies solutions to resolve customer issues
- Follows standard procedures for proper escalation of unresolved issues to the appropriate internal teams
- Provides prompt and accurate feedback to customers
- Ensures proper recording and closure of all issues
- Prepares accurate and timely reports
- Documents knowledge in the form of knowledge-base tech notes and articles

Other Responsibilities:
- Be part of the on-call rotation
- Support QA and production releases, off-hours if needed
- Work with developers to troubleshoot issues
- Attend daily standups
- Create and maintain support documentation (Jira/Confluence)

Minimum Qualifications and Job Requirements:
- Proven working experience in enterprise technical support
- Basic knowledge of systems, utilities, and scripting
- Strong problem-solving skills
- Excellent client-facing skills
- Excellent written and verbal communication skills
- Experience with Microsoft Azure, including Azure Data Factory (ADF), Databricks, and ADLS (Gen2)
- Experience with system administration and SFTP
- Experience leveraging analytics team tools such as Alteryx or other ETL tools
- Experience with data visualization software (e.g., Domo, Datorama)
- Experience with SQL programming
- Experience automating routine data tasks using various software tools (e.g., Jenkins, Nexus, SonarQube, Rundeck, Task Scheduler)

Experience: 5 - 10 Years
Salary: 10 Lac to 20 Lac P.A.
Industry: Manufacturing / Production / Quality
Qualification: B.Sc
Key Skills: Azure, Server, ADL, Executive, ETL Tool, Data Validation, SQL
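The "automating routine data tasks" requirement above often comes down to making flaky steps self-healing, so fewer daily incidents need manual intervention. A minimal, library-free sketch; the load_batch step below is hypothetical, standing in for any real ingestion job:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("data_ops")

def with_retries(step, attempts=3, delay_s=1.0):
    """Run a pipeline step, retrying transient failures before escalating."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise  # out of retries: surface the error for incident handling
            time.sleep(delay_s)

# Hypothetical flaky step: fails twice with a transient error, then succeeds.
calls = {"n": 0}
def load_batch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "batch loaded"

result = with_retries(load_batch, delay_s=0)
```

In practice the same idea is configured inside schedulers like Rundeck or ADF retry policies rather than hand-coded, but the control flow is the same.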
Posted 1 month ago
5.0 - 9.0 years
22 - 37 Lacs
Gurugram
Work from Office
5+ years of experience, including:
- Strong background in SQL development combined with solid experience in AWS cloud services and Azure Data Factory (ADF) for building and managing data pipelines.
- Hands-on experience with high-availability solutions such as Always On Availability Groups, database replication, and log shipping.
- Expert-level proficiency in T-SQL with hands-on experience in SQL Server 2016 or later, including query optimization and performance tuning.
- Strong programming skills in Python, particularly for scripting, data manipulation, and automation of data processes.
- Deep understanding of SQL Server backup and recovery models, with expertise in implementing disaster recovery strategies and best practices.
- Extensive experience with Microsoft BI tools, including SQL Server Integration Services (SSIS), Reporting Services (SSRS), and Analysis Services (SSAS).
- Proven ability to work in environments governed by data governance and compliance frameworks, ensuring secure and accurate data handling.
- Familiarity with Alteryx is an added advantage, though not mandatory.
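One routine "automation of data processes" task that the Python scripting requirement above covers is reconciling row counts between a source system and the warehouse after a load. A hedged sketch; the table names and counts are made up for illustration:

```python
def reconcile_counts(source_counts, target_counts, tolerance=0.0):
    """Compare per-table row counts between source and warehouse.

    Returns a list of (table, source_rows, target_rows) mismatches whose
    difference exceeds tolerance (a fraction of the source count).
    """
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        allowed = src * tolerance
        if abs(src - tgt) > allowed:
            mismatches.append((table, src, tgt))
    return mismatches

# Illustrative counts, e.g. gathered by COUNT(*) queries on both sides.
source = {"fact_sales": 10_000, "dim_customer": 1_250}
target = {"fact_sales": 10_000, "dim_customer": 1_200}
issues = reconcile_counts(source, target)
```

The tolerance parameter allows fractional slack, e.g. tolerance=0.01 permits a 1% drift before the table is flagged.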
Posted 1 month ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Role Proficiency:
Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' development activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post-delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents: design documentation, requirements, test cases/results.
- Configure: define and govern the configuration management plan; ensure compliance from the team.
- Test: review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: advise software developers on design and development of features and components, with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to add value for customers; complete relevant domain certifications.
- Manage Project: manage delivery of modules and/or user stories.
- Manage Defects: perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: create and provide input for effort estimation for projects.
- Manage Knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team.
- Release: execute and monitor the release process.
- Design: contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: set FAST goals and provide feedback; understand team members' aspirations and provide guidance and opportunities; ensure the team is engaged in the project.
- Certifications: take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate the time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team; mentor and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks
- Build confidence with customers by meeting deliverables on time and with quality
- Estimate the time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software/hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficiency in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile methods: Scrum or Kanban
- Integrated development environments (IDEs)
- Rapid application development (RAD)
- Modelling technologies and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:
Job Purpose: Sr Developer/Tech Lead
Location: Trivandrum

Duties and Responsibilities/Job Description:
- Collaborate with product owners and architects to understand their business/technical problems and design/architect solutions.
- Design, develop, and deploy cloud-based solutions with well-defined DevOps processes and release strategies.
- Design and develop microservices-based applications.
- Design and develop solutions with TDD (Test-Driven Development).
- Work hands-on with engineers to review and troubleshoot coding problems quickly and effectively.
- Improve code quality through continuous code reviews.
- Proactively identify, manage, and drive resolution of possible project challenges or constraints.
- Solve complex performance problems and architectural challenges.
- Prepare technical documentation.
- Strong understanding of Agile/Scrum methodologies.
- Anchor proof-of-concept (POC) development to validate proposed solutions and reduce technical risks.

Skills and Competencies/Job Specification:
- Excellent hands-on experience (6+ years) in designing and developing applications using technologies such as C#, .NET Core, Microservices, Web API, Azure Service Bus, AKS, Function Apps, and Azure Data Factory (ADF): pipelines, data flows, triggers, linked services.
- Strong understanding of and working experience with the TDD approach and CI/CD deployment.
- Working experience with SQL Server, Azure SQL, Cosmos DB, ETL processes, Data Lake, Blob Storage, RESTful APIs, JSON, and XML.
- Working experience with Docker and Kubernetes, and exposure to cloud environments.
- Excellent hands-on knowledge of object-oriented concepts and designing entity relationships for enterprise applications.
- Experience in responsive web development.
- Experience in cross-platform development and architecture.
- Solid understanding of both functional and technical specifications, effort estimation, and milestones.
- Experience in the Retail domain.
- Excellent communication skills, oral and written.

Optional skill set:
- Experience in Oracle Fusion Cloud migration, including data extraction, transformation, and integration with enterprise systems.
- Basic knowledge of Finance and Accounting related to migrating enterprise systems, including chart of accounts, sub-ledgers, and financial reporting structures.

Skills: C#, .NET, Microservices, Web API, Azure Data Factory
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Chandigarh, India
On-site
Job Summary

We are an equal opportunity employer. We value skills, and hiring will be truly merit-based. Are you a passionate, tech-savvy, committed person? If yes, then this position is for you! The position requires interaction with Product Owners, Architects, Business Users, and technical team members for carving out requirements, analysis, and provisioning of various IT projects. The ideal candidate for this position should be meticulous, self-motivated, highly innovative, and able to work in a fast-paced, matrixed environment.

In this Role, Your Responsibilities Will Be:
- Collaborate with business product owners and other collaborators to identify, document, and implement business requirements.
- Maintain, improve, and develop customizations to existing Oracle Fusion / Oracle Apps modules in an optimized manner.
- Prepare design documents and participate in design reviews and discussions.
- Resolve SRs (Service Requests) related to Oracle Apps.
- Continually show progress on the learning path; build knowledge and proficiency in applications development and changes in technology.

Who You Are:
You show a tremendous amount of initiative in tough situations and are exceptional at spotting and seizing opportunities. You observe situational and group dynamics and select the best-fit approach. You make implementation plans that allocate resources precisely. You pursue everything with energy, drive, and the need to finish.

For This Role, You Will Need:
- 8 to 10 years of experience in Fusion SCM modules and Oracle SCM implementation in Oracle E-Business Suite.
- Knowledge of the core Oracle Applications tech stack: SQL, PL/SQL, OAF, Workflows, APIs, BPEL, Java, ADF, BI Publisher, and documenting using the OUM methodology.
- Knowledge of Oracle's AIM methodology; experience writing BR030s, MD050s, MD070s, and MD120s.
- Good business understanding of Oracle ERP modules such as quoting, order management, supply chain, and distribution processes.
- Excellent time management skills; strong analytical and problem-solving skills.
- Able to handle stress and deliver quality within defined timelines.
- Ability to acquire, absorb, and apply sophisticated business knowledge to problems quickly.

Preferred Qualifications that Set You Apart:
- BE/B.Tech/MCA. Candidates should possess a degree from a recognized university/institution only.

Our Culture & Commitment to You

At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives—because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together, are key to driving growth and delivering business results.

We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.

About Us

At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration.
We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world’s most complex problems — for our customers, our communities, and the planet. You’ll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor. At Emerson, you’ll see firsthand that our people are at the center of everything we do. So, let’s go. Let’s think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let’s go, together. Accessibility Assistance or Accommodation If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com . About Emerson Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical and advanced factory automation operate more sustainably while improving productivity, energy security and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. 
Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
Posted 1 month ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the Role:
Senior Data Engineer for production support, providing daily end-to-end support for data loads and managing production issues.

What will you do:
- Monitor and support various data loads for our Enterprise Data Warehouse.
- Support business users accessing Power BI dashboards and data warehouse tables.
- Handle incidents and service requests within defined SLAs.
- Work with the team on managing Azure resources, including but not limited to Databricks, Azure Data Factory pipelines, ADLS, etc.
- Build new ETL/ELT pipelines using Azure data products such as Azure Data Factory and Databricks.
- Help build best practices and processes.
- Coordinate with upstream/downstream teams to resolve data issues.
- Work with the QA and Dev teams to ensure appropriate automated regressions are added to detect such issues in the future.
- Work with the Dev team to improve automated error handling so manual interventions can be reduced.
- Analyze processes and patterns so other similar unreported issues can be resolved in one go.

What you will need:
A strong IT professional with 3-4 years of experience in Data Engineering, with strong analytical and problem-solving skills.

Must have:
- 3-4 years of experience in data warehouse design & development and ETL using Azure Data Factory (ADF)
- Experience writing complex T-SQL procedures on MPP platforms (Synapse, Snowflake, etc.)
- Experience analyzing complex code to troubleshoot failures and, where applicable, recommend best practices around error handling, performance tuning, etc.
- Ability to work independently as well as part of a team, and experience working with fast-paced operations/dev teams
- Good understanding of business processes and analyzing the underlying data
- Understanding of dimensional and relational modelling
- Detail-oriented, with the ability to plan, prioritize, and meet deadlines in a fast-paced environment
- Knowledge of Azure cloud technologies
- Exceptional problem-solving skills

Nice to have:
- Experience crafting, building, and deploying applications in a DevOps environment utilizing CI/CD tools
- Relevant certifications
- Basic knowledge of Power BI

Who are you:
- Bachelor's degree or foreign equivalent degree in Computer Science or a related field required
- Excellent communication skills
- Able to work independently or within a team, proactively, in a fast-paced Agile-Scrum environment
- Owns success: takes responsibility for the successful delivery of solutions
- Strong desire to improve their skills in tools and technologies

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories.

We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. 
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 99740. By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
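The error-handling goal in the posting above (reducing manual interventions on failed data loads) is often approached by classifying failures before alerting, so likely-transient errors are retried silently and only genuine defects page a person. A simplified sketch; the marker strings and error messages are illustrative, not an actual ADF/Databricks API:

```python
# Transient error signatures worth retrying automatically; anything else
# should raise an incident for a human. Markers are illustrative only.
TRANSIENT_MARKERS = ("timeout", "deadlock", "connection reset", "throttl")

def classify_failure(error_message: str) -> str:
    """Return 'retry' for likely-transient errors, 'escalate' otherwise."""
    msg = error_message.lower()
    if any(marker in msg for marker in TRANSIENT_MARKERS):
        return "retry"
    return "escalate"

# Hypothetical failure messages a pipeline monitor might see.
decisions = [
    classify_failure("Lock request timeout exceeded"),
    classify_failure("Invalid object name 'stg.orders'"),
]
```

A real monitor would feed these decisions into the retry policy or the incident queue; the value of the split is that only "escalate" cases consume on-call time.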
Posted 1 month ago
2.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the Role:
Data Engineer for the CDAO Product Support Group, providing daily end-to-end support for data loads and managing production issues.

What will you do:
- Monitor and support various data loads for our Enterprise Data Warehouse.
- Support business users accessing Power BI dashboards and data warehouse tables.
- Handle incidents and service requests within defined SLAs.
- Work with the team on managing Azure resources, including but not limited to Databricks, Azure Data Factory pipelines, ADLS, etc.
- Build new ETL/ELT pipelines using Azure data products such as Azure Data Factory and Databricks.
- Help build best practices and processes.
- Coordinate with upstream/downstream teams to resolve data issues.
- Work with the QA and Dev teams to ensure appropriate automated regressions are added to detect such issues in the future.
- Work with the Dev team to improve automated error handling so manual interventions can be reduced.
- Analyze processes and patterns so other similar unreported issues can be resolved in one go.

A strong IT professional with 2-3 years of experience in Data Engineering, with strong analytical and problem-solving skills.

Must have:
- 2-3 years of experience in data warehouse design & development and ETL using Azure Data Factory (ADF)
- Experience writing complex T-SQL procedures on MPP platforms (Synapse, Snowflake, etc.)
- Experience analyzing complex code to troubleshoot failures and, where applicable, recommend best practices around error handling, performance tuning, etc.
- Ability to work independently as well as part of a team, and experience working with fast-paced operations/dev teams
- Good understanding of business processes and analyzing the underlying data
- Understanding of dimensional and relational modelling
- Knowledge of Azure cloud technologies
- Exceptional problem-solving skills

Nice to have:
- Experience crafting, building, and deploying applications in a DevOps environment utilizing CI/CD tools
- Relevant certifications
- Basic knowledge of Power BI

Who are you:
- Bachelor's degree or foreign equivalent degree in Computer Science or a related field required
- Excellent communication skills
- Able to work independently or within a team, proactively, in a fast-paced Agile-Scrum environment
- Owns success: takes responsibility for the successful delivery of solutions
- Strong desire to improve their skills in tools and technologies

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories.

We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally.
We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. 
You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 99773. By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 1 month ago
9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're looking for a Principal IT Engineer - India. This role is office-based.

Cornerstone is looking for a Principal IT Engineer (Oracle Cloud EPM) who will provide domain and functional expertise and partner with the FP&A and Accounting teams in managing the EPBCS/FCCS applications on Oracle EPM Cloud. This techno-functional role will be responsible for managing all changes to the EPM platform, leveraging skills and technologies to deliver day-to-day production support as well as enhancements.

In this role you will:
- Act as the solution architect for the Oracle Cloud EPM platform (with focus on EPBCS and FCCS), driving end-to-end solution delivery.
- Partner with our FP&A team to ensure successful delivery of end-to-end solutions for planning and forecasting (workforce planning, vendor planning, and financial planning applications within EPBCS) and address reporting needs.
- Partner with our Accounting team on the financial consolidation and close process, and help manage and support the close process in FCCS and reporting needs in SmartView.
- Support the Oracle EPBCS application, working alongside Oracle ERP and other IT teams.
- Manage and support all EPBCS and FCCS data integrations and metadata for financial actuals and metrics on an ongoing basis.
- Support loading integration files to the system during the budgeting and reforecast process, working closely with the FP&A team.
- Build reports in FCCS, EPBCS, and SmartView, as well as EPBCS models, forms, and calculations.
- Resolve reported functional/technical issues in FCCS and EPBCS; manage the backlog of enhancements. If needed, open tickets with Oracle Support and work with Oracle on priority until the issue is resolved.
- Evaluate requests for changes to the production application, recommend system enhancements, and lead the development effort for solutions by following established governance processes.
- Support the FP&A / Accounting teams to reengineer/develop reports and analyses relating to the monthly close and quarterly reporting requirements.
- Support the Oracle quarterly upgrade process by performing regression testing and reviewing new features in FCCS and EPBCS with the Accounting & FP&A teams.
- Ensure smooth period-end and close activities, ensuring system resources are available. Regularly review system capacity and memory to ensure availability even at peak usage; if needed, work with Oracle to request additional memory.
- Be a key contributor in the future development and implementation of other Oracle EPM applications (ARCS - Account Reconciliation Cloud Service, Narrative Reporting, and EDMCS - Enterprise Data Management Cloud Service).

You Have What It Takes If You Have:
- Hands-on experience in requirements gathering, designing, configuring, and scripting for EPBCS/FCCS application implementation and maintenance, with at least two completed end-to-end cycles of system implementation.
- Bachelor's degree in IT or Finance, with a minimum of 9 years of experience implementing/supporting Oracle EPBCS, FCCS, and Cloud ERP applications; working closely with and supporting the FP&A and Accounting business is a plus.
- Thorough knowledge and independent preparation of project documentation, including requirements gathering, solution design documents, UAT scripts, and training materials.
- Good understanding of basic accounting, the financial consolidation process, and financial statements (P/L, B/S, C/F). Familiarity with GAAP is nice to have but not required.
- In-depth understanding of an organization's planning and budgeting process and how to map it to the Oracle solution.
- Understanding of security concepts such as SOD, SSO, and GDPR.
- Proficiency with various reporting tools, including EPM Financial/Narrative Reporting and Oracle Smart View for Office against EPM; extensive experience with EPBCS and SmartView.
- Experience integrating Cloud Services (interfaces, conversions, and creating applications using FBDI, ADFDI, Oracle ERP Cloud SOAP/REST APIs, and ADF).
The ability to communicate complex technology solutions to diverse teams - technical, business, and management.
Strong written and oral communication skills; able to communicate effectively with technical and non-technical staff.
Sharp problem-solving, analytical, and innovation skills, demonstrated by the ability to identify the source of a problem, determine the appropriate solution, and resolve and prioritize sensitive problems.
Basic project management skills to coordinate project implementation timelines and schedules.
Familiarity with IT ticketing systems such as JIRA or ServiceNow, and with Agile methodology.
Willingness to take ownership of issues or enhancements within EPM and drive them to completion.
Experience working across various IT and business teams to prioritize, design, implement, and support business-driven functionality.
The ability to multi-task and maintain close attention to detail while managing conflicting priorities.
The ability to contribute positively to a team atmosphere, encourage other team members in meeting objectives, and recognize and adapt to changing priorities.
Support experience in a large enterprise using Oracle ERP/EPM will be favorably considered.
Strong and effective communication skills: able to listen to and understand requirements, organize them into functional and technical specifications, elaborate on or present solution options to the business, and express technical concepts in business terms effectively.
Experience preparing and presenting setup documents (BR100s, MD100s), solution presentations, lean specifications, test documents, and user training documents.

Our Culture
Our mission is to empower people, businesses, and communities. Our culture is created less by what we do and more by who we are. When people ask what our team is about, we point to our core values: champion customer success, bring our best, achieve together, get stuff done, and innovate every day.
We're always on the lookout for new, curious, and capable people who can help us achieve our goal, and we seek diversity in the people who join our team. We want our company to reflect the demographics of our customers, clients, and the communities in which we operate. So if you want to work for a friendly, global, inclusive, and innovative company, we'd love to meet you!

What We Do
Cornerstone is a premier people development company. We believe people can achieve anything when they have the right development and growth opportunities. We offer organizations the technology, content, expertise, and specialized focus to help them realize the potential of their people. Featuring comprehensive recruiting, personalized learning, modern training content, development-driven performance management, and holistic employee data management and insights, Cornerstone's people development solutions are used successfully by more than 100 million people in 180+ countries and in nearly 50 languages. Cornerstone takes special care to ensure the security and privacy of its users' data.

Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
Posted 1 month ago