3.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

WHO WE ARE
Goldman Sachs is a leading global investment banking, securities and investment management firm that provides a wide range of services worldwide to a substantial and diversified client base that includes corporations, financial institutions, governments and high-net-worth individuals. Founded in 1869, it is one of the oldest and largest investment banking firms. The firm is headquartered in New York and maintains offices in London, Bangalore, Frankfurt, Tokyo, Hong Kong and other major financial centers around the world. We are committed to growing our distinctive culture and holding to our core values, which always place our clients' interests first. These values are reflected in our Business Principles, which emphasize integrity, commitment to excellence, innovation and teamwork.

Business Unit Overview
The Cloud Enablement team within Developer Experience is responsible for enabling the use of public cloud services across the firm. You will work as part of a multi-disciplinary team responsible for researching, architecting and building a cutting-edge platform that enables Goldman Sachs teams to deploy and manage services in the public cloud safely and securely. We are at an early stage of modernizing our services around cloud-native principles, and you will directly contribute to a platform that programmatically enforces the safety, security and compliance of services and enables engineers to innovate faster.

How You Will Fulfill Your Potential
- Contribute to technical solutions, implementation and operational management of cloud platforms such as Microsoft Azure, Oracle Cloud Infrastructure, AWS and Google Cloud Platform
- Design, build, test and deploy solutions that support onboarding and migrating applications from on-prem to the cloud
- Participate in technical and architectural discussions both within the team and across the organization
- Help communicate and promote best practices for public cloud application development across the firm

Responsibilities and Qualifications
- Design and develop high-performance applications using the latest technologies (Java/J2EE, Python)
- Meet with application users to elicit and understand software requirements, assess the feasibility of proposed changes, and compile measures/deliverables
- Analyze, design, develop, test, and support enterprise applications
- Conduct functional and non-functional testing, perform unit and integration testing, and troubleshoot and debug applications
- Play a significant part in design and implementation in a team-oriented environment
- Support users in resolving issues by troubleshooting, providing workarounds, or escalating to technology management
- Provide technical and functional guidance and leadership to junior members as needed
- Engineer and implement solutions for hybrid and multi-cloud connectivity, such as connectivity to/from on-premises, to/from the Internet, and to/from other cloud service providers
- Design and develop APIs to manage public cloud infrastructure owned by the firm (see the sketch after this posting)
- Manage cloud resources across multiple cloud providers with infrastructure-as-code (IaC) tooling such as HashiCorp Terraform
- Provide clear, reader-appropriate documentation for projects and processes, available to end users, technical support staff, management, and clients/vendors

Basic Qualifications
- Bachelor's degree or equivalent in computer science, engineering or related disciplines
- 3-6 years of experience as a Java developer with extensive hands-on experience building microservices with Spring Boot
- Solid understanding of core Java concepts such as collections, multithreading, serialization, lambdas, functional interfaces, and streams (parallel processing and aggregations)
- Experience writing unit tests using frameworks such as JUnit, Mockito, Spock or other testing frameworks
- Strong technical ability and willingness to learn and evolve your skills with advances in technology
- Excellent written and verbal communication skills, including experience speaking to technical and business audiences
- Ability to understand and effectively debug both new and existing solutions
- Highly motivated and willing to learn and adapt to new technologies

Preferred Qualifications
- Experience with leading cloud platforms such as AWS, Azure, and OCI
- Experience building infrastructure as code using Terraform, AWS CloudFormation templates, Kubernetes (KRM APIs), Azure DevOps, etc.
- Ability to reason about performance, security, and process interactions in complex distributed systems
- Experience with API programming, scripting, systems architecture and design, networking, DevOps, scaling, security and microservices architecture
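For illustration, a minimal Python sketch of the kind of programmatic compliance check such a platform might run against cloud resources. It assumes configured AWS credentials; the tagging policy and names are hypothetical, and the role itself spans several providers and languages.

```python
"""Hedged sketch: flag EC2 instances that violate a hypothetical tagging policy."""
import boto3

REQUIRED_TAG = "cost-center"  # hypothetical policy: every instance must carry this tag

def find_noncompliant_instances(region: str = "us-east-1") -> list[str]:
    """Return IDs of EC2 instances missing the required tag."""
    ec2 = boto3.client("ec2", region_name=region)
    noncompliant = []
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    noncompliant.append(instance["InstanceId"])
    return noncompliant

if __name__ == "__main__":
    print(find_noncompliant_instances())
```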
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs (see the query sketch after this posting).
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills
- SSAS – Tabular & Multidimensional
- SQL Server (advanced SQL, views, joins, indexes)
- DAX & MDX
- Data modeling & OLAP concepts

Secondary Skills
- ETL tools (SSIS or equivalent)
- Power BI or similar BI/reporting tools
- Performance tuning & troubleshooting in SSAS and SQL
- Version control (TFS/Git), deployment best practices
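For context, a minimal sketch of the star-schema aggregation an SSAS engineer typically validates against the relational source, shown in Python with pyodbc. The connection string, table, and column names are hypothetical.

```python
"""Hedged sketch: star-schema aggregation against a hypothetical SalesDW source."""
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)

QUERY = """
SELECT d.CalendarYear, p.Category, SUM(f.SalesAmount) AS TotalSales
FROM FactSales f
JOIN DimDate d ON f.DateKey = d.DateKey
JOIN DimProduct p ON f.ProductKey = p.ProductKey
GROUP BY d.CalendarYear, p.Category
ORDER BY d.CalendarYear, TotalSales DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for row in conn.cursor().execute(QUERY):
        print(row.CalendarYear, row.Category, row.TotalSales)
```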
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs.
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills
- SSAS – Tabular & Multidimensional
- SQL Server (advanced SQL, views, joins, indexes)
- DAX & MDX
- Data modeling & OLAP concepts

Secondary Skills
- ETL tools (SSIS or equivalent)
- Power BI or similar BI/reporting tools
- Performance tuning & troubleshooting in SSAS and SQL
- Version control (TFS/Git), deployment best practices
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Responsibilities
- Design, develop, and maintain scalable microservices using Spring Boot and Java (8/11/17).
- Build robust, reusable modules utilizing Java Streams, lambda expressions, and core Java features.
- Develop efficient code for data processing, including character frequency analysis, sorting, and reverse operations (illustrated in the sketch after this posting).
- Implement logic-heavy programs using advanced Java constructs and exception handling mechanisms.
- Work extensively with the Java Collections Framework, including HashMap, TreeMap, HashSet, TreeSet, and custom implementations of Comparator and Comparable.
- Handle file operations for reading/writing files and manage data through serialization/deserialization.
- Integrate thread synchronization and scheduling for bulk data processing.
- Apply core design patterns (Singleton, Factory, Observer, etc.) to ensure clean and maintainable code.
- Build and secure RESTful APIs using Spring Boot annotations and Spring Security.
- Implement Spring Batch jobs for scheduled/batch processing tasks.
- Design and develop microservices using Spring Cloud (Eureka, Feign, Config Server, API Gateway, etc.).
- Follow microservices design patterns like Circuit Breaker, Service Discovery, and API Gateway.
- Write and optimize complex SQL queries, including subqueries, GROUP BY, HAVING, joins, and window functions.
- Work with relational databases, ensuring referential integrity, constraints, and performance optimization.
- Collaborate with cross-functional teams and contribute to architectural decisions.
- Demonstrate awareness of integrating GenAI solutions, AI-enhanced workflows, and prompt engineering practices (preferred but not mandatory).

Eligibility Criteria
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 5–8 years of hands-on development experience with Core Java, Spring Boot, and microservices.
- Solid programming foundation with the ability to solve logic-based problems efficiently.
- Proficiency in: Java Streams and lambda expressions; the Collections Framework and data structures; file handling, serialization, and multithreading; exception handling and debugging.
- Expertise in: Spring Boot (REST APIs, Security, annotations, Batch); microservices (Spring Cloud, design principles and patterns); SQL (complex joins, subqueries, aggregations).
- Exposure to design patterns, GenAI concepts (bonus), and prompt engineering is a plus.
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration skills.

Nice to Have
- Experience with GenAI tools like ChatGPT, GitHub Copilot, and Gemini.
- Knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
- Familiarity with CI/CD pipelines, logging, and monitoring tools.
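Since the role is Java-centric, a real solution would use Java Streams and the Collections Framework; purely as a compact illustration of the character-frequency, sorting, and reversal exercises named above, here is a Python sketch.

```python
"""Illustration only (Python for brevity; the role itself is Java-centric)."""
from collections import Counter

def char_frequencies(text: str) -> list[tuple[str, int]]:
    """Count characters and sort by descending frequency, then alphabetically."""
    counts = Counter(text.replace(" ", ""))
    return sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))

def reverse_words(sentence: str) -> str:
    """Reverse word order while leaving each word intact."""
    return " ".join(reversed(sentence.split()))

print(char_frequencies("spring boot"))          # [('o', 2), ('b', 1), ...]
print(reverse_words("design patterns matter"))  # "matter patterns design"
```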
Posted 3 weeks ago
0.0 - 3.0 years
0 Lacs
Chandigarh, Chandigarh
On-site
Job Title: MERN Stack Developer (3+ Years Experience)
Location: Mohali
Job Type: Full-time

About Us
At YRJ Technology, we are a leading provider of innovative web solutions, offering custom web and mobile app development services. Our team is focused on delivering high-quality, enterprise-level software solutions that streamline business processes and enhance digital user experiences. We're looking to expand our team with a passionate and skilled MERN Stack Developer to contribute to the development of our exciting projects.

Key Responsibilities
· Develop and maintain scalable and high-performance web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js).
· Design and implement responsive user interfaces with React.js to ensure a seamless user experience.
· Work with Node.js (NestJS) to build server-side applications and APIs, integrating with databases, front-end systems, and third-party services.
· Collaborate with cross-functional teams (designers, QA engineers, project managers) to deliver high-quality features within project deadlines.
· Write clean, maintainable, and efficient code while adhering to best practices and coding standards.
· Troubleshoot, debug, and resolve technical issues, improving the overall performance of the application.
· Participate in code reviews to maintain code quality and team collaboration.
· Stay up to date with the latest industry trends and technologies, and implement new features and tools as needed.

Requirements
· 3+ years of hands-on experience as a MERN Stack Developer or in similar full-stack web development roles.
· Proficient in React.js, including hooks, the Context API, and component-based architecture.
· Strong understanding of Node.js and Express.js for building RESTful APIs and back-end services.
· Experience working with MongoDB, including data modeling, aggregations, and optimization techniques (see the aggregation sketch after this posting).
· Familiarity with front-end tools and workflows (Webpack, Babel, etc.) and version control systems (Git).
· Understanding of responsive design principles and mobile-first web development.
· Strong problem-solving skills and the ability to work in a collaborative, fast-paced environment.
· Good understanding of agile development methodologies and the ability to work effectively within a team.

Preferred Skills
· Experience with TypeScript is a plus.
· Familiarity with cloud platforms (AWS, Azure, etc.) and containerization (Docker).
· Knowledge of front-end state management libraries like Redux.
· Experience with testing frameworks (Jest, Mocha, etc.) for both front-end and back-end development.
· Understanding of CI/CD pipelines and automated deployment processes.

Why YRJ Technology?
· Work on exciting and impactful projects across industries.
· Be part of a dynamic and collaborative team that values innovation and creativity.
· Opportunities for professional growth, skill development, and career advancement.
· Competitive salary and benefits package.

Job Type: Full-time
Pay: ₹30,000.00 - ₹104,033.00 per month
Schedule: Rotational shift
Ability to commute/relocate: Chandigarh, Chandigarh: Reliably commute or planning to relocate before starting work (Required)
Experience: Web development: 3 years (Required)
Location: Chandigarh, Chandigarh (Required)
Work Location: In person
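A minimal sketch of the MongoDB aggregation pipeline work the requirements mention. Shown via pymongo for illustration; in a MERN service the same pipeline would run through the Node.js driver or Mongoose. Collection and field names are hypothetical.

```python
"""Hedged sketch: MongoDB aggregation over a hypothetical orders collection."""
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Total and average order value per customer, highest spenders first.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {
        "_id": "$customerId",
        "totalSpent": {"$sum": "$amount"},
        "avgOrder": {"$avg": "$amount"},
    }},
    {"$sort": {"totalSpent": -1}},
    {"$limit": 10},
]
for doc in orders.aggregate(pipeline):
    print(doc)
```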
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description
Codenia Technologies LLP specializes in delivering innovative and tailored solutions across web development, mobile applications, and AI-powered tools to help businesses thrive in the digital era. The company focuses on crafting cutting-edge websites, developing intuitive mobile apps, and harnessing AI for smarter, data-driven solutions. Codenia stands out for its quality, creativity, and efficiency in exceeding expectations and fostering long-lasting partnerships.

Role Description
This is a full-time on-site role located in Gurgaon for a Data Analyst at Codenia Technologies LLP. Expertise is needed in Power BI, ETL, and Databricks.
- Self-Management – You need to possess the drive and ability to deliver on projects without constant supervision.
- Technical – This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
- Logic – You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
- Language – The customer has a global footprint, with offices and clients around the globe. The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
- Communication – Your daily job will regularly require communication with customer team members. The ability to clearly communicate, on a technical level, is essential to your job. This includes both verbal and written communication.

Essential Skills and Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
- Certifications (preferred): Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer Professional; Microsoft Certified: Power BI Data Analyst Associate.
- 4+ years of experience in analytics, data integration, and reporting.
- 4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; expertise in cluster management, auto-scaling, and cost optimization.
- 4+ years of expertise with Power BI, including: advanced DAX for building measures and calculated fields; proficiency in Power Query for data transformation; deep understanding of Power BI architecture, workspaces, and row-level security.
- Strong knowledge of SQL for querying, aggregations, and optimization.
- Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
- Proficiency in Azure cloud platforms and their application to analytics solutions.
- Strong analytical thinking with the ability to translate data into actionable insights.
- Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
- Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Description
We need a Senior Python and PySpark Developer to work for a leading investment bank client.

Responsibilities
• Develop software applications based on business requirements.
• Maintain software applications and make enhancements according to project specifications.
• Participate in requirement analysis, design, development, testing, and implementation activities.
• Propose new techniques and technologies for software development.
• Perform unit testing and user acceptance testing to evaluate application functionality.
• Complete assigned development tasks within deadlines.
• Work in compliance with coding standards and best practices.
• Provide assistance to junior developers when needed.
• Perform code reviews and recommend improvements.
• Review business requirements and recommend changes to develop reliable applications.
• Develop coding documentation and other technical specifications for assigned projects.
• Act as the primary contact for development queries and concerns.
• Analyze and resolve development issues accurately.

Mandatory Skills
• 8+ years of experience in data-intensive PySpark development.
• Experience as a core Python developer.
• Experience developing classes, OOP, exception handling, and parallel processing.
• Strong knowledge of DB connectivity, data loading, transformation, and calculation.
• Extensive experience with pandas/NumPy dataframes: slicing, data wrangling, and aggregations (see the sketch after this posting).
• Lambda functions and decorators.
• Vector operations on pandas dataframes/series.
• Application of applymap, apply, and map functions.
• Concurrency and error handling for data pipeline batches of 1–10 GB.
• Ability to understand business requirements and translate them into technical requirements.
• Ability to design the architecture of a data pipeline for concurrent data processing.
• Familiarity with creating/designing RESTful services and APIs.
• Familiarity with application unit tests.
• Working with Git source control.
• Service-oriented architecture, including the ability to consider integrations with other applications and services.
• Debugging applications.

Nice-to-Have Skills
• Knowledge of web backend technology: Django, Python, PostgreSQL.
• Apache Airflow.
• Atlassian Jira.
• Understanding of financial markets asset classes (FX, FI, Equities, Rates, Commodities & Credit), various trade types (OTC, exchange-traded, Spot, Forward, Swap, Options) and related systems is a plus.
• Surveillance domain knowledge, regulations (MAR, MiFID, CAT, Dodd-Frank) and related systems knowledge is certainly a plus.

Languages
English: C2 Proficient
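A minimal sketch of the pandas idioms the skills list names (vectorized column operations, map, groupby aggregation). The data and column names are hypothetical.

```python
"""Hedged sketch of common pandas wrangling idioms on hypothetical trade data."""
import pandas as pd

trades = pd.DataFrame({
    "desk": ["fx", "fx", "rates", "rates"],
    "notional": [1_000_000.0, 250_000.0, 5_000_000.0, 750_000.0],
    "price": [1.08, 1.09, 0.97, 0.99],
})

# Vectorized column arithmetic instead of row-wise loops.
trades["value"] = trades["notional"] * trades["price"]

# map for element-wise transforms on a Series.
trades["desk_label"] = trades["desk"].map(str.upper)

# Aggregation: groupby with multiple reducers.
summary = trades.groupby("desk")["value"].agg(["sum", "mean", "count"])
print(summary)
```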
Posted 3 weeks ago
5 years
0 Lacs
Gurgaon, Haryana, India
On-site
We are looking for an experienced Data Engineer to design, build, and maintain scalable data pipelines for processing clickstream data in Google Cloud Platform (GCP). The ideal candidate will have 5+ years of experience working with GCP, particularly in eCommerce, and possess a deep understanding of BigQuery and data pipeline automation. This role will involve building robust data workflows using Airflow, handling large data volumes, and ensuring smooth integration with Google Analytics BigQuery data exports.

Key Responsibilities
- Pipeline Development: Design, implement, and maintain automated data pipelines using Airflow to process clickstream data in GCP, ensuring efficiency and reliability (see the DAG sketch after this posting).
- BigQuery Expertise: Leverage BigQuery for data storage, querying, and optimizing the performance of large data sets, ensuring fast query performance on 100B+ row tables.
- Data Integration: Work closely with the team to integrate clickstream data from various sources, particularly Google Analytics BigQuery exports, into the data pipeline.
- Automation & Monitoring: Automate data processing workflows and establish robust monitoring processes to ensure seamless data flow and timely delivery of data to end users.
- Data Quality & Optimization: Ensure high-quality data with proper transformations and aggregations. Optimize large data queries to reduce latency and improve processing efficiency.
- Collaboration: Work closely with cross-functional teams (Data Science, Analytics, Product, and Business) to understand their data requirements and deliver solutions.
- Documentation & Best Practices: Document processes, workflows, and pipeline architecture. Promote best practices for data pipeline development and GCP services usage.
- Scalability: Design and implement scalable systems that can handle growing volumes of clickstream data in eCommerce applications.

Skills & Qualifications
- Experience: 5+ years of experience in data engineering with a focus on GCP, particularly in eCommerce or digital analytics environments.
- GCP Expertise: Extensive experience with Google Cloud Platform services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow.
- BigQuery Mastery: Deep understanding of BigQuery for large-scale data processing and optimization, including partitioning, clustering, and query optimization for massive datasets.
- Data Pipelines: Hands-on experience automating, scheduling, and monitoring data pipelines using Airflow.
- Handling Large Data Volumes: Experience working with very large datasets (e.g., 100B+ rows) and optimizing data storage and processing for high performance.
- Clickstream Data: Familiarity with working with clickstream data and integrating it into data pipelines.
- Google Analytics BigQuery Export: Ideally, experience working with Google Analytics BigQuery data exports and integrating analytics data into a centralized data warehouse.
- Programming: Strong proficiency in Python for data processing, pipeline orchestration, and automation.
- SQL Skills: Proficient in writing complex SQL queries for data extraction, transformation, and analysis.
- Problem Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data pipeline and performance issues.
- Collaboration & Communication: Excellent communication skills to work across teams and explain complex technical concepts to non-technical stakeholders.

Preferred Qualifications
- Experience with eCommerce Analytics: Previous experience working with eCommerce data, particularly clickstream and transactional data.
- Monitoring & Alerts: Familiarity with setting up monitoring, logging, and alerts to ensure data pipelines run smoothly and issues are flagged promptly.
- Cloud Security: Knowledge of data security and access control best practices in the cloud environment (IAM, VPC, etc.).
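A minimal sketch of the Airflow-on-BigQuery pattern this role describes: a daily DAG that materializes a clickstream aggregate. It assumes Airflow 2.4+ with the Google provider installed; the project, dataset, and table names are hypothetical.

```python
"""Hedged sketch: daily BigQuery aggregation of clickstream data via Airflow."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

AGG_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_sessions` AS
SELECT DATE(event_timestamp) AS day, COUNT(DISTINCT session_id) AS sessions
FROM `my-project.raw.clickstream`
WHERE DATE(event_timestamp) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY day
"""

with DAG(
    dag_id="clickstream_daily_agg",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_sessions",
        configuration={"query": {"query": AGG_SQL, "useLegacySql": False}},
    )
```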
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Staff – Snowflake

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The Opportunity
We’re looking for a Snowflake developer with foundational skills in SQL, Python, and data profiling. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Develop and manage Snowflake data warehouse solutions using best practices.
- Work closely with business analysts and stakeholders to understand business requirements and translate them into technical solutions.
- Use Python for data profiling, data validation, and exploratory data analysis (EDA); a profiling sketch follows this posting.
- Create and maintain documentation related to data pipelines and architecture.
- Write and optimize complex SQL queries for business insights and data analysis.
- Implement data quality checks, anomaly detection, and validation processes.

Skills and Attributes for Success
- 1–2 years of experience working with Snowflake, SQL, and cloud-based data platforms.
- Hands-on knowledge of Python for data manipulation (pandas, NumPy) and data profiling techniques.
- Foundational understanding of ML concepts and the ability to assist with dataset preparation.
- Ability to comprehend and translate business processes into data requirements.
- Strong problem-solving and analytical skills.
- Nice to have: knowledge of BI tools (e.g., Power BI, Tableau).

To Qualify for the Role, You Must
- Be a graduate or equivalent with 1–2 years of industry experience.
- Have working experience in an Agile-based delivery methodology (preferable).
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Be an excellent communicator (written and verbal, formal and informal).
- Have basic to intermediate experience with schema design, SQL scripting, and data loading in Snowflake.
- Have a strong ability to write and debug complex SQL queries for joins, aggregations, CTEs, and window functions.
- Be proficient with Python libraries like pandas and NumPy and basic data-handling logic, with the ability to write scripts to clean, transform, and validate datasets.
- Be able to understand business processes (e.g., Sales, Supply Chain, Finance) and convert those needs into technical data solutions.
- Have basic experience with BI tools like Power BI and Tableau.

Ideally, you’ll also have client management skills.

What We Look For
We are seeking a detail-oriented and proactive Snowflake Developer with 1–2 years of experience in building scalable data solutions. The ideal candidate will possess hands-on experience with Snowflake, a basic understanding of business processes, and practical experience using Python for data profiling.

What Working at EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are.

You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

About EY
As a global leader in assurance, tax, transaction and consulting services, we’re using the finance products, expertise and systems we’ve developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we’ll make our ambition to be the best employer by 2020 a reality.

If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.

Join us in building a better working world. Apply now.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
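A minimal sketch of the lightweight Python data profiling this role describes, using pandas on an extracted dataset. The file and column names are hypothetical.

```python
"""Hedged sketch: quick profile and validation of a hypothetical extract."""
import pandas as pd

df = pd.read_csv("sales_extract.csv")  # e.g. unloaded from a Snowflake stage

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": df.isna().mean().round(4) * 100,
    "distinct": df.nunique(),
})
print(profile)

# Simple validation rule: flag rows where amount is negative or missing.
bad_rows = df[df["amount"].isna() | (df["amount"] < 0)]
print(f"{len(bad_rows)} rows fail the amount check")
```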
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview
We are seeking an experienced Azure ETL Tester to validate and verify data pipelines, transformation logic, and data quality across our Azure-based data platform. The ideal candidate will have strong ETL testing skills, deep knowledge of SQL, and hands-on experience with Azure Data Factory (ADF), Synapse Analytics, and related services.

Key Responsibilities
- Design and execute test plans for ETL workflows built on the Azure data platform.
- Validate data movement, transformation logic, and data loading across systems (a reconciliation sketch follows this posting).
- Write and execute complex SQL queries for data validation, reconciliation, and defect analysis.
- Test data pipelines built using Azure Data Factory, Azure Synapse, and Databricks.
- Identify, report, and track data quality and integrity issues.
- Collaborate with developers and data engineers to resolve defects and improve data accuracy.
- Document test cases, test data requirements, and test results, and maintain traceability.
- Support performance and regression testing of ETL jobs.

Required Skills
- 4–7 years of experience in ETL testing, data validation, and data warehouse projects.
- Strong SQL skills – ability to write complex joins, aggregations, and data comparison queries.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and/or Azure Databricks.
- Good understanding of ETL processes, source-to-target mappings, and data profiling.
- Experience with defect tracking and test management tools (e.g., JIRA, TestRail, HP ALM).
- Strong analytical and problem-solving abilities.
- Good communication and documentation skills.

Nice to Have
- Experience with Python or PowerShell for automation.
- Familiarity with data quality tools like Great Expectations or Informatica DQ.
- Exposure to CI/CD pipelines for data testing automation.
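A minimal sketch of the automated source-to-target reconciliation an ETL tester might script in Python. Connection setup is elided; the table and column names are hypothetical.

```python
"""Hedged sketch: compare per-key aggregates between source and target systems."""
import pandas as pd

def reconcile(src_conn, tgt_conn, key: str = "order_id") -> pd.DataFrame:
    """Return keys whose row counts or totals disagree between source and target."""
    sql = (
        f"SELECT {key}, COUNT(*) AS row_count, SUM(amount) AS total_amount "
        f"FROM orders GROUP BY {key}"
    )
    src = pd.read_sql(sql, src_conn).set_index(key)
    tgt = pd.read_sql(sql, tgt_conn).set_index(key)
    merged = src.join(tgt, lsuffix="_src", rsuffix="_tgt", how="outer")
    return merged[
        (merged["row_count_src"] != merged["row_count_tgt"])
        | (merged["total_amount_src"].round(2) != merged["total_amount_tgt"].round(2))
    ]
```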
Posted 4 weeks ago
0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Staff – Snowflake

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The Opportunity
We’re looking for a Snowflake developer with foundational skills in SQL, Python, and data profiling. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Develop and manage Snowflake data warehouse solutions using best practices.
- Work closely with business analysts and stakeholders to understand business requirements and translate them into technical solutions.
- Use Python for data profiling, data validation, and exploratory data analysis (EDA).
- Create and maintain documentation related to data pipelines and architecture.
- Write and optimize complex SQL queries for business insights and data analysis.
- Implement data quality checks, anomaly detection, and validation processes.

Skills and Attributes for Success
- 1–2 years of experience working with Snowflake, SQL, and cloud-based data platforms.
- Hands-on knowledge of Python for data manipulation (pandas, NumPy) and data profiling techniques.
- Foundational understanding of ML concepts and the ability to assist with dataset preparation.
- Ability to comprehend and translate business processes into data requirements.
- Strong problem-solving and analytical skills.
- Nice to have: knowledge of BI tools (e.g., Power BI, Tableau).

To Qualify for the Role, You Must
- Be a graduate or equivalent with 1–2 years of industry experience.
- Have working experience in an Agile-based delivery methodology (preferable).
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Be an excellent communicator (written and verbal, formal and informal).
- Have basic to intermediate experience with schema design, SQL scripting, and data loading in Snowflake.
- Have a strong ability to write and debug complex SQL queries for joins, aggregations, CTEs, and window functions.
- Be proficient with Python libraries like pandas and NumPy and basic data-handling logic, with the ability to write scripts to clean, transform, and validate datasets.
- Be able to understand business processes (e.g., Sales, Supply Chain, Finance) and convert those needs into technical data solutions.
- Have basic experience with BI tools like Power BI and Tableau.

Ideally, you’ll also have client management skills.

What We Look For
We are seeking a detail-oriented and proactive Snowflake Developer with 1–2 years of experience in building scalable data solutions. The ideal candidate will possess hands-on experience with Snowflake, a basic understanding of business processes, and practical experience using Python for data profiling.

What Working at EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are.

You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

About EY
As a global leader in assurance, tax, transaction and consulting services, we’re using the finance products, expertise and systems we’ve developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we’ll make our ambition to be the best employer by 2020 a reality.

If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.

Join us in building a better working world. Apply now.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You’ll Do
There is no better time to join Eaton than in this exciting era of power management. We're reimagining innovation by adapting digital technologies — connected devices, data models, and insights — to transform power management for safer, more sustainable, and more efficient power use. Our teams are collaborating to build the best digital solutions for our customers.

Role
We are looking for a Data Engineer based in Pune, India. At Eaton, making our work exciting, engaging, and meaningful; ensuring safety, health, and wellness; and being a model of inclusion & diversity are already embedded in who we are — it's in our values, part of our vision, and our clearly defined aspirational goals.

Eaton Corporation's Center for Intelligent Power has an opening for a Data Engineer. As a Data Engineer, you will be responsible for designing, developing, and maintaining our data infrastructure and systems. You will collaborate with cross-functional teams to understand data requirements, implement data pipelines, and ensure the availability, reliability, and scalability of our data solutions. You can program in several languages and understand the end-to-end software development cycle, including CI/CD and software release. You will also be responsible for developing and maintaining Power BI reports and dashboards, ensuring they meet business requirements and provide actionable insights. You will work closely with stakeholders to gather requirements and deliver high-quality data visualizations that drive decision-making.

Responsibilities
- Design, develop, and maintain scalable data pipelines and data integration processes to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake (a PySpark sketch follows this posting).
- Collaborate with stakeholders to understand data requirements and translate them into efficient and scalable data engineering solutions.
- Optimize data models, database schemas, and data processing algorithms to ensure efficient and high-performance data storage and retrieval.
- Implement and maintain data quality and data governance processes, including data cleansing, validation, and metadata management.
- Work closely with data scientists, analysts, and business intelligence teams to support their data needs and enable data-driven decision-making.
- Develop and implement data security and privacy measures to ensure compliance with regulations and industry best practices.
- Monitor and troubleshoot data pipelines, identifying and resolving performance or data quality issues in a timely manner.
- Stay up to date with emerging technologies and trends in the data engineering field, evaluating and recommending new tools and frameworks to enhance data processing and analytics capabilities.
- Build insights using BI tools, in particular Power BI.
- Collaborate with infrastructure and operations teams to ensure the availability, reliability, and scalability of data systems and infrastructure.
- Mentor and provide technical guidance to junior data engineers, promoting best practices and knowledge sharing.

Qualifications
Required:
- Bachelor's degree from an accredited institution, in Computer Science, Software Engineering, or Information Technology.
- 3+ years of experience in data engineering and Power BI.
- 3+ years of experience in data analytics.

Skills
- Apache Spark, Python.
- Azure experience (Databricks, Docker, Function App).
- Git.
- Working knowledge of Airflow.
- Knowledge of Kubernetes and Docker.
- Power BI:
  - Data visualization: proficient in creating interactive and visually appealing dashboards and reports using Power BI.
  - Data modeling: experience in designing and implementing data models, including relationships, hierarchies, and calculated columns/measures.
  - DAX (Data Analysis Expressions): strong knowledge of DAX for creating complex calculations and aggregations.
  - Power Query: skilled in using Power Query for data transformation and preparation.
  - Integration: ability to integrate Power BI with various data sources such as SQL databases, Excel, and cloud services.
  - Performance optimization: experience in optimizing Power BI reports for performance and scalability.
  - Security: knowledge of implementing row-level security and managing user access within Power BI.
  - Collaboration: experience in sharing and collaborating on Power BI reports and dashboards within an organization.
  - Best practices: familiarity with Power BI best practices and staying updated with the latest features and updates.
- Background in SQL and experience working with relational databases.
- Experience on cloud development platforms (Azure and AWS) and their associated data storage options.
- Experience with CI/CD (continuous integration/delivery) tools, i.e. Jenkins, Git, Travis CI.
- Virtual build environments (containers, VMs and microservices) and container orchestration: Docker Swarm, Kubernetes/Red Hat OpenShift.
- Relational and non-relational database systems: SQL, PostgreSQL, NoSQL, MongoDB, Cosmos DB.
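A minimal sketch of the kind of batch ETL step this role describes in PySpark: ingest raw CSV, clean it, and aggregate into a reporting table the BI layer can consume. Paths and column names are hypothetical.

```python
"""Hedged sketch: CSV ingest -> cleanse -> aggregate -> partitioned parquet."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Basic cleansing: typing, null filtering, deduplication.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .dropna(subset=["order_id", "amount"])
       .dropDuplicates(["order_id"])
)

daily = clean.groupBy("order_date", "region").agg(
    F.sum("amount").alias("revenue"),
    F.countDistinct("customer_id").alias("customers"),
)

# Write a partitioned table the BI layer (e.g. Power BI) can query directly.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_revenue/")
```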
Posted 4 weeks ago
0 years
0 Lacs
India
On-site
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.

Workday and Adaptive Reporting Consultant with experience in Snowflake and BI Reporting

Veradigm is seeking a results-driven Workday & Adaptive Planning Reporting Consultant with over 5 years of experience in enterprise reporting and analytics. The ideal candidate will have advanced skills in Workday Report Writer and Adaptive Planning (formerly Adaptive Insights), and strong familiarity with Snowflake and modern business intelligence (BI) tools such as Tableau, Power BI, or Looker. This role is integral in enabling data-driven decisions through the delivery of scalable, insightful, and reliable reporting solutions across HR, Finance, and operational teams.

Key Responsibilities
- Develop, enhance, and maintain custom and advanced reports in Workday (HCM, Financials) and Adaptive Planning to support FP&A, HR, and executive dashboards.
- Design multi-dimensional models and complex calculated fields to deliver dynamic, contextual, and role-based reports in Workday and Adaptive Planning.
- Integrate financial and operational data from Workday and Snowflake to build robust data models and reporting pipelines (a query sketch follows this posting).
- Build and automate reporting solutions using BI platforms (e.g., Tableau, Power BI, Looker) for cross-platform analysis and executive visualization.
- Partner with Finance, HR, and IT to gather reporting requirements and translate them into effective data visualizations and reports.
- Perform data validation, reconciliation, and quality assurance across systems, including Workday, Adaptive, and Snowflake.
- Implement security and governance best practices to ensure compliant access to and distribution of reports and data assets.
- Collaborate with data engineering teams on data pipeline optimization and Snowflake query performance tuning for reporting workloads.
- Support Workday releases and Adaptive upgrades by assessing impacts on existing reports and dashboards.
- Document data models, business logic, report definitions, and metadata for governance and audit readiness.

Skills and Experience
- 5+ years of experience in enterprise reporting, including: 3+ years with Workday reporting (custom/advanced/composite reports, dashboards, calculated fields); 2+ years with Workday Adaptive Planning (sheets, modeled and cube sheets, reporting, OfficeConnect); 2+ years working with Snowflake (writing SQL queries, working with views, managing datasets for reporting).
- Proficiency in BI tools (e.g., Tableau, Power BI, Looker) for designing and publishing interactive dashboards.
- Strong understanding of data modeling, KPIs, financial planning, and reporting metrics across HR and Finance domains.
- Excellent SQL skills and the ability to perform complex joins, unions, and aggregations across large datasets.
- Strong analytical thinking, attention to detail, and the ability to translate business needs into technical solutions.
- Workday certifications in Reporting or Adaptive Planning are preferred.
- Familiarity with data lake architectures, ETL/ELT pipelines, and Snowflake data-sharing concepts.
- Knowledge of SaaS data integrations, APIs, and middleware tools (e.g., Boomi, MuleSoft) for cross-platform data synchronization.

Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce.

Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!
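A minimal sketch of pulling a finance reporting dataset out of Snowflake for downstream BI, as the role describes. Credentials, warehouse, and object names are hypothetical.

```python
"""Hedged sketch: Snowflake join/aggregation for a hypothetical GL reporting mart."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="reporting_svc", password="***",
    warehouse="REPORTING_WH", database="FIN", schema="MARTS",
)

SQL = """
SELECT d.fiscal_period, c.cost_center, SUM(f.actual_amount) AS actuals
FROM fct_gl f
JOIN dim_date d ON f.date_key = d.date_key
JOIN dim_cost_center c ON f.cc_key = c.cc_key
GROUP BY d.fiscal_period, c.cost_center
ORDER BY d.fiscal_period
"""

cur = conn.cursor()
try:
    for period, cost_center, actuals in cur.execute(SQL):
        print(period, cost_center, actuals)
finally:
    cur.close()
    conn.close()
```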
Posted 4 weeks ago
0 years
0 Lacs
India
On-site
Overview
Build the Future
At McGraw Hill we create best-in-class, next-generation learning platforms that are used by millions of students and educators worldwide, from kindergarten through graduate school. Our goal is to accelerate student success through intuitive and effective learning tools and content that maximize a teacher's time and a student's learning experience. Our engineering teams drive progress and help build the future of learning. If you have the passion and technical expertise to thrive in an innovative and agile environment, we want to learn more about you.

What is this role about?
McGraw-Hill Education, the leading provider of digital and print educational materials, is looking for a Senior Data Engineer for our Data Analytics Group. The Senior Data Engineer in Data and Analytics is responsible for enhancing McGraw-Hill Education's (MHE) business intelligence and data services capabilities, ensuring the delivery of actionable and timely insights to support financial, product, customer, user, and third-party data. This role also involves managing and monitoring the performance of the data platform, ensuring efficiency and reliability with hands-on data engineering, and designing and architecting dynamic reporting, analytics, and modeling solutions to drive success in the education domain. The ideal candidate will have a strong data engineering background, with expertise in Oracle Cloud Infrastructure (OCI) with Exadata, Informatica Intelligent Cloud Services (IICS) and/or Databricks, and AWS, along with advanced proficiency in SQL. Additionally, this role requires close collaboration with stakeholders to ensure the successful delivery of projects.

What you will be doing:
- Prior hands-on experience developing and delivering data solutions with AWS and/or Oracle technologies.
- Strong knowledge working with data from financial and operational systems, such as Oracle ERP Sales and Oracle DB, and data modeling architecture with slowly changing dimensions (SCD); a sketch of SCD Type 2 logic follows this posting.
- Experience running a cloud platform with an optimized solution architecture and the ability to meet the daily runbook SLA.
- Strong experience with version control software like Git and project management software like Jira with Agile/Kanban.
- Strong experience with data modeling concepts and modern data architecture, including cloud technologies.
- Ability to translate business requirements into technical requirements and deliverables.
- Design and develop parallel-processing ETL solutions for optimal resource usage and faster processing.
- Understand ETL specification documents for mapping requirements and create mappings using transformations such as Aggregator, Lookup, Router, Joiner, Union, Sorter, Normalizer and Update Strategy.
- Create UNIX shell scripts as Informatica workflow wrappers and perform housekeeping activities like cleanup and file archiving.
- Experience in technical specification design: proven experience in designing and building integrations supporting standard data modeling objects (facts, dimensions, aggregations, star schema, etc.).
- Ability to provide end-to-end technical guidance on the software development life cycle (requirements through implementation).
- Ability to create high-quality solution design documentation for end-to-end solutions.

What you need to be considered:
- Expertise in data warehousing and modern data lake concepts.
- 5+ years of experience in data engineering using tools such as: Informatica/IICS, Oracle DB and Oracle packages; AWS services; data platforms like Athena (with Iceberg), Lambda, EMR, Glue, and Databricks; scripting languages like Python, Scala, Java or Node.
- 1+ years of experience in Unix shell scripting.
- 3+ years of experience working with clouds like OCI, AWS, and Azure on data technologies.

Preferred:
- Experience in the publication and education domain.
- Prior experience or familiarity with Tableau/Alteryx.
- Experience working with financial data such as sales, revenue, COGS and manufacturing.
- Experience with IBM Planning Analytics (TM1).

Why work for us?
At McGraw Hill, you will be empowered to make a real impact on a global scale. Every day your individual efforts contribute to the lives of millions. There has never been a better time to join McGraw Hill. In our culture of curiosity and innovation, you will be able to own your growth and develop as we do. The work you do at McGraw Hill will be work that matters. We are collectively designing content that will build the future of education. Play your part and experience a sense of fulfilment that will inspire you to even greater heights. If you are curious, open to new ideas and ready to make a difference, we want to talk to you. We have a collective passion for the work we do and a curiosity to find new solutions. If you share our determination, together we will drive learning forward.

McGraw Hill recruiters always use a "@mheducation.com" email address and/or our applicant tracking system, iCIMS. Any variation of this email domain should be considered suspicious. Additionally, McGraw Hill recruiters and authorized representatives will never request sensitive information in email.

48831
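For illustration, a minimal sketch of the Type 2 slowly changing dimension (SCD2) logic the posting calls out, written in pandas for readability. In practice this would typically be an Informatica mapping or a SQL MERGE; the column names are hypothetical.

```python
"""Hedged sketch: SCD Type 2 - expire changed rows, insert new versions."""
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")

def apply_scd2(dim: pd.DataFrame, incoming: pd.DataFrame, today: pd.Timestamp) -> pd.DataFrame:
    """Expire changed rows (mutates dim's end_date) and append new versions."""
    current = dim[dim["end_date"] == HIGH_DATE]
    merged = current.merge(incoming, on="customer_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["segment"] != merged["segment_new"], "customer_id"]

    # Close out the currently-open version of changed rows.
    dim.loc[dim["customer_id"].isin(changed_ids) & (dim["end_date"] == HIGH_DATE),
            "end_date"] = today

    # Insert new versions for changed rows and brand-new customers.
    new_ids = set(changed_ids) | (set(incoming["customer_id"]) - set(dim["customer_id"]))
    inserts = incoming[incoming["customer_id"].isin(new_ids)].assign(
        start_date=today, end_date=HIGH_DATE
    )
    return pd.concat([dim, inserts], ignore_index=True)
```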
Posted 4 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Location: 100% onsite, Gurgaon.

Job Description
We are seeking an Analytics Developer with deep expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. The ideal candidate will focus on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports that drive strategic decision-making. This role involves close collaboration with both technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives.

Experience
- 8+ years of experience in analytics, data integration, and reporting.
- 4+ years of hands-on experience with Databricks, including proficiency in Databricks Notebooks for development and testing.

Key Responsibilities
- Leverage Databricks to develop and optimize scalable data pipelines for real-time and batch data processing.
- Design and implement Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models.
- Manage and optimize Databricks clusters for performance, cost efficiency, and scalability.
- Use Databricks SQL for advanced query development, data aggregation, and transformation.
- Incorporate Python and/or Scala within Databricks workflows to automate and enhance data engineering processes.
- Develop solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration.
- Create interactive and visually compelling Power BI dashboards and reports to enable self-service analytics.
- Leverage DAX (Data Analysis Expressions) for building calculated columns, measures, and complex aggregations.
- Design effective data models in Power BI using star-schema and snowflake-schema principles for optimal performance.
- Configure and manage Power BI workspaces, gateways, and permissions for secure data access.
- Implement row-level security (RLS) and data-masking strategies in Power BI to ensure compliance with governance policies.
- Build real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources.
- Provide end-user training and support for Power BI adoption across the organization.
- Develop and maintain ETL/ELT workflows, ensuring high data quality and reliability.
- Implement data governance frameworks to maintain data lineage, security, and compliance with organizational policies.
- Optimize data flow across multiple environments, including data lakes, warehouses, and real-time processing systems.
- Collaborate with data governance teams to enforce standards for metadata management and audit trails.
- Work closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems.
- Troubleshoot and resolve technical challenges related to data integration, analytics performance, and reporting accuracy.
- Stay updated on the latest advancements in Databricks, Power BI, and data analytics technologies.
- Drive innovation by integrating AI/ML capabilities into analytics solutions using Databricks.
- Contribute to the enhancement of organizational analytics maturity through scalable and reusable solutions.

Skills
- Self-Management: You need to possess the drive and ability to deliver on projects without constant supervision.
- Technical: This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
- Logic: You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
- Language: The customer has a global footprint, with offices and clients around the globe. The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
- Communication: Your daily job will regularly require communication with customer team members. The ability to clearly communicate, on a technical level, is essential to your job. This includes both verbal and written communication.

Skills and Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
- Certifications (preferred): Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer Professional; Microsoft Certified: Power BI Data Analyst Associate.
- 8+ years of experience in analytics, data integration, and reporting.
- 4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; expertise in cluster management, auto-scaling, and cost optimization.
- 4+ years of expertise with Power BI, including advanced DAX for building measures and calculated fields, proficiency in Power Query for data transformation, and a deep understanding of Power BI architecture, workspaces, and row-level security.
- Strong knowledge of SQL for querying, aggregations, and optimization.
- Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
- Proficiency in Azure cloud platforms and their application to analytics solutions.
- Strong analytical thinking with the ability to translate data into actionable insights.
- Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
- Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
Posted 4 weeks ago
10 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Job Information
Date Opened: 05/14/2025 | Job Type: Full time | Industry: Technology | State/Province: Karnataka | Zip/Postal Code: 560038 | City: Bangalore | Country: India

About Us
At Innover, we endeavor to see our clients become connected, insight-driven businesses. Our integrated Digital Experiences, Data & Insights and Digital Operations studios help clients embrace digital transformation and drive unique outstanding experiences that apply to the entire customer lifecycle. Our connected studios work in tandem to reimagine the convergence of innovation, technology, people, and business agility to deliver impressive returns on investments. We help organizations capitalize on current trends and game-changing technologies, molding them into future-ready enterprises. Take a look at how each of our studios represents deep pockets of expertise and delivers on the promise of data-driven, connected enterprises.

Job Description
Job Summary: We are seeking a highly experienced and driven Cloud Data Architect to architect and lead the development of an enterprise-grade data platform using Microsoft Fabric. This role involves overseeing the full data lifecycle — from ingestion and transformation to analytics, governance, and deployment — by integrating Azure-native technologies such as Data Factory, Synapse Pipelines, Delta Lake, Spark, and Power BI within the Microsoft Fabric ecosystem.

Key Responsibilities:
Design and implement scalable, secure, and modular Lakehouse architectures in OneLake using Delta Lake and Fabric-native services.
Architect medallion-layered data models (Bronze, Silver, Gold) that support batch and real-time analytics.
Support data modeling for analytics, including star/snowflake schemas and semantic models for Power BI.
Design and build robust data pipelines using Azure Data Factory, Synapse Pipelines, and Spark Notebooks (via Synapse or Fabric).
Implement complex ETL/ELT logic, including joins, aggregations, SCDs, window functions, and schema evolution.
Integrate a broad range of data sources — structured, semi-structured, and unstructured — from SQL, REST APIs, Event Hubs, Blob Storage, SharePoint, and more.
Enable real-time streaming and event-driven processing with Azure Event Hubs, Kafka, or IoT Hub.
Establish automated data quality checks, validation rules, and anomaly detection using Azure Data Quality, custom Python scripts, or Data Activator.
Implement robust data lineage tracking, auditing, and error-handling mechanisms across the pipeline lifecycle.
Enforce enterprise data governance using Microsoft Purview, including metadata management, data masking, RBAC, Managed Identities, and Key Vault integration.
Enable low-latency BI through DirectLake mode and semantic models in Power BI.
Develop KQL-based Real-Time Analytics solutions within Fabric for instant insights and monitoring.
Establish data security architecture including RBAC, data masking, encryption, and integration with Key Vault and Entra ID (Azure AD).
Set up CI/CD pipelines using Azure DevOps, GitHub Actions, and infrastructure-as-code tools like ARM, Bicep, or Terraform for deploying data infrastructure and Fabric artifacts.
Promote version control, environment promotion, and release management strategies in Fabric Workspaces.
Collaborate closely with BI developers and application teams to support integrated and analytics-ready data delivery.
Define and maintain enterprise-wide data architecture standards, principles, and best practices.
Mentor junior engineers, lead code reviews, establish engineering best practices, and maintain high standards of documentation and automation.

Qualifications:
10+ years of experience in data engineering, with at least 2+ years in technical lead or architect roles.
Proven expertise in Azure Data Factory, Synapse Analytics, Delta Lake, Azure Data Lake, and Power BI.
Strong proficiency in Python, Spark, and SQL.
Deep experience with data integration (REST APIs, SharePoint, Blob), event-driven pipelines (Event Hubs, Kafka), and streaming ingestion.
Hands-on experience with data quality frameworks, data validation logic, and automated anomaly detection.
Knowledge of Fabric Workspace lifecycle management, including CI/CD deployment strategies.
Strong grasp of data warehousing, data lakehouse patterns, and governance frameworks (Purview, RBAC, data masking).
Familiarity with real-time analytics using KQL and Data Activator within Fabric.
Excellent leadership, communication, and cross-functional collaboration skills.

Good to Have:
Microsoft certifications such as Azure Solutions Architect, Azure Data Engineer Associate, or Microsoft Fabric certifications.
Industry-specific experience in finance, healthcare, manufacturing, or retail.
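As a concrete illustration of the medallion layering named above, here is a minimal PySpark/Delta sketch of a Bronze-to-Silver hop. The paths, columns, and filter rules are hypothetical; a real Fabric deployment would target OneLake locations rather than the placeholder paths used here.

```python
# Minimal Bronze -> Silver medallion hop (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw landed events, stored as-is in Delta.
bronze = spark.read.format("delta").load("/lake/bronze/events")

# Silver: validated, deduplicated, conformed records.
silver = (
    bronze.filter(F.col("event_id").isNotNull())
          .dropDuplicates(["event_id"])
          .withColumn("event_date", F.to_date("event_ts"))
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```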
Posted 1 month ago
0.0 years
0 Lacs
Gurugram, Haryana
On-site
About the Role: Grade Level (for internal use): 10

Position summary
Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do
You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud native environment to:
Build & support data ingestion and processing pipelines in cloud. This will entail extraction, load and transformation of 'big data' from a wide variety of sources, both batch & streaming, using latest data frameworks and technologies.
Partner with product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure build out of Data Dictionaries/Data Catalogue and detailed documentation and knowledge around these data assets, metrics and KPIs.
Warehouse this data; build data marts, data aggregations, metrics, KPIs, business logic that leads to actionable insights into our product efficacy, marketing platform, customer behaviour, retention etc.
Build real-time monitoring dashboards and alerting systems.
Coach and mentor other team members.

Who you are
6+ years of experience in Big Data and Data Engineering.
Strong knowledge of advanced SQL, data warehousing concepts and DataMart designing.
Have strong programming skills in SQL, Python/PySpark etc.
Experience in design and development of data pipeline, ETL/ELT process on-premises/cloud.
Experience in one of the Cloud providers – GCP, Azure, AWS.
Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
Experience with workflow management tools: Airflow, AWS Data Pipeline, Google Cloud Composer etc.
Experience with Distributed Versioning Control environments such as GIT, Azure DevOps.
Building Docker images and fetch/promote and deploy to Production. Integrate Docker container orchestration framework using Kubernetes by creating pods, config maps, deployments using Terraform.
Should be able to convert business queries into technical documentation.
Strong problem solving and communication skills.
Bachelors or an advanced degree in Computer Science or related engineering discipline.

Good to have some exposure to:
Any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
Agile software development methodologies.
Working in multi-functional, multi-location teams.

Grade: 10
Location: Gurugram
Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST

What You'll Love About Us – Do ask us about these!
Total Rewards. Monetary, beneficial and developmental rewards!
Work Life Balance. You can't do a good job if your job is all you do!
Prepare for the Future. Academy – we are all learners; we are all teachers!
Employee Assistance Program. Confidential and Professional Counselling and Consulting.
Diversity & Inclusion. HeForShe!
Internal Mobility. Grow with us!

About automotiveMastermind:
Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales.
Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We’re an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of “Drive” and “Help” have been at the core of what we do, and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What we do: Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 315747 Posted On: 2025-05-13 Location: Gurgaon, Haryana, India
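The posting above asks for experience with workflow-management tools such as Airflow. Below is a minimal Airflow DAG sketch with hypothetical task names and placeholder bodies; it illustrates the orchestration pattern only, not the team's actual pipelines.

```python
# Minimal Airflow DAG sketch (task names and bodies are hypothetical placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull a raw batch from a source system (placeholder).
    print("extracting batch...")


def transform_load():
    # Clean the batch and load it into the warehouse (placeholder).
    print("transforming and loading...")


with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_load", python_callable=transform_load)

    # Run extraction before transform/load.
    extract_task >> load_task
```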
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
Hybrid
Qualifications and Skills
Extensive experience in data modeling techniques and practices, crucial for crafting scalable and optimized data models.
Proficiency in Power BI development and administration to support robust reporting and analytics.
Strong command of DAX (Mandatory skill) for developing intricate calculations and aggregations within Power BI solutions.
Advanced expertise in handling Snowflake (Mandatory skill) for adept cloud-based data warehousing capabilities.
Knowledge of Data Lakes (Mandatory skill) for managing large volumes of structured and unstructured data efficiently.
Solid understanding of ETL tools, necessary for the efficient extraction, transformation, and loading of data.
Exceptional SQL skills to design, query, and manage complex databases for data-driven decisions.
Experience in data warehousing concepts and architectures to support structured and systematic data storage.

Roles and Responsibilities
Architect and implement cutting-edge Power BI solutions that transform business requirements into insightful dashboards and reports.
Collaborate with cross-functional teams to gather and analyze data requirements, ensuring alignment with business objectives.
Design and optimize data models using advanced techniques for improved performance and scalability in Power BI.
Leverage DAX to create complex calculations and custom metrics, enhancing the depth and quality of analytical outputs.
Utilize Snowflake to manage and optimize cloud-based data warehousing solutions for seamless data integration.
Implement and administer data lakes to efficiently handle and store large datasets, both structured and unstructured.
Ensure data accuracy, validity, and security through meticulous data quality checks and validation processes.
Stay updated with the latest industry trends and best practices to continuously improve BI solutions and strategies.
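Since the role pairs Power BI with Snowflake, here is a brief sketch of querying Snowflake from Python. The account, credentials, and table are placeholders; in practice a Power BI model would connect through its native Snowflake connector rather than code like this.

```python
# Minimal Snowflake query sketch (account, credentials, and table are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="analyst",         # hypothetical user
    password="***",         # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Aggregate revenue by region -- the kind of rollup a BI model would consume.
    cur.execute(
        "SELECT region, SUM(amount) AS revenue "
        "FROM orders GROUP BY region ORDER BY revenue DESC"
    )
    for region, revenue in cur.fetchall():
        print(region, revenue)
finally:
    conn.close()
```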
Posted 1 month ago
5 years
0 Lacs
Chennai, Tamil Nadu, India
Hybrid
Skills: Java, Python (Programming Language), Hadoop, Hive, PySpark, ETL, SQL

We are hiring for one of our clients!
Position - Big Data Engineer
Experience - 5+ years
Location - Chennai, Bangalore
Work Mode - Hybrid
Notice Period - Immediate joiners (serving notice period, within 15 days only)

Job Overview
We are seeking a skilled Big Data Developer to join our team in a mid-level position. This full-time, hybrid role is based in Chennai and Bangalore. The ideal candidate will have 4 to 6 years of experience in big data technologies, contributing to developing and optimizing high-quality data pipelines and applications.

Qualifications And Skills
Proficiency in Java (Mandatory skill) is essential for developing robust and scalable data processing applications.
Expertise in Python (Mandatory skill) for efficient data manipulation and automation of data processing tasks.
In-depth knowledge of ETL (Mandatory skill) processes to extract, transform, and load data from various sources into centralized databases.
Experience working with Hadoop ecosystems, leveraging distributed computing to process large datasets efficiently.
Proficiency in utilizing Hive for executing queries and managing large datasets in a Hadoop environment.
Skill in PySpark for real-time data processing and analytics, optimizing data workflows within a big data framework.
Strong command of SQL for database querying, involving complex joins, aggregations, and performance tuning.
Problem-solving and analytical skills to troubleshoot data issues and enhance data architectures.

Roles And Responsibilities
Design and implement scalable data pipelines for batch and stream processing to handle large datasets efficiently.
Collaborate with data scientists and architects to integrate analytics and reporting solutions.
Optimize ETL processes to ensure data accuracy and integrity across various platforms.
Develop and maintain data workflows using Hadoop and related technologies to support data projects.
Enhance data models and data architecture to improve data quality and accessibility.
Identify and resolve data discrepancies and implement enhancements to improve system performance.
Stay updated with the latest big data technologies and incorporate them into current systems when beneficial.

Interested candidates, please share your CV: naveenvj@chiselontechnologies.com
Thanks in advance, Chiselon
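To illustrate the Hive-on-Spark skill set listed above, here is a minimal PySpark sketch that queries a Hive-managed table; the database, table, and column names are hypothetical.

```python
# Minimal PySpark-with-Hive sketch (database/table/column names are hypothetical).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive_rollup")
    .enableHiveSupport()   # lets Spark read tables registered in the Hive metastore
    .getOrCreate()
)

# Query a Hive table with a join and an aggregation.
result = spark.sql("""
    SELECT c.segment, SUM(o.amount) AS revenue
    FROM sales.orders o
    JOIN sales.customers c ON o.customer_id = c.customer_id
    GROUP BY c.segment
""")

result.show()
```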
Posted 1 month ago
5 - 8 years
0 Lacs
Noida, Uttar Pradesh, India
Hybrid
5+ years of experience in SAP ABAP
Location - Noida

Job Description
DDIC - Domain, Data Element, Structures, Tables, Views, Primary & Secondary Indexes, Table Buffering, Search Helps, Open SQL & Performance Optimization of database queries
ABAP Reports - Classical Reports, ALV Reports including the OOPS method, Interactive Reports
Dynpro Screens - Development of Dynpro Screens, Screen Error Handling, Program Interface, GUI Title, GUI Status, Subscreen, Tabstrip Control, Table Controls, Context Menus, Splitter Control
Enhancements - Enhancing DDIC Objects, Customer Exits, Business Transaction Events, BADI, Explicit & Implicit Enhancements
Forms - Smart Forms, PDF-based Print Forms & Adobe Forms
Integration Technologies - ALE/EDI/IDOC Interfaces, JSON/XML Interfaces (optional), REST/SOAP-based interfaces (optional)
Using the Eclipse Development Environment, ADT Tools, Programming Techniques & Debugging in a HANA Environment
HANA DB - Performance Optimization of database queries with HANA DB
Fiori/UI5 Development
CDS - Define CDS Views, ABAP Annotations, SQL Expressions, SQL Functions, Nested Views, Aggregations, Additional Join Types, UNION, CDS View Enhancement, Implicit Authorization Checks
AMDP Views, BRF+ & HANA Proxy Objects, SAP S/4HANA Extensibility Overview, In-App & Side-by-Side Extensibility, S/4HANA Conversion Projects
Oral communication and articulation
Ability to learn | Ability to think differently | Ability to provide solutions with limited details
Experience in support, implementation, upgrade, and migration projects across ECC and S/4HANA landscapes
Posted 1 month ago
0 - 2 years
0 Lacs
Gurgaon, Haryana, India
On-site
About The Role
Grade Level (for internal use): 08

Position Summary
Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do
You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud native environment to:
Build & support data ingestion and processing pipelines in cloud. This will entail extraction, load and transformation of 'big data' from a wide variety of sources, both batch & streaming, using latest data frameworks and technologies.
Partner with product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure build out of Data Dictionaries/Data Catalogue and detailed documentation and knowledge around these data assets, metrics and KPIs.
Warehouse this data; build data marts, data aggregations, metrics, KPIs, business logic that leads to actionable insights into our product efficacy, marketing platform, customer behaviour, retention etc.
Build real-time monitoring dashboards and alerting systems.

Who You Are
1+ years of experience in Big Data and Data Engineering.
Good knowledge of advanced SQL, data warehousing concepts and DataMart designing.
Have good programming skills in SQL, Python/PySpark etc.
Experience in development of data pipeline, ETL/ELT process on-premises/cloud.
Experience in any one of the Cloud providers – GCP, Azure, AWS.
Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
Experience with workflow management tools: Airflow, AWS Data Pipeline, Google Cloud Composer etc.
Experience with Distributed Versioning Control environments such as GIT, Azure DevOps.
Building Docker images and fetch/promote and deploy to Production. Integrate Docker container orchestration framework using Kubernetes by creating pods, config maps, deployments using Terraform.
Should be able to convert business queries into technical documentation.
Strong problem solving and communication skills.
Bachelors or an advanced degree in Computer Science or related engineering discipline.

Good to have some exposure to:
Any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
Agile software development methodologies.
Working in multi-functional, multi-location teams.

What You'll Love About Us – Do ask us about these!
Total Rewards. Monetary, beneficial and developmental rewards!
Work Life Balance. You can't do a good job if your job is all you do!
Prepare for the Future. Academy – we are all learners; we are all teachers!
Employee Assistance Program. Confidential and Professional Counselling and Consulting.
Diversity & Inclusion. HeForShe!
Internal Mobility. Grow with us!

About AutomotiveMastermind
Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales. Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them.
automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We’re an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of “Drive” and “Help” have been at the core of what we do, and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What We Do
Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What’s In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global.
Our Benefits Include
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH203 - Entry Professional (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
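The ingestion work in this posting covers both batch and streaming. Below is a minimal PySpark Structured Streaming sketch; it uses Spark's built-in rate source as a stand-in for a real feed such as Kafka, and the sink choice is a placeholder for illustration only.

```python
# Minimal Structured Streaming sketch; the rate source stands in for a real
# feed (e.g., Kafka), and the console sink is a placeholder for Delta/Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming_ingest").getOrCreate()

# The rate source emits (timestamp, value) rows at a fixed pace -- handy for demos.
stream = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# A toy transformation: bucket events into one-minute windows and count them.
counts = (
    stream.withWatermark("timestamp", "1 minute")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Write results to the console; a real pipeline would target a durable sink.
query = (
    counts.writeStream
          .outputMode("update")
          .format("console")
          .start()
)
query.awaitTermination()
```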
Posted 1 month ago
0 years
0 Lacs
Mumbai, Maharashtra, India
Hybrid
Whether you’re at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you’ll have the opportunity to expand your skills and make a difference at one of the world’s most global banks. We’re fully committed to supporting your growth and development from the start with extensive on-the-job training and exposure to senior leaders, as well as more traditional learning. You’ll also have the chance to give back and make a positive impact where we live and work through volunteerism. We’re currently looking for a high caliber professional to join our team as Senior Vice-President, Risk Reporting Sr. Officer, based in Mumbai, India.

Being part of our team means that we’ll provide you with the resources to meet your unique needs, empower you to make healthy decisions and manage your financial well-being to help plan for your future. For instance:
Citi provides programs and services for your physical and mental well-being including access to telehealth options, health advocates, confidential counseling and more. Coverage varies by country.
We believe all parents deserve time to adjust to parenthood and bond with the newest members of their families. That’s why in early 2020 we began rolling out our expanded Paid Parental Leave Policy to include Citi employees around the world.
We empower our employees to manage their financial well-being and help them plan for the future.
Citi provides access to an array of learning and development resources to help broaden and deepen your skills and knowledge as your career progresses.
We have a variety of programs that help employees balance their work and life, including generous paid time off packages.
We offer our employees resources and tools to volunteer in the communities in which they live and work. In 2019, Citi employee volunteers contributed more than 1 million volunteer hours around the world.

Citi’s Risk Management organization oversees risk-taking activities and assesses risks and issues independently of the front line units. We establish and maintain the enterprise risk management framework that ensures the ability to consistently identify, measure, monitor, control and report material aggregate risks. The USPB Risk and Wealth Risk Chief Administrative Office (CAO) organization provides a global focus for risk management strategy and execution oversight, compliance with Citi Policies and Regulatory requirements, and drives strong risk management for USPB Risk, Wealth Risk, Investment Risk and Legacy Franchises/Banking & International Retail Risk Management.

In this role, you’re expected to:
The Risk Reporting SVP role is responsible for global and independent risk reporting for both the USPB and Wealth Chief Risk Officer (CRO) and other key enterprise-level risk reporting to the Board of Directors and Regulators. This highly visible SVP role is a senior position, responsible for providing timely analytics, measurements, and insights compliant with BCBS 239, applicable regulations, and Citi policies governing risk aggregations and reporting. The role supports department objectives related to Enterprise Data Use Case execution, Risk Digitization, Strategic Data Sourcing, and related Consent Order Transformation Programs.
The SVP is expected to work closely with peers within USPB Risk and Wealth Risk CAO, the USPB and Wealth CRO, 1st & 2nd lines of defense (LOD) senior management, Product Heads and specialized subject matter experts (SMEs) in Enterprise Risk Management (ERM), Counterparty Credit Risk (CCR), Wholesale Credit Risk (WCR), Retail Credit Risk Management (RCR) and related Technology partners throughout Citi.

Define and substantiate scope, identifying dependencies and agreeing with stakeholders for Enterprise Data Use Case (UC) requirements.
Document requirements for improving Retail Credit Risk, Wealth Risk, and Investment Risk data to support timely and effective Risk Management and Oversight.
Lead the strategy, approach and automation of reporting, measurements, and analytics for all risk reports supporting the USPB and Wealth Risk CROs, Regulators, and Risk management.
Work with the various project teams to ensure key milestones are achieved for each phase of the project, including requirement documentation, UAT, production parallel and sustainability.
Regularly and effectively communicate with senior stakeholders, both verbally and in writing, the strategic vision of the target-state risk management strategy, as well as progress of the path-to-strong effort.
Timely, quality, and compliant risk reporting, measurements, and analytics.
Reporting rationalization and redesign to meet, leverage and align with the path-to-strong transformation in progress – including revision of reports to adopt new/changing risk taxonomies and aggregation requirements, new systems of record and authorized data sources, and new reporting/oversight infrastructure and BI/analytics tools, to deliver updated and new risk aggregations that meet changing organizational needs and regulatory/policy requirements.
Work in close partnership with USPB Risk and Wealth Risk CAO peers, Risk Policy/Process owners and stakeholders across first and second line of defense to rationalize, simplify and digitize risk reporting.
Design, coordinate, and prepare executive materials for the USPB and Wealth CRO and management team’s senior presentations to the Board, risk committees, and regulators, including any ad-hoc materials.
Partner with Independent Risk Management and In-Business/Country Risk Management to address new risk monitoring or regulatory requirements.
Lead strategic initiatives to drive common & concurrent use of “gold source reports” by 1st & 2nd line and deliver faster time to insights.
Enhance and streamline reporting processes by adopting best-in-class modern intelligence tools and improving data quality in partnership with stakeholders in risk management, technology, and business teams.

As a successful candidate, you’d ideally have the following skills and exposure:
Excellent communication skills are required to negotiate and interact with senior leadership and partner effectively with other reporting/tech/data leads across the firm.
Strong data analysis skills are also required to ensure seamless and aligned transformation of USPB and Wealth Risk reporting, measurements and analytics to the overall risk and data target state, including full adoption of new reporting and data management tools for compliance with the Global Regulatory and Management Reporting Policy, Citi Data Governance Policy, End-User Computing remediation and BCBS 239 requirements.

10+ years of relevant experience.
Strong understanding of consumer credit risk, wholesale credit risk, and related data.
Track record of delivering complex projects related to data, aggregation, and reporting.
Bachelor’s or Master’s degree in business, finance, economics, computer science or other analytically intensive discipline (preferred).
Ability to synthesize complex data/analytics into succinct and effective presentations. Prior leadership in risk analytics, reporting/BI (Tableau, Python, and SAS) and data preferred.
Ability to multi-task effectively in a dynamic, high-volume, and complex environment with a practical, solutions-driven approach.
Excellent verbal and written communication skills, with a proven track record of engagement with senior leadership teams.
Strong interpersonal skills including influencing, facilitation, and partnering skills; able to leverage relationships and work collaboratively across an organization.
Effective negotiation skills, a proactive and 'no surprises' approach in communicating issues, and strength in sustaining independent views.
Comfortable acting as an agent for positive change with agility and flexibility.

Working at Citi is far more than just a job. A career with us means joining a family of more than 230,000 dedicated people from around the globe. At Citi, you’ll have the opportunity to grow your career, give back to your community and make a real impact. Take the next step in your career, apply for this role at Citi today: https://jobs.citi.com/dei

Job Family Group: Risk Management
Job Family: Risk Reporting
Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting.
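Given the posting's emphasis on risk aggregation and Python-based reporting, here is a minimal pandas sketch of the kind of exposure roll-up such reporting automates. The data, column names, and threshold are entirely hypothetical; governed production reporting would draw from authorized data sources rather than inline records.

```python
# Minimal risk-aggregation sketch (all data, columns, and thresholds hypothetical).
import pandas as pd

# Toy exposure records -- in practice these would come from governed data sources.
exposures = pd.DataFrame({
    "portfolio": ["cards", "cards", "mortgage", "wealth", "wealth"],
    "region":    ["NA",    "EMEA",  "NA",       "APAC",   "NA"],
    "exposure":  [120.0,   80.0,    450.0,      200.0,    310.0],  # in $MM
})

# Aggregate exposures by portfolio and region for a management report.
report = (
    exposures.groupby(["portfolio", "region"], as_index=False)["exposure"]
             .sum()
             .sort_values("exposure", ascending=False)
)

# Flag concentrations above a (hypothetical) review threshold.
report["flag"] = report["exposure"].gt(300.0).map({True: "REVIEW", False: ""})

print(report.to_string(index=False))
```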
Posted 2 months ago