8.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Services Delivery Manager at Snowflake, you will play a crucial role in shaping the future of the AI Data Cloud. Your primary responsibility will be to engage at the account level, providing visibility, service reviews, and reporting throughout all stages of the implementation lifecycle. Working closely with senior stakeholders, you will identify opportunities for service and delivery improvements, ensuring that customer applications are well-designed and scalable to meet their business needs. Your success will be measured by your ability to develop a quality service strategy for each customer, capture key metrics, identify upsell opportunities, and establish yourself as the clients' trusted advisor.

In this role, you will have the following responsibilities:
- Own, manage, and maintain the operational relationship with assigned clients.
- Transition seamlessly from Services sales to project delivery through effective handover processes.
- Demonstrate a keen focus on details and operational rigor.
- Manage multiple parallel projects and contribute to sales cycles.
- Communicate effectively with project team members, management, and stakeholders on project status, issues, risks, and objectives.
- Lead internal and client meetings, ensuring thorough documentation and rigorous follow-up.
- Manage client and project documentation, communications, meetings, and necessary follow-ups.
- Track metrics related to infrastructure performance and service requests.
- Continuously seek opportunities for improvement.
- Prepare and present service reports to clients.
- Develop, implement, and monitor relationship roadmaps.
- Contribute to the overall vision for service delivery and client satisfaction.
- Identify areas for process improvements within both client organizations and Snowflake.
- Collaborate with key stakeholders to implement and document necessary changes.

The ideal candidate for this role will possess:
- 8+ years of experience in a Services Delivery role or similar capacity involving complex, technical implementation projects.
- Strong planning and organizational skills with the ability to oversee multiple projects while maintaining high standards.
- Excellent communication and client-facing skills.
- A strategic mindset focused on operational rigor and execution.
- Expertise in monitoring and enhancing service delivery processes and performance metrics.
- Direct experience in Data Warehousing, Business Intelligence, and/or Cloud technologies.
- Proven ability to communicate effectively across various groups, from design and engineering to marketing, advertising, and business development.
- Bonus points: experience in a client support and advisory technical role, such as Solution Architect, System Administrator, Technical Account Manager, or equivalent.

Joining the Snowflake Professional Services team offers a unique opportunity to work with cutting-edge data warehouse technology, lead transformative initiatives, and collaborate with a dedicated team of professionals. As Snowflake continues to grow rapidly, we are seeking individuals who align with our values, challenge conventional thinking, drive innovation, and contribute to both their own future and the future of Snowflake. If you are ready to make a significant impact, we invite you to explore career opportunities on the Snowflake Careers Site, where detailed salary and benefits information is available for positions within the United States.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a talented and detail-oriented Business Intelligence (BI) Developer with a focus on developing and creating visually appealing dashboards. Your role is crucial in translating complex data sets into user-friendly visualizations to highlight key insights and trends. You will design, develop, and maintain interactive dashboards using Power BI, working with large datasets to extract, clean, and transform data for consumption. Collaborating with business stakeholders, you will understand their data needs and translate them into dashboard requirements. Regular feedback sessions with end-users will ensure that the dashboards meet business needs effectively. Your responsibilities also include optimizing dashboards for performance and usability, updating them regularly to reflect the latest data and business metrics. To qualify for this role, you should have at least 5 years of experience as a BI Developer, focusing on dashboard development. Proficiency in Power BI, strong SQL skills, and experience with database management systems are essential. You should possess excellent data visualization skills, experience with ETL processes and tools, and familiarity with data warehousing concepts and cloud platforms. Knowledge of programming languages like Python or R, understanding of data governance and security best practices, and the ability to translate business requirements into effective dashboards are also required. Strong analytical, problem-solving, communication, and collaboration skills are crucial for success in this role. A Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field is necessary for consideration.,
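For illustration, the sketch below shows one way a dashboard-ready dataset might be prepared with SQL and pandas before it is surfaced in Power BI. It is a minimal example only: the PostgreSQL connection string, the sales table, and its columns are hypothetical placeholders rather than any specific client environment.

```python
# A minimal sketch of preparing a dashboard-ready dataset; the connection string
# and the "sales" table with its columns are illustrative assumptions only.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@host:5432/analytics")  # placeholder DSN

query = """
    SELECT
        date_trunc('month', order_date)  AS order_month,
        region,
        SUM(revenue)                     AS total_revenue,
        COUNT(DISTINCT order_id)         AS order_count
    FROM sales
    WHERE order_date >= CURRENT_DATE - INTERVAL '12 months'
    GROUP BY 1, 2
    ORDER BY 1, 2
"""

# Pull the aggregated result into pandas; Power BI could consume this via a
# database view, a scheduled CSV export, or a direct connection instead.
monthly_kpis = pd.read_sql(query, engine)
monthly_kpis.to_csv("monthly_kpis.csv", index=False)
print(monthly_kpis.head())
```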
Posted 3 days ago
10.0 - 14.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Data Engineering Lead/Architect with over 10 years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements efficiently. Collaborating with cross-functional teams, you will define data architectures, models, and integration strategies to ensure the successful implementation of data pipelines, ETL processes, and data warehousing solutions. Your expertise in Snowflake technologies will be essential in building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices such as cost analysis, resource allocation, and security configurations to support reporting and analytics needs effectively. Utilizing Azure cloud services and Databricks platforms, you will manage and process large datasets efficiently. Your responsibilities will include building, deploying, and maintaining data pipelines on Azure Data Factory, Azure Databricks, and other Azure services. Implementing best practices for data warehousing, ensuring data quality, consistency, and reliability will be a key focus area. You will also create and manage data integration processes, including real-time and batch data movement between systems. Your mastery in SQL and PL/SQL will be vital in writing complex queries to extract, transform, and load data effectively. You will optimize SQL queries and database performance for high-volume data processing to ensure seamless operations. Continuously monitoring and enhancing the performance of data pipelines and storage systems will be part of your responsibilities. You will troubleshoot and resolve data-related issues promptly to minimize downtime and maintain data availability. Documenting data engineering processes, data flows, and architectural decisions will be crucial for effective collaboration with data scientists, analysts, and stakeholders. Additionally, implementing data security measures and adhering to compliance standards like GDPR and HIPAA will be essential to protect sensitive data. In addition to your technical skills, you are expected to showcase leadership abilities by driving data engineering strategies, engaging in sales and proposal activities, developing strong customer relationships, and mentoring other team members. Your experience with cloud-based data solution architectures, client engagement, and leading technical teams will be valuable assets in this role. To qualify for this position, you should hold a bachelor's or master's degree in computer science or a related field. You must have over 10 years of experience in Data Engineering, with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks technologies, along with comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques, is required. Exceptional SQL and PL/SQL skills, experience with performance tuning, and strong problem-solving abilities are essential. Excellent communication skills and relevant certifications in technologies like Snowflake and Azure will be advantageous.,
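As a rough illustration of the Snowflake side of such a role, the sketch below runs a transformation inside Snowflake via the Python connector. It assumes the snowflake-connector-python package; the account, warehouse, database, and table names are placeholders, not a prescribed design.

```python
# A minimal sketch of running a transformation inside Snowflake using the Python
# connector; account, warehouse, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Build (or refresh) a reporting table from a raw staging table; pushing the
    # transformation into Snowflake keeps compute close to the data.
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.MART.DAILY_ORDERS AS
        SELECT order_date, customer_id, SUM(amount) AS total_amount
        FROM ANALYTICS.STAGING.RAW_ORDERS
        GROUP BY order_date, customer_id
    """)
    print(cur.execute("SELECT COUNT(*) FROM ANALYTICS.MART.DAILY_ORDERS").fetchone())
finally:
    conn.close()
```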
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a meticulous and analytical Financial Data Analyst to join our finance team. As a Financial Data Analyst, you will be responsible for managing and analyzing large datasets to provide crucial financial insights and support strategic decision-making. The ideal candidate will excel in handling and analyzing large datasets, utilizing their financial expertise to drive insights and facilitate strategic decisions. Your role will be vital in ensuring data integrity, implementing best practices, and offering actionable recommendations to aid our company in achieving its financial objectives. To qualify for this position, you should possess a Bachelor's degree in Finance, Accounting, Data Science, Statistics, Computer Science, or a related field. An advanced degree or relevant certifications such as CFA or CPA would be advantageous. Additionally, a minimum of 5 years of experience in data management, financial analysis, or a related role is required, with proven expertise in managing large datasets and financial modeling. The role demands proficiency in data management tools like SQL, ETL processes, and data warehousing, along with advanced knowledge of financial software and systems such as ERP and BI tools like Tableau and Power BI. Strong skills in data analysis and statistical methods are essential, as well as excellent problem-solving abilities to interpret complex data and make informed decisions. Effective communication skills, both verbal and written, are crucial for presenting complex information clearly and concisely. Attention to detail is paramount, ensuring a high level of accuracy in data analysis and financial reporting. In this position, your key responsibilities will include managing and analyzing large financial datasets, developing and maintaining financial models, analyzing financial data to identify trends, patterns, and anomalies, and providing actionable insights to stakeholders. You will apply financial acumen to analyze complex datasets, create and maintain dashboards and visualizations, prepare detailed financial reports, forecasts, and budgets, and collaborate with finance and accounting teams to ensure data consistency and alignment with financial goals. Furthermore, you will be responsible for creating and maintaining comprehensive documentation of data processes, analysis methodologies, and financial models, collaborating with cross-functional teams to understand data needs, providing data-driven recommendations to support business strategies, identifying opportunities for process improvements, and automating tasks to enhance data management and analysis efficiency. Join us for exciting projects in industries like High-Tech, communication, media, healthcare, retail, and telecom. Enjoy a collaborative environment where you can expand your skills and maintain a healthy work-life balance with flexible schedules and opportunities for professional development. We offer competitive salaries, various benefits, and fun perks to create a vibrant and rewarding workplace. Come be a part of GlobalLogic, a leader in digital engineering, and help build innovative products and digital experiences for global brands across diverse industries.,
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
chandigarh
On-site
As a Senior Data Engineer, you will play a crucial role in supporting the Global BI team for Isolation Valves as they transition to Microsoft Fabric. Your primary responsibilities will involve data gathering, modeling, integration, and database design to facilitate efficient data management. You will be tasked with developing and optimizing scalable data models to cater to analytical and reporting needs, utilizing Microsoft Fabric and Azure technologies for high-performance data processing. Your duties will include collaborating with cross-functional teams such as data analysts, data scientists, and business collaborators to comprehend their data requirements and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to back Power BI and automation initiatives. Additionally, your expertise in data modeling, particularly in data warehouse and lakehouse design, will be essential in designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Furthermore, you will be responsible for developing ETL processes using tools like SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar platforms to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also fall under your purview. You will supervise and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads. Your role will require a strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms, along with experience in data integration and ETL tools like Azure Data Factory. A deep understanding of Microsoft Fabric or similar data platforms, as well as comprehensive knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions, will be necessary. Effective communication skills to convey technical concepts to both technical and non-technical stakeholders, the ability to work both independently and within a team environment, and the willingness to stay abreast of new technologies and business areas are also vital for success in this role. To excel in this position, you should possess 5-7 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities to tackle complex data challenges, and proficiency in database management, SQL query optimization, and data mapping. A solid grasp of Excel, including formulas, filters, macros, pivots, and related operations, is essential. Proficiency in Python and SQL/Advanced SQL for data transformations/Debugging, along with a willingness to work flexible hours based on project requirements, is also required. Furthermore, hands-on experience with Fabric components such as Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI Integration, and Semantic Models, as well as advanced SQL skills and experience with complex queries, data modeling, and performance tuning, are highly desired. Prior exposure to implementing Medallion Architecture for data processing, experience in a manufacturing environment, and familiarity with Oracle, SAP, or other ERP systems will be advantageous. 
A Bachelor's degree or equivalent experience in a Science-related field, with good interpersonal skills in English (spoken and written) and Agile certification, will set you apart as a strong candidate for this role. At Emerson, we are committed to fostering a workplace where every employee is valued, respected, and empowered to grow. Our culture encourages innovation, collaboration, and diverse perspectives, recognizing that great ideas stem from great teams. We invest in your ongoing career development, offering mentorship, training, and leadership opportunities to ensure your success and make a lasting impact. Employee wellbeing is a priority for us, and we provide competitive benefits plans, medical insurance options, Employee Assistance Program, flexible time off, and other supportive resources to help you thrive. Emerson is a global leader in automation technology and software, dedicated to helping customers in critical industries operate more sustainably and efficiently. Our commitment to our people, communities, and the planet drives us to create positive impacts through innovation, collaboration, and diversity. If you seek an environment where you can contribute to meaningful work, develop your skills, and make a difference, join us at Emerson. Let's go together towards a brighter future.,
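To make the Fabric/PySpark portion of the role concrete, here is a minimal bronze-to-silver sketch in the spirit of a Medallion layout. It assumes a Spark runtime with Delta Lake support (for example a Fabric or Databricks notebook); the table paths and column names are purely illustrative.

```python
# A minimal bronze-to-silver sketch assuming a Delta-enabled Spark runtime;
# paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read raw landed data (bronze), then clean it and standardise types.
bronze = spark.read.format("delta").load("Tables/bronze_valve_orders")  # placeholder path

silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Write the curated (silver) table, partitioned for downstream Power BI models.
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save("Tables/silver_valve_orders"))  # placeholder path
```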
Posted 3 days ago
13.0 - 17.0 years
0 Lacs
maharashtra
On-site
Birlasoft is a powerhouse that brings together domain expertise, enterprise solutions, and digital technologies to redefine business processes. With a consultative and design thinking approach, we drive societal progress by enabling our customers to run businesses with efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, we have a team of 12,500+ professionals dedicated to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating our commitment to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

As an Azure Tech PM at Birlasoft, you will be responsible for leading and delivering complex data analytics projects. With 13-15 years of experience, you will play a critical role in overseeing the planning, execution, and successful delivery of data analytics initiatives, while managing a team of 15+ skilled resources. You should have exceptional communication skills, a deep understanding of Agile methodologies, and a strong background in managing cross-functional teams in data analytics projects.

Key Responsibilities:
- Lead end-to-end planning, coordination, and execution of data analytics projects, ensuring adherence to project scope, timelines, and quality standards.
- Guide the team in defining project requirements, objectives, and success criteria using your extensive experience in data analytics.
- Apply Agile methodologies to create and maintain detailed project plans, sprint schedules, and resource allocation for efficient project delivery.
- Manage a team of 15+ technical resources, fostering collaboration and a culture of continuous improvement.
- Collaborate closely with cross-functional stakeholders to align project goals with business objectives.
- Monitor project progress, identify risks, issues, and bottlenecks, and implement mitigation strategies.
- Provide regular project updates to executive leadership, stakeholders, and project teams using excellent communication skills.
- Facilitate daily stand-ups, sprint planning, backlog grooming, and retrospective meetings to promote transparency and efficiency.
- Drive the implementation of best practices for data analytics, ensuring data quality, accuracy, and compliance with industry standards.
- Act as a point of escalation for project-related challenges and work with the team to resolve issues promptly.
- Collaborate with cross-functional teams to ensure successful project delivery, including testing, deployment, and documentation.
- Provide input to project estimation, resource planning, and risk management activities.

Mandatory Experience:
- Technical Project Manager experience of a minimum of 5+ years in Data Lake and Data Warehousing (DW).
- Strong understanding of DW process execution, from acquiring data through to visualization.
- Exposure to Azure skills such as Azure ADF, Azure Databricks, Synapse, SQL, and Power BI for a minimum of 3+ years, or experience in managing at least 2 end-to-end Azure Cloud projects.

Other Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13-15 years of progressive experience in technical project management focusing on data analytics and data-driven initiatives.
- In-depth knowledge of data analytics concepts, tools, and technologies.
- Exceptional leadership, team management, interpersonal, and communication skills.
- Demonstrated success in delivering data analytics projects on time, within scope, and meeting quality expectations.
- Strong problem-solving skills and a proactive attitude towards identifying challenges.
- Project management certifications such as PMP, PMI-ACP, or CSM would be an added advantage.
- Ability to thrive in a dynamic and fast-paced environment, managing multiple projects simultaneously.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
About Us: JP Morgan is a dynamic and innovative company dedicated to providing top-tier solutions to our clients. We pride ourselves on our collaborative culture, cutting-edge technology, and commitment to excellence. We are currently seeking a highly skilled and motivated Solutions Analyst to join our team. As a Solutions Analyst, you will play a pivotal role in designing, developing, and implementing data-driven solutions that meet the needs of our clients. You will leverage your expertise in dashboard development and SQL to create insightful and actionable business intelligence tools. This role requires a strategic thinker with strong analytical skills and the ability to lead projects from conception to completion.

Key Responsibilities:
- Lead the design, development, and deployment of dashboards and reports to provide actionable insights to stakeholders.
- Understand the end-to-end process and rate of delivery for overall solutions.
- Ensure current priorities are in line with the expectations of key stakeholders.
- Develop and optimize SQL queries to extract, transform, and load data from various sources.
- Ensure data accuracy, integrity, and security across all data/reporting solutions.
- Conduct thorough testing and validation of dashboards and reports to ensure high-quality deliverables.
- Collaborate with business users to gather requirements and translate them into technical specifications.
- Stay current with industry trends and best practices in business intelligence and data analytics.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field.
- Minimum of 3 years of experience in dashboard development, preferably with Qlik Sense.
- Proficiency in SQL and experience with database management systems and writing complex queries.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Experience with data visualization tools and techniques.
- Knowledge of data warehousing concepts and ETL processes is a plus.
- Preferred: knowledge of ITIL-based practices and associated data (Incident, Change, Problem, Resiliency, Capacity Management).
- Ability to manage multiple in-flight responsibilities and a high volume of detailed work effectively.
- Preferred: UX experience and Qlik Mashup (or equivalent) experience.
- Project management experience is a plus.
- Experience in any scripting language (Python, JavaScript) or front-end development is a plus.
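As an illustration of the kind of ITIL-flavoured reporting this role involves, the sketch below shapes an incident extract into monthly dashboard metrics with pandas. The CSV file and its column names (opened_at, resolved_at, priority, number) are hypothetical assumptions, not an actual JP Morgan dataset.

```python
# A minimal sketch of turning ITIL incident data into dashboard-ready metrics
# with pandas; the extract file and column names are hypothetical.
import pandas as pd

incidents = pd.read_csv(
    "incident_extract.csv",                      # placeholder extract
    parse_dates=["opened_at", "resolved_at"],
)

incidents["resolution_hours"] = (
    incidents["resolved_at"] - incidents["opened_at"]
).dt.total_seconds() / 3600

# Monthly volume and mean time to resolve per priority; this is the kind of
# aggregate a Qlik Sense (or similar) dashboard would visualise.
monthly = (
    incidents
    .assign(month=incidents["opened_at"].dt.to_period("M").astype(str))
    .groupby(["month", "priority"], as_index=False)
    .agg(ticket_count=("number", "count"),
         mean_hours_to_resolve=("resolution_hours", "mean"))
)

monthly.to_csv("incident_kpis.csv", index=False)
```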
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter.

Sabre is seeking a talented full-stack senior software engineer, a Senior Data Science Engineer, for the SabreMosaic team. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Experience designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational databases and NoSQL databases such as Oracle, Spanner, and BigQuery.
- Expert-level SQL skills for data manipulation and validation.
- Experience in designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience in designing star and snowflake schemas and knowledge of dimensional data modeling.
- Collaboration with data scientists, data teams, and engineering teams using the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker.
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices such as CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Cloud Data Engineer Professional certification is a plus.

We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.
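Given the emphasis on dimensional modeling on GCP, the following is a minimal sketch of querying a star schema in BigQuery. It assumes the google-cloud-bigquery client and application-default credentials; the project, dataset, fact, and dimension names are illustrative only.

```python
# A minimal sketch of querying a star schema in BigQuery; project, dataset,
# fact, and dimension names are placeholders, not a real Sabre schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

sql = """
    SELECT
        d.travel_month,
        c.market,
        SUM(f.booking_amount) AS total_bookings
    FROM `my-gcp-project.travel_dw.fact_bookings`  AS f
    JOIN `my-gcp-project.travel_dw.dim_date`       AS d ON f.date_key = d.date_key
    JOIN `my-gcp-project.travel_dw.dim_customer`   AS c ON f.customer_key = c.customer_key
    GROUP BY d.travel_month, c.market
    ORDER BY d.travel_month
"""

# Run the query and stream results; .result() waits for the job to finish.
for row in client.query(sql).result():
    print(row.travel_month, row.market, row.total_bookings)
```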
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
siliguri, west bengal
On-site
The role of a Data Analyst is crucial in utilizing data to drive business decisions and strategies. You are responsible for collecting, processing, and analyzing data to provide actionable insights that can help improve organizational processes, products, and services. You play a pivotal role in aiding businesses to make informed decisions by interpreting complex data sets and identifying trends.

Key responsibilities include:
- Collecting and interpreting data from various sources.
- Cleaning and transforming data for analysis.
- Identifying patterns and trends in data sets.
- Developing and maintaining databases.
- Creating visualizations and reports to communicate findings.
- Conducting statistical analysis to support business decisions.
- Collaborating with cross-functional teams to understand data needs.
- Using statistical and data analysis tools to interpret data.
- Developing and implementing data analysis strategies.
- Identifying and recommending process improvements based on data insights.
- Presenting findings to stakeholders and business leaders.
- Ensuring data accuracy and integrity.
- Keeping abreast of industry best practices and technological advancements.
- Assisting in the development of data-driven strategies and supporting data-driven decision-making processes.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Statistics, Economics, Mathematics, or a related field.
- Proven experience as a Data Analyst or in a related role.
- Proficiency in SQL for querying and data manipulation.
- Strong knowledge of data analysis and visualization tools such as Python, R, Tableau, or Power BI.
- Ability to interpret and analyze complex data from multiple sources.
- Solid understanding of statistical methods and their applications.
- Experience in data cleaning and transformation techniques.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex findings in a clear and understandable manner.
- Knowledge of data warehousing and database management systems.
- Experience in conducting root cause analysis and process improvement.
- Proven track record of delivering actionable insights from data.
- Ability to work independently and in cross-functional teams.
- Strong attention to detail and accuracy.
- Certifications in data analysis or related fields are a plus.

Skills required for this role include Python, Tableau, communication, problem-solving, data cleaning, root cause analysis, data analysis, statistics, SQL, R, data transformation, Power BI, database management, data warehousing, statistical analysis, and data visualization.
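As a small, hedged example of the cleaning-and-trend work described above, the sketch below uses pandas to deduplicate a sales extract and compute a rolling weekly trend. The input file and column names are hypothetical.

```python
# A minimal data-cleaning and trend sketch with pandas; the input file and
# column names are hypothetical stand-ins for whatever source systems exist.
import pandas as pd

raw = pd.read_csv("daily_sales.csv", parse_dates=["sale_date"])  # placeholder file

clean = (
    raw
    .drop_duplicates()
    .dropna(subset=["sale_date", "amount"])
    .assign(amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"))
    .dropna(subset=["amount"])
)

# Weekly totals plus a 4-week rolling average to surface the underlying trend.
weekly = (
    clean.set_index("sale_date")["amount"]
         .resample("W").sum()
         .to_frame("weekly_sales")
)
weekly["rolling_4w_avg"] = weekly["weekly_sales"].rolling(4).mean()

print(weekly.tail())
```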
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
The position is for an Officer / Assistant Manager based in Mumbai. The ideal candidate should have a qualification of B.E. / MCA / B.Tech / M.Sc. (IT) and be between 25 and 30 years of age. You should have a minimum of 2-3 years of ETL development experience with strong knowledge of ETL concepts, tools, and data structures. It is essential to be able to analyze and troubleshoot complex data sets and determine data storage needs. Familiarity with data warehousing concepts, in order to build a data warehouse for the internal departments of the organization, is required. Your responsibilities will include creating and enhancing data solutions to enable seamless delivery of data, and collecting, parsing, managing, and analyzing large sets of data. You will lead the design of the logical data model, implement the physical database structure, and construct and implement operational data stores and data marts. Designing, developing, automating, and supporting complex applications to extract, transform, and load data will be part of your role. You must ensure data quality during ETL, develop logical and physical data flow models for ETL applications, and have advanced knowledge of SQL, Oracle, Sqoop, and NiFi tools, commands, and queries. Current CTC and expected CTC should be clearly mentioned. To apply, please email your resume to careers@cdslindia.com with the position applied for in the subject line.
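For illustration only, the sketch below outlines one possible extract-transform-load step in Python, assuming SQLAlchemy 2.x with the python-oracledb driver. The connection strings, tables, and columns are placeholders and do not reflect the organization's actual systems.

```python
# A minimal extract-transform-load sketch using pandas and SQLAlchemy; the
# connection strings and table names are placeholders, not real systems.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://user:password@source-host:1521/?service_name=SRC")  # placeholder
target = create_engine("oracle+oracledb://user:password@dw-host:1521/?service_name=DW")       # placeholder

# Extract: pull yesterday's transactions from the operational store.
df = pd.read_sql(
    "SELECT txn_id, account_id, txn_amount, txn_ts FROM transactions "
    "WHERE txn_ts >= TRUNC(SYSDATE) - 1",
    source,
)

# Transform: basic quality checks before loading into the mart.
df = df.dropna(subset=["txn_id", "account_id"]).drop_duplicates(subset=["txn_id"])
df["txn_amount"] = pd.to_numeric(df["txn_amount"], errors="coerce").fillna(0)

# Load: append into the data-mart staging table.
df.to_sql("stg_daily_transactions", target, if_exists="append", index=False)
```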
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior ETL Developer in the Data Services Team, you will play a lead role in ETL design, data modeling, and ETL development. Your responsibilities will include facilitating best practice guidelines, providing technical leadership, working with stakeholders to translate requirements into solutions, gaining approval for designs and effort estimates, and documenting work via Functional and Tech Specs. You will also be involved in analyzing processes for gaps and weaknesses, preparing roadmaps and migration plans, and communicating progress using the Agile Methodology. To excel in this role, you should have at least 5 years of experience with Oracle, Data Warehousing, and Data Modeling. Additionally, you should have 4 years of experience with ODI or Informatica IDMC, 3 years of experience with Databricks Lakehouse and/or Delta tables, and 2 years of experience in designing, implementing, and supporting a Kimball method data warehouse on SQL Server or Oracle. Strong SQL skills, a background in Data Integration, Data Security, and Enterprise Data Warehouse development, as well as experience in Change Management, Release Management, and Source Code control practices are also required. The ideal candidate will have a high school diploma or equivalent, with a preference for a Bachelor of Arts or a Bachelor of Science degree in computer science, systems analysis, or a related area. If you are enthusiastic about leveraging your ETL expertise to drive digital modernization and enhance data services, we encourage you to apply for this role and be part of our dynamic team.,
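To ground the Delta and Kimball requirements, here is a minimal sketch of a Type 1 dimension refresh using a Delta Lake MERGE issued through Spark SQL. It assumes a Databricks (or other Delta-enabled Spark) environment; the staging and dimension table names are illustrative.

```python
# A minimal Kimball-style Type 1 dimension refresh via a Delta Lake MERGE;
# assumes a Delta-enabled Spark environment, and the table names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim_customer_merge").getOrCreate()

# Upsert: update changed attributes in place (Type 1) and insert new customers.
spark.sql("""
    MERGE INTO dw.dim_customer AS tgt
    USING staging.customer_updates AS src
    ON tgt.customer_id = src.customer_id
    WHEN MATCHED AND (tgt.email <> src.email OR tgt.segment <> src.segment) THEN
      UPDATE SET tgt.email = src.email,
                 tgt.segment = src.segment,
                 tgt.updated_at = current_timestamp()
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email, segment, updated_at)
      VALUES (src.customer_id, src.email, src.segment, current_timestamp())
""")
```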
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a skilled and motivated Data Engineer with at least 4 years of experience in GCP, Teradata, and Data Warehousing. The ideal candidate should have hands-on expertise in developing robust data engineering solutions on Google Cloud Platform (GCP) and working experience with Teradata. You must be proficient in designing and automating scalable data pipelines and possess excellent leadership, communication, and collaboration skills.

Your responsibilities will include analyzing source systems, profiling data, and resolving data quality issues. You will be required to gather and comprehend business requirements for data transformation, and to design, develop, test, and deploy ETL/data pipelines using GCP services and Airflow. Additionally, writing complex SQL queries for data extraction, formatting, and analysis, and creating and maintaining Source-to-Target Mapping and design documentation will be part of your role. You will also need to build metadata-driven frameworks for scalable data pipelines, perform unit testing and document results, utilize DevOps tools for version control and deployment, provide production support, enhancements, and bug fixes, troubleshoot issues, and support ad-hoc business requests. Collaboration with stakeholders to resolve EDW incidents, managing expectations, applying ITIL concepts for incident and problem management, performing data cleaning, transformation, and validation, and staying updated on GCP advancements and industry best practices are also key responsibilities.

Requirements:
- Minimum 4 years of experience in ETL and Data Warehousing.
- Hands-on experience with GCP services such as BigQuery, Dataflow, Cloud Storage, etc.
- Experience with Apache Airflow for workflow orchestration.
- Experience in automating ETL solutions.
- Experience in executing at least 2 GCP Cloud Data Warehousing projects.
- Exposure to Agile/SAFe methodologies in at least 2 projects.
- Mid-level proficiency in PySpark and Teradata.
- Strong SQL skills and experience working with semi-structured data formats such as JSON, Parquet, and XML.
- Experience with DevOps tools such as GitHub, Jenkins, or similar.
- Deep understanding of Data Warehousing concepts, data profiling, quality, and mapping.

Preferred Qualifications:
- B.Tech/B.E. in Computer Science or a related field.
- Google Cloud Professional Data Engineer Certification.
- Strong leadership and communication skills.
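As a rough sketch of how Airflow might orchestrate a BigQuery ELT step in a role like this, the DAG below uses a PythonOperator and the google-cloud-bigquery client. It assumes Airflow 2.4+ and application-default credentials; the project, datasets, and schedule are placeholder values.

```python
# A minimal Airflow 2.4+ DAG sketch orchestrating a BigQuery ELT step; project,
# dataset, and schedule values are placeholder assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_sales() -> None:
    """Run an ELT step inside BigQuery: rebuild a reporting table from raw data."""
    client = bigquery.Client(project="my-gcp-project")  # placeholder project
    client.query("""
        CREATE OR REPLACE TABLE reporting.daily_sales AS
        SELECT sale_date, store_id, SUM(amount) AS total_amount
        FROM raw.sales
        GROUP BY sale_date, store_id
    """).result()


with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",   # run daily at 03:00; adjust to the real SLA
    catchup=False,
) as dag:
    PythonOperator(task_id="load_daily_sales", python_callable=load_daily_sales)
```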
Posted 3 days ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
You will play a crucial role as a Data Engineer, leading the development of data infrastructure at the forefront. Your responsibilities will involve creating and maintaining systems that ensure a seamless flow, availability, and reliability of data.

Your key tasks at Coforge will include:
- Developing and managing data pipelines to facilitate efficient data extraction, transformation, and loading (ETL) processes.
- Designing and enhancing data storage solutions such as data warehouses and data lakes.
- Ensuring data quality and integrity by implementing data validation, cleansing, and error handling mechanisms.
- Collaborating with data analysts, data architects, and software engineers to comprehend data requirements and provide relevant data sets for business intelligence purposes.
- Automating and enhancing data processes and workflows to drive scalability and efficiency.
- Staying updated on industry trends and emerging technologies in the field of data engineering.
- Documenting data pipelines, processes, and best practices to facilitate knowledge sharing.
- Contributing to data governance and compliance initiatives to adhere to regulatory standards.
- Working closely with cross-functional teams to promote data-driven decision-making across the organization.

Key skills required for this role:
- Proficiency in data modeling and database management.
- Strong programming capabilities, particularly in Python, SQL, and PL/SQL.
- Sound knowledge of Airflow, Snowflake, and DBT.
- Hands-on experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and cloud platforms, especially Azure.

Your experience of 5-10 years will be instrumental in successfully fulfilling the responsibilities of this role, located in Greater Noida with a shift timing from 2:00 PM IST to 10:30 PM IST.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
kochi, kerala
On-site
As a skilled professional in ETL testing and data warehousing, your primary responsibility will be to design and execute test plans, test cases, and test scripts for ETL processes. You will be tasked with performing data validation and verification to ensure data integrity and accuracy. It will also be your duty to identify, document, and track defects and issues in the ETL processes, collaborating closely with data engineers and developers to troubleshoot and resolve data-related issues. Your role will also involve participating in requirement analysis and providing valuable feedback on data quality and testing requirements. Additionally, you will be expected to generate and maintain test documentation and reports to ensure comprehensive and accurate records. To excel in this position, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field. You should have 4-6 years of experience specifically in ETL testing and data warehousing, with a strong knowledge of ETL tools and processes. Proficiency in SQL and database management systems is essential, along with familiarity with data modeling and data architecture concepts. If you are passionate about ensuring the quality and accuracy of data through meticulous testing processes, and possess the relevant qualifications and experience, we encourage you to apply for this challenging and rewarding opportunity.,
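As an example of the validation work described above, the sketch below reconciles row counts and an aggregate between a source table and its warehouse target. The connection strings and table names are placeholders; in practice these checks would typically run under pytest or a similar test runner.

```python
# A minimal ETL reconciliation test sketch: compare row counts and an aggregate
# between source and target. Connection strings and table names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://user:pwd@src-host/erp")  # placeholder
target = create_engine("postgresql+psycopg2://user:pwd@dw-host/dw")    # placeholder


def test_orders_row_count_matches():
    src = pd.read_sql("SELECT COUNT(*) AS n FROM orders", source)["n"].iloc[0]
    tgt = pd.read_sql("SELECT COUNT(*) AS n FROM dw.fact_orders", target)["n"].iloc[0]
    assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"


def test_orders_amount_totals_match():
    # Comparing an aggregate is a cheap proxy for field-level validation.
    src = pd.read_sql("SELECT COALESCE(SUM(amount), 0) AS total FROM orders", source)["total"].iloc[0]
    tgt = pd.read_sql("SELECT COALESCE(SUM(amount), 0) AS total FROM dw.fact_orders", target)["total"].iloc[0]
    assert abs(float(src) - float(tgt)) < 0.01, "Amount totals diverge between source and target"
```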
Posted 3 days ago
4.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
As a skilled professional in SQL development and ETL processes, you will be responsible for designing and implementing ETL processes to extract, transform, and load data from diverse sources into data warehouses. Your role will involve developing and optimizing SQL queries for efficient data retrieval and reporting. Collaboration with data analysts and business stakeholders is essential to comprehend data requirements and provide effective solutions. In this position, you will play a key role in monitoring and troubleshooting ETL processes to ensure data integrity and optimal performance. Your tasks will include creating and maintaining documentation for data processes and workflows. Proficiency in SQL and experience with ETL tools are crucial for success in this role, along with a solid background in data warehousing concepts and practices. Utilizing tools such as Power BI for creating interactive dashboards and reports to visualize data will be part of your responsibilities. The ideal candidate for this position should have 4-12 years of experience in SQL development and ETL processes, familiarity with Power BI or similar data visualization tools, and possess strong analytical and problem-solving skills. If you are looking for a challenging opportunity where you can leverage your expertise in SQL, ETL processes, and data visualization tools to drive impactful business outcomes, this role might be the perfect fit for you.,
Posted 3 days ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
As the Director of Data Engineering, you will play a strategic leadership role in overseeing the architecture, development, and maintenance of our company's data infrastructure. Your responsibilities will include leading a team of data engineers to design, build, and scale data systems and processes to ensure data quality, accessibility, and reliability. Collaboration with data scientists, analysts, and other stakeholders will be crucial to drive data-driven decision-making across the organization. You will lead and manage a team of 50+ members, including architects and engineers, to ensure high performance and engagement. Designing and implementing end-to-end Azure solutions, maintaining data architectures, and collaborating with stakeholders to translate business requirements into scalable cloud solutions are key aspects of your role. Your responsibilities will also involve overseeing the development and deployment of data solutions using Azure services such as ADF, Event Hubs, Stream Analytics, Synapse Analytics, Azure Data Bricks, Azure SQL Database, and Azure DevOps. Ensuring data governance, security, and compliance across all data solutions, collaborating with various team members, and driving continuous improvement and innovation within the engineering team are essential parts of your role. In terms of client account management, you will build and maintain strong client relationships by understanding their unique needs and challenges. Acting as the main point of contact for clients, developing account plans, and identifying growth opportunities are also part of your responsibilities. To be successful in this role, you should have a minimum of 15+ years of experience in data engineering or related roles, including at least 5 years in a leadership position. A degree in Computer Science, Information Technology, Data Science, or a related field is required. Key technical skills for this role include expertise in Cloud/Data Solution Design, strong experience with Azure Cloud technologies, proficiency in data engineering technologies and tools, programming experience in Java, Python, PySpark, and knowledge of data governance, security, and compliance standards. Leadership skills such as leading high-performing teams, project management, communication, and interpersonal skills are also essential. Your competencies should include strategic thinking, problem-solving skills, the ability to work in a fast-paced environment, strong organizational skills, and a drive for innovation and continuous improvement.,
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for developing ETL/ELT pipelines using Python and working with Snowflake for data transformation and modeling. Your role will involve writing efficient SQL queries within the Snowflake environment and integrating data from various sources to ensure data quality. Collaboration with data engineers and analysts on scalable solutions will also be a key aspect of your responsibilities. Strong programming skills in Python, hands-on experience with Snowflake, a solid understanding of SQL and data warehousing concepts, as well as familiarity with cloud platforms (AWS/GCP/Azure is a plus) are expected from you as a Python+Snowflake Developer.,
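For illustration, the sketch below shows one way an ELT load step into Snowflake could look in Python, using the connector's write_pandas helper. The credentials, database, schema, and table names are placeholders, not a required design.

```python
# A minimal ELT load sketch into Snowflake using write_pandas; credentials,
# database, schema, and table names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Pretend this frame came from an upstream extract/cleaning step.
df = pd.DataFrame({
    "ORDER_ID": [1001, 1002],
    "CUSTOMER_ID": ["C-01", "C-02"],
    "AMOUNT": [250.0, 99.5],
})

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="***",   # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)

try:
    # write_pandas bulk-loads the frame; auto_create_table spares a manual DDL step.
    success, n_chunks, n_rows, _ = write_pandas(
        conn, df, table_name="RAW_ORDERS", auto_create_table=True
    )
    print(f"Loaded {n_rows} rows in {n_chunks} chunk(s): success={success}")
finally:
    conn.close()
```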
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
You are a global climate technologies company engineered for sustainability, creating sustainable and efficient residential, commercial, and industrial spaces through HVACR technologies. Your focus includes protecting temperature-sensitive goods throughout the cold chain and providing comfort worldwide. By combining best-in-class engineering, design, and manufacturing with leading brands in compression, controls, software, and monitoring solutions, you develop next-generation climate technology tailored for future needs. Whether you are a professional seeking a career change, an undergraduate student exploring opportunities, or a recent graduate with an advanced degree, numerous opportunities await you to innovate, be challenged, and make a significant impact by joining the team and embarking on your journey today. In the realm of Software Development, you will be responsible for developing code and solutions that facilitate the transfer and transformation of data across various systems. Maintaining a deep technical knowledge of tools in the data warehouse, data hub, and analytical tools is crucial. Ensuring efficient data transformation and storage for retrieval and usage, as well as optimizing data systems" performance, are key tasks. Moreover, developing a profound understanding of underlying business systems related to analytical systems is essential. You will adhere to standard software development lifecycle, code control, code standards, and process standards, continually enhancing your technical knowledge through self-training, educational opportunities, and participation in professional organizations related to your tech skills. In Systems Analysis, your role involves collaborating with key stakeholders to comprehend business needs and capture functional and technical requirements. You will propose ideas to simplify solution designs and communicate expectations to stakeholders and resources during solution delivery. Developing and executing test plans to ensure the successful rollout of solutions, including data accuracy and quality, is part of your responsibilities. Regarding Service Management, effective communication with leaders and stakeholders to address obstacles during solution delivery is imperative. Defining and managing promised delivery dates, proactively researching, analyzing, and predicting operational issues, and offering viable options to resolve unexpected challenges during solution development and delivery are essential aspects of your role. Your education and job-related technical skills include a Bachelor's Degree in Computer Science/Information Technology or equivalent. You possess the ability to communicate effectively with individuals at all levels verbally and in writing, demonstrating a courteous, tactful, and professional approach. Working in a large, global corporate structure, having an advanced English level (additional language proficiency is advantageous), a strong sense of ethics and adherence to the company's core values, and willingness to travel domestically and internationally to support global implementations are required. You demonstrate the capability to clearly identify and define problems, assess alternative solutions, and make timely decisions. Your decision-making ability, operational efficiency in ambiguous situations, high analytical skills to evaluate approaches against objectives, and a minimum of three years of experience in a Data Engineer role with expertise in specific tools and technologies are essential. 
Your behavior and soft skills encompass proficiency in written technical concepts, leading problem-solving teams, conflict resolution efficiency, collaboration in cross-functional projects, and driving process mapping sessions. Additionally, the Korn Ferry Competencies you embody include customer focus, building networks, instilling trust, being tech-savvy, demonstrating interpersonal savvy, self-awareness, taking action, collaborating, and being a nimble learner. The company's commitment to its people is evident in its dedication to sustainability, reducing carbon emissions, and improving energy efficiency through groundbreaking innovations, HVACR technology, and cold chain solutions. The culture of passion, openness, and collaboration empowers employees to work towards a common goal of making the world a better place. Investing in the comprehensive development of individuals ensures personal and professional growth from onboarding through senior leadership. Flexible and competitive benefits plans cater to individual and family needs, offering various options for time off, including paid parental leave, vacation, and holiday leave. The commitment to Diversity, Equity & Inclusion at Copeland emphasizes the creation of a diverse, equitable, and inclusive environment essential for organizational success. A culture where every employee is welcomed, heard, respected, and valued for their experiences, ideas, perspectives, and expertise is fostered. Embracing diversity and inclusion drives innovation, enhances customer service, and creates a positive impact in the communities where the company operates. Copeland is an Equal Opportunity Employer, fostering an inclusive workplace where all individuals are valued and respected for their contributions and unique qualities.,
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
The Business Analyst position at Piramal Critical Care (PCC) within the IT department in Kurla, Mumbai involves acting as a liaison between PCC system users, software support vendors, and internal IT support teams. The ideal candidate is expected to be a technical contributor and advisor to PCC business users, assisting in defining strategic application development and integration to support business processes effectively. Key stakeholders for this role include internal teams such as Supply Chain, Finance, Infrastructure, PPL Corporate, and Quality, as well as external stakeholders like the MS Support team, 3PLs, and Consultants. The Business Analyst will report to the Chief Manager- IT Business Partner. The ideal candidate should hold a B.S. in Information Technology, Computer Science, or equivalent, with 8-10 years of experience in Data warehousing, BI, Analytics, and ETL tools. Experience in the Pharmaceutical or Medical Device industry is required, along with familiarity with large global Reporting tools like Qlik/Power BI, SQL, Microsoft Power Platform, and other related platforms. Knowledge of computer system validation lifecycle, project management tools, and office tools is also essential. Key responsibilities of the Business Analyst role include defining user and technical requirements, leading implementation of Data Warehousing, Analytics, and ETL systems, managing vendor project teams, maintaining partnerships with business teams, and proposing IT budgets. The candidate will collaborate with IT and business teams, manage ongoing business applications, ensure system security, and present project updates to the IT Steering committee. The successful candidate must possess excellent interpersonal and communication skills, self-motivation, proactive customer service attitude, leadership abilities, and a strong service focus. They should be capable of effectively communicating business needs to technology teams, managing stakeholder expectations, and working collaboratively to achieve results. Piramal Critical Care (PCC) is a subsidiary of Piramal Pharma Limited (PPL) and is a global player in hospital generics, particularly Inhaled Anaesthetics. PCC is committed to delivering critical care solutions globally and maintaining sustainable growth for stakeholders. With a wide presence across the USA, Europe, and over 100 countries, PCC's product portfolio includes Inhalation Anaesthetics and Intrathecal Baclofen therapy. PCC's workforce comprises over 400 employees across 16 countries and is dedicated to expanding its global footprint through new product additions in critical care. Committed to corporate social responsibility, PCC collaborates with partner organizations to provide hope and resources to those in need while caring for the environment.,
Posted 3 days ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Azure Databricks Engineer who will be responsible for designing, developing, and maintaining scalable data pipelines and supporting data infrastructure in an Azure cloud environment. Your key responsibilities will include designing ETL pipelines using Azure Databricks, building robust data architectures on Azure, collaborating with stakeholders to define data requirements, optimizing data pipelines for performance and reliability, implementing data transformations and cleansing processes, managing Databricks clusters, and leveraging Azure services for data orchestration and storage. You must possess 5-10 years of experience in data engineering or a related field with extensive hands-on experience in Azure Databricks and Apache Spark. Strong knowledge of Azure cloud services such as Azure Data Lake, Data Factory, Azure SQL, and Azure Synapse Analytics is required. Experience with Python, Scala, or SQL for data manipulation, ETL frameworks, Delta Lake, Parquet formats, Azure DevOps, CI/CD pipelines, big data architecture, and distributed systems is essential. Knowledge of data modeling, performance tuning, and optimization of big data solutions is expected, along with problem-solving skills and the ability to work in a collaborative environment. Preferred qualifications include experience with real-time data streaming tools, Azure certifications, machine learning frameworks, integration with Databricks, and data visualization tools like Power BI. A bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field is required for this role.,
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The Data Visualization, Business Analytics role is vital for the organization as it involves transforming intricate data into visual insights for key stakeholders, facilitating informed decision-making and strategic planning. You will collaborate with business leaders to recognize and prioritize data visualization requirements. Your responsibilities will include designing interactive dashboards and reports to illustrate essential business metrics and trends. You will create visually appealing charts, graphs, and presentations that are easily understandable. Furthermore, it is essential to develop and uphold data visualization best practices and standards. As part of the role, you will utilize various data visualization tools and platforms to present insights effectively. Conducting data analysis to identify patterns and trends for visualization purposes will be a key task. Implementing user interface (UI) and user experience (UX) principles to enhance visualization is crucial. Providing training and support to team members on data visualization techniques is also part of the responsibilities. Additionally, you will be responsible for performing ad-hoc analysis and data mining to support business needs. Collaboration with data engineers and data scientists to ensure data accuracy and integrity is essential. It is important to stay updated with industry trends and best practices in data visualization and business analytics. Presenting findings and insights to key stakeholders in a clear and compelling manner will be a regular task. Communication with cross-functional teams to understand data requirements is vital. You will contribute to the continuous improvement of data visualization processes and techniques. The role requires a Bachelor's degree in Data Science, Business Analytics, Computer Science, or a related field. Proven experience in data visualization, business intelligence, or related roles is necessary. Proficiency in data visualization tools like Tableau, Power BI, or D3.js is essential. Strong analytical and problem-solving skills are required. Expertise in SQL for data querying and manipulation is a must. An understanding of statistical concepts and data modeling is crucial. Excellent communication and presentation skills are necessary. The ability to work effectively in a fast-paced and dynamic environment is essential. Knowledge of business operations and strategic planning is required. Experience in interpreting and analyzing complex datasets is beneficial. Familiarity with data warehousing and ETL processes is a plus. Managing multiple projects and deadlines simultaneously, being detail-oriented with a focus on data accuracy and quality, working collaboratively in a team setting, and possessing strong business acumen and understanding of key performance indicators are important skills for this role.,
Posted 4 days ago
0.0 - 4.0 years
0 Lacs
hyderabad, telangana
On-site
A career within Financial Markets Business Advisory services will provide you with the opportunity to contribute to a variety of audit, regulatory, valuation, and financial analysis services, designing solutions that address clients' complex accounting and financial reporting challenges as well as their broader business issues. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. The PwC Professional, our global leadership development framework, gives us a single set of expectations across our lines, geographies, and career paths. It provides transparency on the skills required of us as individuals to be successful and progress in our careers, now and in the future.

Responsibilities
As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Invite and give in-the-moment feedback in a constructive manner.
- Share and collaborate effectively with others.
- Identify and make suggestions for improvements when problems and/or opportunities arise.
- Handle, manipulate, and analyze data and information responsibly.
- Follow risk management and compliance procedures.
- Keep up to date with developments in your area of specialism.
- Communicate confidently in a clear, concise, and articulate manner - verbally and in the materials you produce.
- Build and maintain an internal and external network.
- Seek opportunities to learn about how PwC works as a global network of firms.
- Uphold the firm's code of ethics and business conduct.

We are seeking a highly motivated Data Engineer - Associate to join our dynamic team. The ideal candidate will have a strong foundation in data engineering, particularly with Python and SQL, and exposure to cloud technologies and data visualization tools such as Power BI, Tableau, or QuickSight. The Data Engineer will work closely with data architects and business stakeholders to support the design and implementation of data pipelines and analytics solutions. This role offers an opportunity to grow technical expertise in cloud and data solutions, contributing to projects that drive business insights and innovation.

Key Responsibilities
Data Engineering:
- Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources (see the sketch after this listing).
- Use Python and SQL to design and implement scalable data processing solutions.
- Ensure data quality and consistency throughout data transformation and storage processes.
- Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements.
Cloud Technologies:
- Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions.
- Support the migration of on-premise data infrastructure to the cloud when needed.
- Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses.
Data Visualization:
- Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight.
- Collaborate with analysts to understand their data needs and optimize data structures for reporting.
Collaboration and Support:
- Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making.
- Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery.
- Document processes, data flows, and infrastructure for team knowledge sharing.

Required Skills and Experience
- 0+ years of experience in data engineering, working with Python and SQL.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.

Qualifications
- Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
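For illustration only, here is a minimal, hedged sketch of the kind of Python-and-SQL batch pipeline described above: it extracts rows from a CSV export, applies a basic quality check, and loads the result into a SQL table. The file name, column names, and the SQLite target are hypothetical placeholders, with SQLite standing in for a real warehouse.

import csv
import sqlite3


def extract(path):
    # Read raw rows from a CSV export (hypothetical source file).
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows):
    # Basic quality check: drop rows without an order id and normalize amounts.
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row.get("amount") or 0),
        })
    return cleaned


def load(rows, db_path="warehouse.db"):
    # Load into a SQL table; SQLite stands in for the actual warehouse here.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders (order_id, amount) VALUES (:order_id, :amount)",
        rows,
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    load(transform(extract("orders.csv")))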
Posted 4 days ago
3.0 - 6.0 years
9 - 19 Lacs
Hyderabad
Work from Office
About Protiviti India
Protiviti India is a global business consulting firm dedicated to helping leaders confidently face the future. Operating in over 25 countries with more than 90 offices worldwide, we are supported by over 11,000 professionals globally and achieved global revenue of $2.10 billion as of January 2024. As a wholly owned subsidiary of Robert Half (NYSE: RHI), we deliver deep expertise across solutions including Internal Audit & Financial Advisory, Technology & Digital, Financial Services - Risk, Business Performance Improvement, and Managed Business Services. With a strong and growing presence across major Indian cities, Protiviti India prides itself on being a genuinely independent firm with proven methodologies and experienced professionals. Our vision is to bring confidence in a dynamic world, guided by values like integrity, innovation, and collaboration. We also maintain a strong partnership with the Confederation of Indian Industry (CII). We are currently expanding our Enterprise Application Services (EAS) practice within the Technology & Digital solution area, where we are proud to be a diamond-level partner for SAP Labs and globally ranked as the 11th preferred partner for SAP, with an ambitious goal to reach the top 5.

About the Role: SAP Datasphere Consultant
We are seeking a proactive and skilled SAP Datasphere Consultant to join our team in Hyderabad. This role is ideal for a professional with 3-6 years of experience in SAP data and analytics, including a strong background in SAP BW or SAP BW/4HANA and practical experience on at least 2-3 projects using SAP Datasphere. The ideal candidate has a passion for data modeling and visualization and is eager to translate complex business needs into impactful technical solutions.

Key Responsibilities:
- Translate diverse business requirements into effective technical solutions using SAP Datasphere and/or BW/4HANA.
- Assist in the design and take full ownership of the delivery of integrated analytics solutions, optimizing data models, flows, and reporting.
- Leverage deep technical knowledge to solve complex business challenges and contribute to SAP Datasphere performance optimization.
- Work seamlessly within multi-resource projects and internal teams, collaborating effectively while also demonstrating the ability to work independently.
- Build and maintain strong client relationships, acting as a trusted advisor and handling complex situations with confidence.
- Identify opportunities for repeat business and support pre-sales activities for new engagements.

Required Qualifications & Skills:
- 3-6 years of experience in SAP data and analytics projects, with a strong background in SAP BW or SAP BW/4HANA.
- Proven experience on at least 2-3 projects with SAP Datasphere.
- Proficiency in SAP Datasphere and BW/4HANA architecture, data modeling, integration, and performance optimization.
- Excellent knowledge of one or more SAP S/4HANA or ECC functional modules, including business processes, data models, and structures.
- Strong skills in SQL; experience with ABAP and automation within SAP environments is a plus (an illustrative sketch follows this listing).
- Experience with SAP Analytics Cloud, SAP S/4HANA Embedded Analytics, and native SAP HANA is desirable.
- Effective communication and relationship-building skills, with an ability to influence and inspire.
- Passion for data modeling and visualization, coupled with eagerness to learn and adapt to new technologies.

Preferred Qualifications:
- SAP certification in BW/4HANA or Datasphere.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and relationship-building abilities.
- Proven ability to work independently and collaborate effectively within a team.
- Adaptable, proactive, and committed to continuous learning in emerging data technologies.
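As a hedged illustration of the SQL and automation skills called out above (not an official SAP example), the sketch below uses the hdbcli Python driver to run a simple data-quality query against a view exposed through a Datasphere space's database user. The host, user, schema, and view names are placeholders, and TLS and other connection options are omitted.

from hdbcli import dbapi  # SAP HANA client; Datasphere runs on HANA Cloud

# Hypothetical connection details for a space's database user.
conn = dbapi.connect(
    address="<datasphere-host>",
    port=443,
    user="<SPACE>#<DB_USER>",
    password="<password>",
)

# Count rows with a missing order date in an illustrative consumable view.
cursor = conn.cursor()
cursor.execute(
    'SELECT COUNT(*) FROM "SALES_SPACE"."V_SALES_ORDERS" WHERE "ORDER_DATE" IS NULL'
)
print("Rows with missing order dates:", cursor.fetchone()[0])

cursor.close()
conn.close()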
Posted 4 days ago
10.0 - 14.0 years
25 - 40 Lacs
Hyderabad
Work from Office
About Protiviti India
Protiviti India is a global business consulting firm dedicated to helping leaders confidently face the future. Operating in over 25 countries with more than 90 offices worldwide, we are supported by over 11,000 professionals globally and achieved global revenue of $2.10 billion as of January 2024. As a wholly owned subsidiary of Robert Half (NYSE: RHI), we deliver deep expertise across solutions including Internal Audit & Financial Advisory, Technology & Digital, Financial Services - Risk, Business Performance Improvement, and Managed Business Services. With a strong and growing presence across major Indian cities, Protiviti India prides itself on being a genuinely independent firm with proven methodologies and experienced professionals. Our vision is to bring confidence in a dynamic world, guided by values like integrity, innovation, and collaboration. We also maintain a strong partnership with the Confederation of Indian Industry (CII). We are currently expanding our Enterprise Application Services (EAS) practice within the Technology & Digital solution area, where we are proud to be a diamond-level partner for SAP Labs and globally ranked as the 11th preferred partner for SAP, with an ambitious goal to reach the top 5.

About the Role: SAP Datasphere Architect
We are seeking a highly skilled and experienced SAP Datasphere Architect to join our team. This pivotal role is for a seasoned professional with 10-12 years of experience in data architecture, data warehousing, and analytics, including at least 3 years of hands-on experience with SAP Datasphere (formerly SAP Data Warehouse Cloud). The ideal candidate will be instrumental in designing, implementing, and optimizing enterprise-level data integration and analytics solutions leveraging SAP Datasphere alongside modern cloud and hybrid data architectures.

Key Responsibilities:
- Design and architect scalable, high-performing, and secure data integration and analytics solutions using SAP Datasphere and associated SAP technologies.
- Provide technical leadership in integrating SAP Datasphere with critical enterprise systems such as SAP S/4HANA, SAP BW/4HANA, SAP Analytics Cloud (SAC), and various non-SAP systems.
- Define and implement robust data modeling strategies, leveraging SAP Datasphere capabilities such as spaces, semantic layers, and virtual access.
- Lead the end-to-end implementation of SAP Datasphere projects, from requirement gathering and solution design through development, testing, and deployment.
- Optimize data flows and ensure efficient data processing for large-scale datasets in both real-time and batch environments.
- Collaborate closely with business stakeholders to understand requirements and translate them into effective technical solutions, while providing mentorship to junior team members.

Required Qualifications & Skills:
- 10-12 years of experience in data architecture, data warehousing, and analytics, with a minimum of 3 years of hands-on experience in SAP Datasphere.
- Proficiency in SAP Datasphere, including expertise in spaces, data modeling, and integration features.
- Strong understanding of the broader SAP ecosystem, including S/4HANA, BW/4HANA, HANA Cloud, SAC, and BTP.
- Experience with cloud platforms such as Azure, AWS, or Google Cloud, particularly in hybrid data landscapes.
- In-depth knowledge of data integration technologies, including ETL/ELT tools, APIs, and SAP Data Intelligence.
- Strong SQL and data modeling skills, with experience in star and snowflake schemas (a minimal schema sketch follows this listing).
- Proven ability to lead technical teams and manage multiple priorities effectively.
- Excellent problem-solving, analytical, and stakeholder management abilities.

Soft Skills:
- Excellent problem-solving and analytical skills, with a keen eye for detail.
- Strong communication and stakeholder management abilities, ensuring effective collaboration.
- Proven ability to lead technical teams and manage multiple priorities in a dynamic environment.
- Commitment to continuous learning and staying current with emerging data technologies.
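To make the star-schema requirement above concrete, here is a minimal, hedged sketch that builds a tiny fact table with two dimensions and runs a simple report query. All table and column names are invented, and SQLite merely stands in for a real platform such as HANA Cloud or a cloud warehouse.

import sqlite3

DDL = """
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, region TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT, fiscal_year INTEGER);
CREATE TABLE fact_sales (
    sales_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    revenue REAL
);
"""

REPORT = """
SELECT d.fiscal_year, c.region, SUM(f.revenue) AS total_revenue
FROM fact_sales AS f
JOIN dim_customer AS c ON c.customer_key = f.customer_key
JOIN dim_date AS d ON d.date_key = f.date_key
GROUP BY d.fiscal_year, c.region
ORDER BY total_revenue DESC;
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)  # create the star schema
# Seed one row per table so the report query returns something.
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'EMEA')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, 20240101, 1250.0)")

for fiscal_year, region, total_revenue in conn.execute(REPORT):
    print(fiscal_year, region, total_revenue)
conn.close()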
Posted 4 days ago
6.0 - 11.0 years
10 - 14 Lacs
Chennai
Remote
What You'll Need
- BS or MS degree in Computer Science, Engineering, or a related technical field
- Strong SQL skills
- 6+ years of experience working with event instrumentation, data pipelines, and data warehouses, preferably having acted as a data architect in a previous role
- Proficiency with systems design and data modeling
- Fluency with workflow management tools such as Airflow or dbt (a minimal Airflow sketch follows this listing)
- Experience with modern data warehouses such as Snowflake or BigQuery
- Expertise in breaking down complex problems, documenting solutions, and sequencing work to make iterative improvements
- Familiarity with data visualization tools such as Mode, Tableau, and Looker
- Familiarity with programming, preferably in Python
- Familiarity with software design principles, including test-driven development

About the Role
Analytics Platform is on a mission to democratize learning by building systems that enable company-wide analytics and experimentation. By implementing sufficient instrumentation, designing intuitive data models, and building batch and streaming pipelines, we will allow for deep and scalable investigation and optimization of the business. By developing self-serve tools, we will empower executives, PMs, marketing leadership, and marketing managers to understand company performance at a glance and uncover insights to support decision-making. Finally, by building capabilities such as forecasting, alerting, and experimentation, we will enable more, better, and faster decisions.

What You'll Do
- Drive direct business impact with executive-level visibility
- Design technical architecture and implement components from the ground up as we transition to event-based analytics
- Work on the unique challenge of joining a variety of online and offline data sets, not just big data
- Learn and grow Data Science and Data Analytics skills (we sit in the same org!)
- Grow into a Tech Lead/Manager role and mentor junior team members as we quickly grow the team
- Partner with infrastructure and product engineers to instrument our backend services and end-to-end user journeys to create visibility for the rest of the business
- Design, develop, and monitor scalable and cost-efficient data pipelines, and build out new integrations with third-party tools
- Work with data analysts and data scientists to design our data models as inputs to metrics and machine learning models
- Establish best practices for data engineering
- Assess build-vs-buy tradeoffs for components in our company-wide analytics platform, which will inform decision-making for executives, PMs, Ops, and others
- Be a founding member of the Data Engineering team based out of India, with the autonomy to help shape the vision, influence the roadmap, and establish best practices for the team
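Because the role calls out workflow tools such as Airflow or dbt, here is a minimal, hedged Airflow 2.x sketch of the kind of daily event pipeline described: one task extracts events and a second loads them into the warehouse. The DAG id, task ids, and callable bodies are placeholders rather than this team's actual pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**_context):
    # Placeholder: pull raw event data from an upstream source.
    print("extracting events")


def load_events(**_context):
    # Placeholder: load transformed events into the warehouse (e.g., Snowflake or BigQuery).
    print("loading events")


with DAG(
    dag_id="daily_event_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_events", python_callable=load_events)
    extract >> load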
Posted 4 days ago