
1052 ETL Processes Jobs - Page 5


4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You will be responsible for collaborating with cross-functional teams to understand data requirements and design efficient data processing solutions. Your main tasks will include:

- Developing and maintaining ETL processes using Databricks and PySpark for large-scale data processing.
- Optimizing and tuning existing data pipelines for performance and scalability.
- Creating and implementing data quality and validation processes to ensure data accuracy.
- Working closely with stakeholders to understand business needs and translate them into actionable data solutions.
- Collaborating with the data science team to support machine learning model deployment and integration into production systems.
- Troubleshooting and resolving data-related issues promptly.

To qualify for this role, you must hold a Bachelor's degree in Computer Science, Engineering, or a related field, and have proven experience working with Databricks, PySpark, and SQL in a professional setting. Strong proficiency in designing and optimizing ETL processes for large-scale data sets is required, as is experience with data modeling, data warehousing, and database design principles. Familiarity with cloud platforms such as AWS, Azure, or GCP is preferred. Join us in Bengaluru, KA, India, and be a part of our dynamic team working on cutting-edge data processing solutions.
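As an illustration of the pipeline work this role describes, here is a minimal PySpark sketch of an ETL step with a simple data-quality gate. The paths, schema, and validation rule are illustrative assumptions, not details from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw data (path is a placeholder)
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: normalize types, derive business fields, de-duplicate
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
       .dropDuplicates(["order_id"])
)

# Data-quality check: reject the batch if required keys are missing
null_keys = orders.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows are missing order_id; aborting load")

# Load: write partitioned output for downstream consumers
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```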

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

At Capgemini Engineering, the global leader in engineering services, a diverse team of engineers, scientists, and architects collaborates to support innovative companies worldwide. From cutting-edge technologies like autonomous cars to life-saving robots, our digital and software experts provide unique R&D and engineering services across various industries. If you join us, you will embark on a career filled with opportunities to truly make a difference in a dynamic environment.

In this role, you will be responsible for designing, developing, and optimizing PL/SQL procedures, functions, triggers, and packages. Your tasks will include writing efficient SQL queries, joins, and subqueries for data retrieval and manipulation, as well as developing and maintaining database objects such as tables, views, indexes, and sequences. Additionally, you will optimize query performance, troubleshoot database issues, and collaborate closely with application developers, business analysts, and system architects to understand database requirements. Ensuring data integrity, consistency, and security within Oracle databases will be a key aspect of your responsibilities. You will also develop ETL processes and scripts for data migration and integration, while documenting database structures, stored procedures, and coding best practices. Staying updated on Oracle database technologies, best practices, and industry trends is essential for this role.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have previous experience as a PL/SQL Developer working with Oracle databases. Strong expertise in Oracle PL/SQL programming (procedures, triggers, functions, and packages) is required, along with proficiency in performance tuning, query optimization, and indexing strategies. Familiarity with data modeling, database design, and normalization principles is advantageous, and knowledge of ETL tools, database migration, batch processing, and Oracle tools like SQL*Plus, Toad, or SQL Developer is preferred. The ability to troubleshoot and resolve database performance issues is also essential.

Choosing to work at Capgemini means having the opportunity to drive change and contribute to leading businesses and society. You will receive the support needed to shape your career in a way that fits your goals, joining a diverse community of free-thinkers, entrepreneurs, and experts who collaborate to unleash human energy through technology for an inclusive and sustainable future. At Capgemini, people are at the core of everything we do: you can expand your career by participating in innovative projects and utilizing our extensive Learning & Development programs, in an inclusive, safe, healthy, and flexible work environment. You can also engage in positive social change by participating in our Corporate Social Responsibility and Sustainability initiatives, all while enjoying a supportive work culture.

About the Company: Capgemini is a global business and technology transformation partner that assists organizations in accelerating their digital and sustainable transition, creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini boasts a strong heritage of over 55 years. Clients trust Capgemini to unlock the value of technology to address their comprehensive business needs. The company provides end-to-end services and solutions, leveraging strengths from strategy and design to engineering, powered by its leading capabilities in AI, cloud, and data, along with deep industry expertise and a partner ecosystem. In 2023, the Group reported global revenues of €22.5 billion.
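To illustrate the kind of PL/SQL development work described, here is a minimal sketch that compiles and invokes a stored procedure from Python using the python-oracledb driver. The connection details, table, and procedure name are hypothetical placeholders:

```python
import oracledb

# Placeholder connection details, not real credentials
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")

ddl = """
CREATE OR REPLACE PROCEDURE upsert_customer (
    p_id    IN customers.id%TYPE,
    p_name  IN customers.name%TYPE
) AS
BEGIN
    MERGE INTO customers c
    USING (SELECT p_id AS id, p_name AS name FROM dual) src
    ON (c.id = src.id)
    WHEN MATCHED THEN UPDATE SET c.name = src.name
    WHEN NOT MATCHED THEN INSERT (id, name) VALUES (src.id, src.name);
END;
"""

with conn.cursor() as cur:
    cur.execute(ddl)                                    # compile the procedure
    cur.callproc("upsert_customer", [42, "Acme Ltd"])   # invoke it
conn.commit()
```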

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Celonis Data Engineer at MKS, you will be an integral part of the Process Center of Excellence (CoE) team, contributing to the development of Business Process Management (BPM) and Process Mining capabilities across all MKS divisions and corporate functions. Your primary responsibility will be to collaborate closely with Business Process Owners (BPOs) and Process Analysts in driving technical implementation and providing ongoing support for Celonis process mining solutions.

Your impactful contributions will include:

- Collaborating with Business Analysts, Business Process Owners, and the Celonis Application Owner to translate business requirements and use cases into technical and data requirements.
- Defining and deploying Celonis solutions such as apps and dashboards by identifying improvement opportunities in conjunction with Process Experts, Business Analysts, and Business Process Owners.
- Identifying the necessary source system tables and fields to be integrated into Celonis.
- Extracting, transforming, and loading all required data from the source systems for each process implemented within Celonis.
- Constructing the Celonis data model for each process.
- Validating the data within Celonis by collaborating with relevant business experts.
- Monitoring and optimizing data query performance to ensure optimal response times.
- Documenting all technical and data requirements as well as all extract, transform, load (ETL) work.
- Providing continuous support for any data-related issues.

To excel in this role, you should possess:

- 5+ years of experience in Celonis as a prerequisite.
- Proficiency in SQL coding.
- Extensive experience in ETL processes.
- Strong familiarity with relational databases and data modeling.
- A solid comprehension of the data structures in core enterprise systems such as SAP ERP, Oracle ERP, and CRM.
- Experience with various project management methodologies and phases of the project lifecycle (e.g., Agile, DevOps, waterfall).
- An initiative-driven, ownership-oriented approach, demonstrating confidence in tackling challenges.
- English language skills at a minimum of level C1.

Preferred skills that would be advantageous include:

- Proficiency in Python.
- Previous experience working in a multinational company.
- Prior exposure to Celonis Process Mining solutions.

In this role, you will not have any supervisory or budgetary responsibilities. The job operates in a professional office environment, and the position offers a hybrid work model with 3 days in the office and 2 days remote.
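As a flavor of the ETL work behind a Celonis data model, here is a minimal, self-contained sketch that builds a process-mining event log (case id, activity, timestamp) with plain SQL, run here against an in-memory SQLite stand-in. The SAP-style table and field names are illustrative assumptions:

```python
import sqlite3

# In-memory stand-in for the source-system extraction layer
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchase_orders (ebeln TEXT, created_at TEXT);
CREATE TABLE goods_receipts  (ebeln TEXT, posted_at  TEXT);
INSERT INTO purchase_orders VALUES ('PO-1', '2024-01-02 09:00');
INSERT INTO goods_receipts  VALUES ('PO-1', '2024-01-10 14:30');
""")

# A process-mining event log needs three core columns:
# case id, activity name, and event timestamp.
ACTIVITY_SQL = """
SELECT ebeln AS case_id, 'Create PO' AS activity, created_at AS event_time
FROM purchase_orders
UNION ALL
SELECT ebeln, 'Goods Receipt', posted_at
FROM goods_receipts
ORDER BY case_id, event_time;
"""

for case_id, activity, ts in conn.execute(ACTIVITY_SQL):
    print(case_id, activity, ts)
```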

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

You are a skilled and experienced Tableau Developer with a strong background in SQL and the ability to transform data into actionable insights. Your responsibilities include developing interactive dashboards and reports and conducting complex data analysis to support business decision-making.

Key Responsibilities:

Tableau Development:
- Design, develop, and maintain interactive dashboards and reports in Tableau.
- Create ad-hoc reports and custom visualizations as per business requirements.
- Ensure dashboards are user-friendly, well-organized, and easily interpreted.
- Collaborate with business stakeholders to gather requirements and translate them into effective Tableau solutions.

SQL Development & Data Management:
- Write complex SQL queries to extract, manipulate, and analyze data from relational databases.
- Ensure data integrity and quality by applying efficient data validation techniques.
- Design, implement, and optimize SQL queries and stored procedures to improve report performance.
- Work with database administrators to ensure smooth integration of Tableau with various data sources.

Data Analysis & Reporting:
- Analyze large datasets to identify trends, patterns, and insights.
- Present findings and recommendations to business leaders in a clear and concise manner.
- Develop automated reporting processes and provide regular updates to key stakeholders.

Collaboration & Troubleshooting:
- Collaborate with business analysts, IT, and other stakeholders to ensure effective data flow and reporting solutions.
- Troubleshoot and resolve issues related to Tableau dashboards, SQL queries, and data sources.
- Provide user training and support for Tableau reports and dashboards.

Performance Optimization:
- Optimize performance of Tableau reports and dashboards to ensure faster load times and efficient querying.
- Use best practices for data visualization, performance tuning, and query optimization.

Skills and Qualifications:

Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Experience:
- Minimum of [X] years of experience as a Tableau Developer.
- Proven experience with SQL, including creating complex queries, stored procedures, and database management.

Technical Skills:
- Strong proficiency in Tableau, including calculated fields, data blending, and interactive dashboards.
- Expertise in SQL, including experience with MS SQL Server, MySQL, Oracle, or similar RDBMS.
- Experience with ETL processes and data warehousing concepts.
- Familiarity with Python, R, or other data analytics tools is a plus.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication skills and the ability to work with business users to gather requirements.
- Ability to prioritize tasks and manage time effectively in a fast-paced environment.
- Strong attention to detail and commitment to data quality.

Preferred Qualifications:
- Knowledge of data modeling and database design principles.
- Tableau Server or Tableau Online administration experience.

Why Join Us:
- Competitive salary and benefits package.
- Opportunity to work with cutting-edge technologies and grow your skill set.
- Collaborative and innovative work environment.
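As one example of the SQL work that typically feeds a Tableau dashboard, here is a minimal sketch of a window-function query computing a rolling revenue trend. The schema is an illustrative assumption, not from the posting:

```python
# A query of the kind a Tableau custom SQL data source might be built on:
# monthly revenue per region with a 3-month moving average.
TREND_SQL = """
SELECT
    region,
    DATE_TRUNC('month', order_date)            AS month,
    SUM(net_amount)                            AS revenue,
    AVG(SUM(net_amount)) OVER (
        PARTITION BY region
        ORDER BY DATE_TRUNC('month', order_date)
        ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
    )                                          AS revenue_3mo_avg
FROM orders
GROUP BY region, DATE_TRUNC('month', order_date)
ORDER BY region, month;
"""
print(TREND_SQL)  # in practice, registered as a custom SQL source in Tableau
```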

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a highly skilled BI Lead with 8-10 years of experience, you will oversee our BI initiatives, with a particular focus on Power BI and Qlik. Your primary role will involve leading the migration of existing reports to Power BI and providing technical leadership to a BI development team. You must possess a deep understanding of BI tools, data modeling, reporting, and data visualization best practices.

Your responsibilities will include overseeing the end-to-end design, development, and deployment of business intelligence solutions to ensure alignment with business objectives and requirements. You will lead and manage the migration of reports and dashboards, ensuring a smooth transition with minimal disruption, and leverage your expertise in designing and developing complex dashboards and reports in Power BI and Qlik to meet business needs.

Furthermore, you will design and implement data models, integrating data from multiple sources for reporting and analytics while ensuring optimal performance and scalability. Mentoring and providing technical guidance to a team of BI developers will be a key aspect of your role, as will collaborating with stakeholders from various departments to understand their reporting needs and provide data-driven solutions. Your experience with diverse databases and environments such as MongoDB, PostgreSQL, AWS, NoSQL, and Redshift will be beneficial, and you should be proficient in developing advanced Tableau dashboards and BOBJ reports and in leading BI development projects. Improving processes, optimizing performance, and identifying opportunities for enhancement will also be part of your responsibilities.

Moreover, you will engage in presales activities, including effort estimation, solution design, and proposal development to support business acquisition and client engagements. Effective communication with team members, scheduling, load balancing, and maintaining documentation according to IS methodology are essential for successful project management.

In summary, as a BI Lead, you will drive BI projects, facilitate migration to Power BI, offer expertise in Power BI and Qlik, promote data integration and modeling, lead and mentor a development team, and collaborate with stakeholders to deliver effective business intelligence solutions.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Modeler - Erwin plays a crucial role in the design and implementation of data models that meet organizational needs. You will be responsible for translating business requirements into well-structured, reusable data models while ensuring data integrity and efficiency. Working closely with stakeholders, you will gather data requirements and translate them into models that inform database architectures. Utilizing the Erwin tool, you will enhance data management strategies and ensure compliance with governance standards. Your role is vital in supporting the company's ability to make data-driven decisions and derive insights that align with strategic objectives.

Key Responsibilities:

- Design and maintain logical and physical data models using Erwin.
- Collaborate with business analysts and stakeholders to gather data requirements and translate business processes into comprehensive data models.
- Ensure data integrity, quality, and security in all modeling activities and implement best practices for data governance and management.
- Develop and update metadata associated with data models and provide technical support for database design and implementation.
- Conduct data profiling and analysis to define requirements, create data flow diagrams and entity-relationship diagrams, and review and refine data models with stakeholders and development teams.
- Perform impact analysis for changes in the modeling structure, train and mentor junior data modeling staff, and ensure compliance with data standards and regulations.
- Collaborate with ETL developers to optimize data extraction processes and document modeling processes, methodologies, and standards for reference.

Required Qualifications:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience as a data modeler or in a related role, with proven expertise in using Erwin for data modeling.
- Strong knowledge of relational databases and SQL, experience in data architecture and database design principles, and familiarity with data warehousing concepts and practices.
- Ability to analyze complex data structures, recommend improvements, and understand data governance frameworks and best practices, along with excellent analytical and problem-solving skills.
- Strong communication and documentation skills, the ability to work collaboratively in a team-oriented environment, experience with data integration and ETL processes, and the ability to manage multiple projects and deadlines effectively.
- Familiarity with data visualization and reporting tools is a plus, as is a willingness to keep skills updated through ongoing training; certification in Data Modeling or equivalent is desirable.

Skills: entity-relationship diagrams, data modeling, documentation skills, database design principles, ETL processes, SQL proficiency, data integration, data architecture, DAX, database design, data governance, data security, SQL, Power Query, data governance frameworks, relational databases, analytical skills, problem-solving, data quality, communication skills, data warehousing, analytical thinking, data flow diagrams, team collaboration, Erwin, data profiling

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a SAP HANA Database Developer with 3-5 years of experience, you will be part of a dynamic team responsible for designing, developing, and maintaining high-performance databases, optimizing SQL queries, and supporting SAP HANA-based projects. Your main focus will be on performance tuning, data modeling, and integration.

Your key responsibilities will include designing, developing, and optimizing HANA database objects, troubleshooting SQL queries, developing data models and ETL processes, providing day-to-day support for SAP HANA systems, integrating databases with SAP applications and third-party tools, participating in system upgrades and migrations, maintaining documentation, and collaborating with cross-functional teams.

To excel in this role, you must have hands-on experience in SAP HANA database development; strong knowledge of SAP HANA architecture, SQL, HANA Studio, and SQLScript; proven experience in performance tuning; expertise in data modeling; proficiency in ETL processes; and a solid understanding of database management and security protocols. Problem-solving skills, communication skills, and a Bachelor's degree in Computer Science or a related field are also required.

Preferred skills include experience with SAP S/4HANA or SAP BW/4HANA, familiarity with cloud platforms and services, knowledge of SAP Fiori and SAP UI5, experience with Agile development methodologies, and certifications related to SAP HANA.
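As a small illustration of the SQLScript work this role centers on, here is a hedged sketch that runs an anonymous SQLScript block from Python via SAP's hdbcli driver. The host, credentials, schema, and tables are placeholders, and this assumes access to a HANA 2.0+ instance:

```python
from hdbcli import dbapi  # SAP's Python driver for HANA

# Placeholder connection details, not a real system
conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="DEV_USER", password="secret")

# Anonymous SQLScript block: aggregate into a reporting table
SQLSCRIPT = """
DO BEGIN
    sales_agg = SELECT region, SUM(amount) AS total_amount
                FROM "SALES"."ORDERS"
                GROUP BY region;
    DELETE FROM "SALES"."REGION_TOTALS";
    INSERT INTO "SALES"."REGION_TOTALS"
        SELECT region, total_amount FROM :sales_agg;
END;
"""

cursor = conn.cursor()
cursor.execute(SQLSCRIPT)
conn.commit()
cursor.close()
conn.close()
```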

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Our Team

Our global team supports education electronic health record products that introduce students to digital charting and prepare them to document care in today's modern clinical environment. We have a very stable product that we've worked hard to achieve and strive to maintain. Our team values trust, respect, collaboration, agility, and quality.

The Consumption Domain is a newly established domain, offering an exciting opportunity to play a crucial role in structuring and shaping its foundation. Our team is responsible for ensuring seamless data processing, validation, and operational efficiency, while continuously improving workflow optimization and incident management. We work closely with various stakeholders to drive accuracy, speed, and reliability in delivering high-quality data. With a problem-solving mindset and a data-driven approach, we aim to build scalable solutions that enhance business processes and improve the overall user experience.

About The Role

Elsevier is looking for a Senior Analyst to join the Consumption Domain team, where you will play a crucial role in analyzing and interpreting user engagement and content consumption trends. The ideal candidate possesses strong technical expertise in data analytics, Databricks, ETL processes, and cloud storage, coupled with a passion for using data to drive meaningful business decisions.

Responsibilities

- Analyze and interpret large datasets to provide actionable business insights.
- Leverage Databricks, working with RDDs, DataFrames, and Datasets to optimize data workflows.
- Design and implement ETL processes, job automation, and data optimization strategies.
- Work with structured, unstructured, and semi-structured data types, including JSON, XML, and RDF.
- Manage various file formats (Parquet, Delta files, ZIP files) and handle data storage within DBFS, FileStore, and cloud storage solutions (AWS S3, Google Cloud Storage, etc.).
- Write efficient SQL, Python, or Scala scripts to extract and manipulate data.
- Develop insightful dashboards using Tableau or Power BI to visualize key trends and performance metrics.
- Collaborate with cross-functional teams to drive data-backed decision-making.
- Maintain best practices in data governance, utilizing platforms such as Snowflake and Collibra.
- Participate in Agile development methodologies, using tools like Jira and Confluence.
- Ensure proper version control using GitHub.

Requirements

- Bachelor's degree in Computer Science, Data Science, or Statistics.
- Minimum of 4-5 years of experience in data analytics, preferably in a publishing environment.
- Proven expertise in Databricks, including knowledge of RDDs, DataFrames, and Datasets.
- Strong understanding of ETL processes, data optimization, job automation, and Delta Lake.
- Proficiency in handling structured, unstructured, and semi-structured data.
- Experience with various file types, including Parquet, Delta, and ZIP files.
- Familiarity with DBFS, FileStore, and cloud storage solutions such as AWS S3 and Google Cloud Storage.
- Strong programming skills in SQL, Python, or Scala.
- Experience creating dashboards using Tableau or Power BI is a plus.
- Knowledge of Snowflake, Collibra, JSON-LD, SHACL, and SPARQL is an advantage.
- Familiarity with Agile development methodologies, including Jira and Confluence.
- Experience with GitLab for version control is beneficial.

Skills And Competencies

- Ability to handle large datasets efficiently.
- Strong analytical and problem-solving skills.
- Passion for data-driven decision-making and solving business challenges.
- Eagerness to learn new technologies and continuously improve processes.
- Effective communication and data storytelling abilities.
- Experience collaborating with cross-functional teams.
- Project management experience in software systems is a plus.

Work in a Way That Works for You

We promote a healthy work/life balance across the organization and offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance, and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.

Working for You

We understand that your well-being and happiness are essential to a successful career. Here are some benefits we offer:

- Comprehensive Health Insurance: Covers you, your immediate family, and parents.
- Enhanced Health Insurance Options: Competitive rates negotiated by the company.
- Group Life Insurance: Ensuring financial security for your loved ones.
- Group Accident Insurance: Extra protection for accidental death and permanent disablement.
- Flexible Working Arrangement: Achieve a harmonious work-life balance.
- Employee Assistance Program: Access support for personal and work-related challenges.
- Medical Screening: Your well-being is a top priority.
- Modern Family Benefits: Maternity, paternity, and adoption support.
- Long-Service Awards: Recognizing dedication and commitment.
- New Baby Gift: Celebrating the joy of parenthood.
- Subsidized Meals in Chennai: Enjoy delicious meals at discounted rates.
- Various Paid Time Off: Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays.
- Free Transport: Pick-up and drop between home and office (applies in Chennai).

About The Business

A global leader in information and analytics, we help researchers and healthcare professionals advance science and improve health outcomes for the benefit of society. Building on our publishing heritage, we combine quality information and vast data sets with analytics to support visionary science and research, health education and interactive learning, as well as exceptional healthcare and clinical practice. What you do every day will help advance science and healthcare to advance human progress.
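As a small illustration of the Databricks-style workflow this role describes (JSON input, DataFrame transforms, Delta output), here is a hedged PySpark sketch. The paths and schema are assumptions, and the delta format presumes a Databricks or Delta Lake-enabled runtime:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("consumption_etl").getOrCreate()

# Semi-structured input: one JSON record per line (path is illustrative)
events = spark.read.json("dbfs:/FileStore/raw/usage_events/")

# Typical cleanup: type casting, de-duplication, and an aggregation
daily = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .dropDuplicates(["event_id"])
          .groupBy("event_date", "content_id")
          .agg(F.countDistinct("user_id").alias("unique_users"),
               F.count("*").alias("views"))
)

# Write as Delta so downstream dashboards read a consistent snapshot
daily.write.format("delta").mode("overwrite").save(
    "dbfs:/FileStore/curated/daily_consumption/"
)
```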

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Location - Airoli, Mumbai
Contract Duration - 9 months (extendable to a year)

Key Responsibilities:

- Design and implement scalable data intelligence platforms that integrate structured and unstructured data sources.
- Develop and fine-tune Generative AI models (LLMs, GANs, diffusion models) for tasks such as text generation, synthetic data creation, and content automation.
- Collaborate with data engineers, ML engineers, and product teams to deploy AI solutions into production environments.
- Build and maintain data pipelines, feature stores, and model monitoring systems.
- Apply prompt engineering techniques to optimize LLM performance for business use cases.
- Conduct experiments and A/B testing to evaluate model performance and business impact.
- Ensure compliance with data governance, privacy, and ethical AI standards.
- Bring expertise in AKS and be able to integrate the solution into the current landscape.

Required Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- 3-5 years of experience in data science, machine learning, or AI engineering.
- Hands-on experience with LLMs (e.g., GPT, Claude, LLaMA) and frameworks like Hugging Face, LangChain, or OpenAI APIs.
- Proficiency in Python, SQL, and cloud platforms (AWS, Azure, or GCP).
- Strong understanding of data modeling, ETL processes, and MLOps.
- Experience with vector databases, embedding models, and retrieval-augmented generation (RAG) is a plus.

Preferred Skills:

- Experience with knowledge graphs, semantic search, or data fabric architectures.
- Familiarity with AutoML, RLHF, or multi-modal AI.
- Excellent communication and stakeholder management skills.
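As an illustration of the retrieval-augmented generation (RAG) pattern mentioned above, here is a minimal, hedged Python sketch using the OpenAI API with a brute-force vector search in NumPy. The model names and documents are assumptions, and a production system would use a real vector database:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

docs = [
    "Invoices are approved by the finance team within 3 business days.",
    "Purchase orders above $10,000 require director sign-off.",
]

# Embed the document corpus once (model name is an assumption)
doc_emb = np.array([
    d.embedding
    for d in client.embeddings.create(model="text-embedding-3-small",
                                      input=docs).data
])

def answer(question: str) -> str:
    # Embed the query and retrieve the most similar document (cosine similarity)
    q = np.array(client.embeddings.create(model="text-embedding-3-small",
                                          input=[question]).data[0].embedding)
    sims = doc_emb @ q / (np.linalg.norm(doc_emb, axis=1) * np.linalg.norm(q))
    context = docs[int(np.argmax(sims))]

    # Ground the LLM's answer in the retrieved context
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system", "content": f"Answer using: {context}"},
                  {"role": "user", "content": question}],
    )
    return resp.choices[0].message.content

print(answer("Who approves large purchase orders?"))
```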

Posted 1 week ago

Apply

10.0 - 12.0 years

4 - 8 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities:

- Lead the design, development, and optimization of complex data solutions using Python, Spark, SQL, Snowflake, dbt, machine learning, and other relevant technologies.
- Apply deep domain expertise in operations organizations, particularly in functions like supply chain, product life cycle management, and manufacturing, to understand data requirements and deliver tailored solutions.
- Utilize big data processing frameworks such as Apache Spark to process and analyze large volumes of operational data efficiently.
- Implement advanced data modeling, machine learning algorithms, and predictive analytics to derive actionable insights and drive operational decision-making.
- Leverage cloud-based data platforms such as Snowflake to store, manage, and analyze structured and semi-structured operational data at scale.
- Utilize dbt (Data Build Tool) for data modeling, transformation, and documentation to ensure data consistency, quality, and integrity.
- Monitor and optimize complex data pipelines, ETL processes, and machine learning models for performance, scalability, and reliability in operations contexts.
- Conduct data profiling, cleansing, and validation to ensure data quality and integrity across different operational data sets.
- Collaborate closely with cross-functional teams, including operations stakeholders, data scientists, business analysts, and IT teams, to understand operational challenges and deliver actionable insights.
- Stay updated on emerging technologies, best practices, and industry trends in data engineering, machine learning, and operations management, contributing to continuous improvement and innovation within the organization.

All listed requirements are deemed essential functions of this position; however, business conditions may require reasonable accommodations for additional tasks and responsibilities.

Preferred Experience/Education/Skills:

- Bachelor's degree in Computer Science, Engineering, Operations Management, or a related field; an advanced degree (e.g., Master's or PhD) is preferred.
- 10+ years of experience in data engineering, with proficiency in Python, Spark, SQL, Snowflake, dbt, machine learning, and other relevant technologies.
- Strong domain expertise in operations organizations, particularly in functions like supply chain, product life cycle management, and manufacturing.
- Strong domain expertise in life sciences manufacturing equipment, with a deep understanding of industry-specific challenges, processes, and technologies.
- Experience with big data processing frameworks such as Apache Spark and cloud-based data platforms such as Snowflake.
- Hands-on experience with data modeling, ETL development, machine learning, and predictive analytics in operations contexts.
- Familiarity with dbt (Data Build Tool) for managing data transformation and modeling workflows.
- Excellent problem-solving skills, analytical thinking, and attention to detail.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and operations stakeholders.
- Experience with SAP, SAP HANA, and Teamcenter applications is a plus.
- Eagerness to learn and adapt to new technologies and tools in a fast-paced environment.
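As a concrete flavor of the data profiling and cleansing work listed above, here is a minimal PySpark sketch that computes null counts per column and filters out rows violating basic business rules. The dataset and rules are illustrative assumptions; a real job would read from Snowflake or S3:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ops_data_profiling").getOrCreate()

# Illustrative operational dataset
wo = spark.createDataFrame(
    [("WO-1", "assembly", 12.5), ("WO-2", None, 7.0), ("WO-3", "packaging", -1.0)],
    ["work_order_id", "step", "cycle_time_min"],
)

# Profile: null count per column in a single pass
null_counts = wo.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in wo.columns]
)
null_counts.show()

# Cleanse: drop rows that violate basic business rules
clean = wo.filter(F.col("step").isNotNull() & (F.col("cycle_time_min") >= 0))
print(f"kept {clean.count()} of {wo.count()} rows")
```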

Posted 1 week ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities:

Leadership & Strategy:
- Develop and own the enterprise-wide Master Data Management strategy and roadmap.
- Lead and mentor a team of MDM analysts and specialists.
- Partner with stakeholders across departments (Operations, Engineering, Supply Chain, IT, Compliance) to define data governance policies and business rules.

Operations & Execution:
- Oversee end-to-end master data processes, including material master, BOMs, customers, vendors, and other key data domains.
- Ensure the timely creation, maintenance, and retirement (end of life) of master data records in systems such as SAP, Teamcenter, and ERP platforms.
- Implement data quality controls, audit processes, and exception handling workflows.
- Resolve data-related issues in a timely and accurate manner.

Governance & Compliance:
- Define and enforce data standards, naming conventions, and metadata structures.
- Collaborate with compliance and regulatory teams to ensure master data adheres to relevant policies and regulations (e.g., FDA, ISO).
- Maintain documentation for standard operating procedures and process flows.

Systems & Tools:
- Serve as the subject matter expert for data models and integrations between SAP, Teamcenter, and other ERP systems.
- Work with IT and digital transformation teams on MDM tool implementation and automation opportunities.

Qualifications:

Education & Experience:
- Bachelor's degree in Information Systems, Engineering, Business, or a related field (Master's preferred).
- 7+ years of experience in Master Data Management, with at least 2-3 years in a leadership or managerial role.
- Experience working with SAP, Teamcenter, or other PLM/ERP systems is required.
- Experience in life sciences, manufacturing, or highly regulated industries is preferred.

Skills & Competencies:
- Strong understanding of master data domains (material, BOM, customer, vendor, etc.).
- Proficient in data governance frameworks and data quality management.
- Excellent problem-solving, communication, and interpersonal skills.
- Demonstrated ability to lead cross-functional initiatives and manage competing priorities.
- Experience with MDM tools (e.g., SAP Master Data Governance, SAP S/4HANA) is a plus.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

You will be responsible for data engineering and integration development: designing, developing, and optimizing data pipelines and ETL workflows to support People Systems data needs. You will collaborate with development teams to configure and maintain data platforms such as Snowflake, Informatica, Power BI, and Domo, and assist in defining data models, schemas, and transformation logic to enable effective reporting and analytics. Your role will also involve implementing API-based and batch integrations between Oracle Fusion Cloud HCM, Learning Pool (Learn!Now), and other enterprise systems.

In the capacity of solution design and implementation, you will analyze business and functional requirements to develop scalable data and integration solutions for People Systems applications. You will contribute to technical design decisions, ensuring solutions are efficient, secure, and aligned with enterprise architecture, and develop and maintain technical documentation, including data flow diagrams, API specifications, and system workflows.

As part of an Agile Scrum team, you will participate in sprint planning, backlog grooming, and retrospectives. You will collaborate with Product Owners and the Yum! Data Science team to refine data requirements and identify opportunities for process automation, and support the Associate Manager, Data Engineering, in executing data strategy and roadmap initiatives. You will monitor and optimize ETL jobs, data pipelines, and integration performance to enhance efficiency and reduce latency, and troubleshoot and resolve data quality and integration issues in collaboration with development teams. Implementing best practices for data governance, security, and compliance is essential.

Stakeholder engagement and communication are vital components of this role. You will work closely with the Associate Manager, System Design, and U.S.-based technical leadership to align solutions with business priorities, translate complex technical concepts into business-friendly language, and ensure alignment with the security, compliance, and performance standards set by the enterprise.

Minimum Requirements:
- BE/BTech in CS/IT, Information Systems, or Data Engineering.
- 8-10 years of experience in data engineering, system design, and technical solutioning.
- Experience with ETL processes, data pipelines, and integration architectures.

Technical Skills:
- Strong expertise in SQL, Python, or similar programming languages for data transformation.
- Hands-on experience with data integration tools (e.g., Informatica, API-based integrations, Snowflake, Power BI, Domo).
- Experience with cloud-based platforms (AWS, Azure) and API development.
- Familiarity with Agile methodologies and tools like Azure DevOps or Jira.
- Preferred but not required: familiarity with Oracle Fusion Cloud HCM and Learning Pool (Learn!Now).

Soft Skills:
- Strong analytical and problem-solving skills.
- Effective communication and collaboration abilities, especially with remote and cross-functional teams.
- Ability to work independently while aligning with team goals and business needs.

Note: Early joiners preferred; 4 days (Mon-Thu) work from office preferred. Final interviews will be scheduled at our Gurugram office in the week starting 3rd March 2025.
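To give a flavor of the API-based batch integration work described, here is a minimal, hedged Python sketch that pulls records from a paginated REST endpoint and stages them as CSV for a warehouse load. The URL, fields, and pagination scheme are hypothetical, not the actual Oracle Fusion or Learn!Now APIs:

```python
import csv
import requests

API_URL = "https://api.example.com/v1/learning-records"  # placeholder endpoint

def fetch_all(page_size: int = 100):
    """Pull every page from the (hypothetical) paginated endpoint."""
    offset = 0
    while True:
        resp = requests.get(API_URL,
                            params={"limit": page_size, "offset": offset},
                            timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return
        yield from batch
        offset += page_size

# Stage to CSV; a warehouse COPY/PUT step would load this file next
with open("learning_records.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["employee_id", "course_id", "completed_at"])
    writer.writeheader()
    for rec in fetch_all():
        writer.writerow({k: rec.get(k) for k in writer.fieldnames})
```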

Posted 1 week ago

Apply

6.0 - 12.0 years

25 Lacs

Hyderabad, Telangana, India

On-site

ROLE SUMMARY

Sompo is seeking a Data & BI QA Tester to participate in quality assurance processes, performing tests of reports, data transformation, and loads to identify, isolate, and help resolve issues. This position will also suggest root-cause scenarios.

ROLE RESPONSIBILITIES

- Work with different IT teams to verify that reports and data extract, transformation, and load steps conform to specified exit criteria and standards.
- Design and execute test scenarios; develop and document data test plans based on requirements and technical specifications.
- Identify, analyze, and document all defects. Perform triage to ascertain possible causes for errors.
- Record and document results of test execution, distributing the defect tracking report.
- Ensure compliance with general quality assurance best practices, accepted industry standards, and those standards set forth by upstream sources.
- Create SQL queries to validate results against source data.

TECHNICAL QUALIFICATIONS

- 5 years of experience with commonly used Microsoft Office suites.
- 5 years of experience testing data transformations.
- 5 years of knowledge and understanding of basic SQL.
- 3 years of experience testing reports (using tools like Cognos and MicroStrategy).
- Working knowledge of ETL processes.
- Good understanding of SDLC methodology.
- Knowledge of Property and Casualty Insurance is a plus.
- Experience with a defect tracking system like TFS.

GENERAL QUALIFICATIONS

- 5 years of experience documenting test results.
- 5 years of experience writing manual test cases.
- 5 years of experience executing manual and automated test cases.
- Excellent written and verbal communication skills.
- Excellent problem-solving and analytical skills.
- Ability to create and execute a test plan.
- Ability to work with a team to test reports and data loads and determine root causes of failures.

EDUCATION REQUIREMENTS

- Bachelor's degree in Computer Science or equivalent required.
- Certification in Quality Assurance desired.
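As an illustration of the SQL validation this role performs, here is a minimal, self-contained Python sketch that reconciles a target table against its source after a load, checking row counts and a column checksum. The tables and tolerance are illustrative assumptions, with SQLite standing in for the warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_policies (policy_id TEXT, premium REAL);
CREATE TABLE tgt_policies (policy_id TEXT, premium REAL);
INSERT INTO src_policies VALUES ('P1', 100.0), ('P2', 250.5);
INSERT INTO tgt_policies VALUES ('P1', 100.0), ('P2', 250.5);
""")

def one(sql):
    return conn.execute(sql).fetchone()[0]

# Check 1: row counts must match after the load
src_n = one("SELECT COUNT(*) FROM src_policies")
tgt_n = one("SELECT COUNT(*) FROM tgt_policies")
assert src_n == tgt_n, f"row count mismatch: source={src_n}, target={tgt_n}"

# Check 2: a numeric checksum guards against silent value corruption
src_sum = one("SELECT SUM(premium) FROM src_policies")
tgt_sum = one("SELECT SUM(premium) FROM tgt_policies")
assert abs(src_sum - tgt_sum) < 0.01, "premium totals diverge between source and target"

print("load validation passed")
```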

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. Your unique voice and perspective are key in helping EY become even better. Join us in building an exceptional experience for yourself and contribute to creating a better working world for all.

The Reporting and Data Analytics (R&DA) team at EY is responsible for delivering key management and financial reporting across the firm. As a PBI and SSRS Report Developer - Supervising Associate, you will work with a team of developers to support and build reports using Power BI and SSRS, generated from pre-prepared datasets or through SQL and Power Query processes. You will collaborate with the broader R&DA technical team to ensure the underlying reporting environment, datasets, and data refresh processes are maintained, and work closely with the Product Line teams to interpret and refine customer requirements into technical specifications and reporting applications.

Your responsibilities will include delivering intuitive reporting solutions, managing ETL processes, collaborating with different teams, translating requirements into work estimates, investigating and resolving data issues, ensuring data security, and managing the development cycle. To succeed in this role, you should have advanced skills in Power BI report development, experience with DAX, knowledge of ETL processes, proficiency in SQL queries, effective communication skills, and the ability to adapt to changing priorities. Experience with Power Platform tools, project management, Azure DevOps, and Visual Studio would be beneficial.

You should have at least 5 years of experience in a finance or technical department, with high motivation to learn and collaborate with experienced colleagues. This role offers continuous learning opportunities, transformative leadership experiences, and a diverse and inclusive culture. Join us at EY to contribute to building a better working world where trust, value, and innovation thrive through the collective efforts of diverse and talented teams across the globe.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a SAP SAC Developer in Bangalore, you will collaborate with functional and technical teams to integrate SAP Analytics Cloud with other SAP and non-SAP systems. Your responsibilities will include performing data analysis, data cleansing, and data transformation tasks to ensure data accuracy and reliability within SAP Analytics Cloud. Additionally, you will provide ongoing support and maintenance for SAP Analytics Cloud solutions, which involves troubleshooting, resolving issues, and enhancing existing functionalities, and you will be expected to train teams and stakeholders to keep them up to date.

Your primary skills should include a very good understanding of data visualization techniques, dashboarding, and KPIs. You must have extensive hands-on experience with SAP Analytics Cloud, covering data modeling, story creation, dashboard design, and administration. A good understanding of data visualization and analytics best practices is essential for designing compelling and user-friendly dashboards and visualizations. Strong knowledge of data connections and data integration with SAP and non-SAP systems, including experience with data extraction, transformation, and loading (ETL) processes, is required, as is proficiency in SQL, relational databases, and data warehousing concepts.

You should have strong problem-solving skills and the ability to analyze complex business requirements and translate them into technical solutions. Building and maintaining clear and up-to-date documentation for code, configurations, and processes is another crucial aspect of this role. Functional knowledge of the finance domain and SAP Analytics Cloud certification will be advantageous. A minimum of 5-6 years of experience in reporting tools like SAP BusinessObjects or Lumira, with 3-4 years of experience in SAC, is preferred.

Apart from the primary skills, you are expected to possess excellent verbal and written communication skills, collaborate in a working environment split between onshore and offshore teams, and work closely with cross-functional teams to understand and address business requirements.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for facilitating client workshops and gathering functional requirements, converting them into technical designs and compiling a solution blueprint. Your role will involve defining system integration processes and developing those processes from all source systems as required. You will implement and test solution models to enable business planning and reporting based on the agreed solution design, and compile all functional and technical specifications and documentation, including the Blueprint, Unit Test Cases, UAT Test Cases, Functional Design Specifications, Technical Design Specifications, and training material for implemented content.

As a SAP Analytics Cloud Solution Architect, you will perform quality assurance testing on other consultants' development to ensure adherence to relevant project governance metrics. You will keep technology and service managers informed of key customer issues and collaborate with project management on the creation of project plans, reporting project status, issues, risks, and benefits. Furthermore, you will manage and mentor other team members while playing the role of technology expert and solution owner.

Your primary skills should include a very good understanding of data visualization techniques, dashboarding, and KPIs. You should have extensive hands-on experience with SAP Analytics Cloud, including data modeling, story creation, dashboard design, and administration. A good understanding of data visualization and analytics best practices is essential, with the ability to design compelling and user-friendly dashboards and visualizations. Strong knowledge of data connections and data integration with SAP and non-SAP systems, including experience with data extraction, transformation, and loading (ETL) processes, is required, along with proficiency in SQL and experience working with relational databases and data warehousing concepts.

You should possess strong problem-solving skills and the ability to analyze complex business requirements and translate them into technical solutions. Building and maintaining clear and up-to-date documentation for code, configurations, and processes is part of the role. Functional knowledge of the finance domain and SAP Analytics Cloud certification are added advantages. A minimum of 5-6 years of experience in reporting tools like SAP BusinessObjects or Lumira, with 7-8 years of experience in SAC, is required.

Your secondary skills should include sound judgment, logical thinking, and strong analytical skills. Excellent verbal and written communication skills are essential, along with the ability to collaborate in a working environment split between onshore and offshore teams and to work closely with cross-functional teams to understand and address business requirements.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As the Manager, Data Engineering at Mr. Cooper Group, you will lead a team of data engineers dedicated to designing, developing, and maintaining robust data systems and pipelines. Your role is crucial in ensuring the seamless collection, transformation, and storage of data, making it readily accessible for analytics and decision-making throughout the organization. Collaborating closely with data scientists, analysts, and other stakeholders, you will guarantee that the data infrastructure aligns with business needs, emphasizing scalability, reliability, and efficiency.

Your responsibilities will include team leadership, overseeing data infrastructure management, fostering collaboration across departments, driving process improvements, ensuring data governance, selecting appropriate technologies, and managing data engineering projects effectively. Your technical expertise should encompass a strong grasp of data engineering concepts; proficiency in SQL, Python, and other relevant tools; and experience in data warehousing, ETL processes, and data pipeline design. Moreover, your leadership skills, problem-solving abilities, communication proficiency, and knowledge of data architecture will be instrumental in fulfilling the requirements of this role.

The ideal candidate should possess over 5 years of experience in data engineering, with a minimum of 2 years in a managerial or leadership capacity. Preferred qualifications include familiarity with Big Data technologies like Hadoop and Spark, expertise in data modeling and governance best practices, and a background in machine learning concepts and their integration with data pipelines. Experience with cloud platforms such as AWS, Google Cloud, or Azure is considered advantageous.

If you are passionate about leveraging data to drive business decisions, thrive in a collaborative environment, and are eager to contribute to the dream of homeownership, Mr. Cooper Group welcomes you to join our team in Chennai, Tamil Nadu, India.

Requisition ID: 023563
Job Category: Information Technology
Location: Chennai, Tamil Nadu, 600089, India

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer & Architect with 10-12 years of experience, you will be responsible for designing and implementing enterprise-grade Data Lake solutions using AWS technologies such as S3, Glue, and Lake Formation. Your expertise in building Data Lakes and proficiency with AWS tools like S3, EC2, Redshift, Athena, and Airflow will be essential in optimizing cloud infrastructure for performance, scalability, and cost-effectiveness.

You will be required to define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, compute, and processing. Developing and maintaining ETL pipelines using tools like AWS Glue, creating robust Data Warehousing solutions using Redshift, and ensuring high data quality and integrity across all pipelines will be key aspects of your role. Collaborating with business stakeholders to define key metrics and deliver actionable insights, and designing and deploying dashboards and visualizations using tools like Tableau, Power BI, or Qlik, will be part of your responsibilities, along with implementing best practices for data encryption, secure data transfer, and role-based access control to maintain data security.

As a Senior Data Engineer & Architect, you will lead audits and compliance certifications, work closely with cross-functional teams including Data Scientists, Analysts, and DevOps engineers, and mentor junior team members. Your role will also involve partnering with stakeholders to define and align data strategies that meet business objectives.

Clovertex offers a competitive salary and benefits package, reimbursement for AWS certifications, a hybrid work model for maintaining work-life balance, and health insurance and benefits for employees and dependents. If you have a Bachelor of Engineering degree in Computer Science or a related field, an AWS Certified Solutions Architect - Associate certification, and experience with Agile/Scrum methodologies, this role is perfect for you.
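As a hedged illustration of querying a data lake of the kind this role builds, here is a minimal boto3 sketch that runs an Athena query over an S3-backed table. The database, table, region, and bucket names are placeholders, and it assumes AWS credentials are configured:

```python
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

# Start the query; results land in the (placeholder) S3 output location
qid = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) AS orders FROM orders GROUP BY region",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query finishes (simple loop, not production-grade retry logic)
while True:
    state = athena.get_query_execution(
        QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```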

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

Are you ready to contribute to Mondelēz International's mission to lead the future of snacking with pride? As a Data Scientist on the team, you will play a key role in analyzing large amounts of data to identify patterns, trends, and insights that will help the business adapt to changing conditions. Your responsibilities will include defining analysis requirements, performing detailed analysis, conducting root-cause analysis, and working with various data science tools and techniques.

In this role, you will collaborate with cross-functional teams to develop cutting-edge Commercial Reporting solutions with Analytical AI and GenAI capabilities. You will leverage AI algorithms to gather insights, identify anomalies, and build GenAI capabilities that will enhance the company's reporting processes. Your proficiency in designing and deploying Analytical AI solutions and familiarity with GenAI concepts will be crucial in translating complex data scenarios into actionable business strategies.

Your job requirements will involve designing and implementing AI models, leveraging GenAI capabilities, collaborating with multiple teams for smooth implementation, updating AI solutions with new data sources, and maintaining documentation of AI models and GenAI solutions. You will also be responsible for reporting on tool achievements, challenges, and enhancements to senior management and stakeholders.

To excel in this role, you should have a Bachelor's degree in Information Systems/Technology, Computer Science, Analytics, or a related field; strong analytical and critical thinking skills; proficiency in Python and SQL for handling large datasets; experience with data visualization tools like Tableau and Power BI; and familiarity with cloud platforms for deploying AI solutions. Additionally, knowledge of NLP techniques, recent advancements in GenAI, the ethical implications of AI, and an understanding of the consumer goods industry will be beneficial.

If you have a minimum of 6+ years of relevant experience and possess the skills and experience mentioned above, you are encouraged to apply for this exciting opportunity at Mondelēz International. Join us in empowering people to snack right with delicious, high-quality snacks made sustainably and enjoyed globally.
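To illustrate the anomaly-identification work this posting mentions, here is a minimal, hedged scikit-learn sketch using IsolationForest on synthetic sales figures. The data and contamination rate are assumptions for demonstration only:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic weekly sales: mostly normal, with two injected anomalies
sales = rng.normal(loc=1000, scale=50, size=(100, 1))
sales[[10, 55]] = [[300.0], [2500.0]]  # two outlier weeks

# IsolationForest flags points that are easy to isolate as anomalies
model = IsolationForest(contamination=0.05, random_state=0).fit(sales)
labels = model.predict(sales)  # -1 = anomaly, 1 = normal

for i in np.where(labels == -1)[0]:
    print(f"week {i}: sales={sales[i, 0]:.0f} flagged as anomalous")
```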

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You have over 8 years of experience and possess skills in Oracle PL/SQL, along with RevPro and RightRev experience. The job is located in Chennai and requires immediate availability for a hybrid work setup with morning and evening shifts.

As an Application Support Specialist, your key responsibilities include analyzing and resolving complex application issues utilizing your expertise in Oracle PL/SQL and Snowflake. You will be expected to perform advanced debugging, optimize SQL queries and stored procedures, troubleshoot data integrity and security issues, and collaborate with different teams to resolve technical problems efficiently. Automation of operational tasks, maintenance of technical documentation, data validation, and collaboration with vendors are also part of your role.

To qualify for this position, you should have a Bachelor's degree in Computer Science or a related field, 8+ years of experience in Application Support with a focus on Revenue Management Systems, and a strong background in Oracle PL/SQL development. Your expertise should include handling large datasets, optimizing queries, and troubleshooting performance issues. Proficiency in Snowflake data warehouse solutions, ETL processes, and database performance tuning is essential. Additionally, you should be willing to work in shifts, including nights, weekends, and public holidays.

Preferred qualifications include experience with cloud platforms, DevOps practices, CI/CD pipelines, APIs, and microservices architecture. Familiarity with containerization technologies like Docker and Kubernetes is a plus.

If you meet the required qualifications and are looking for a challenging opportunity in a fast-paced environment, please consider applying for this role by emailing us at shreshta@proqubix.com.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are a highly skilled and experienced Senior SQL Database Developer who will design, develop, and maintain complex SQL databases to ensure efficiency, security, and scalability. Deep expertise in SQL development, performance tuning, database design, and optimization, along with a solid understanding of database architecture, is crucial. You will work closely with cross-functional teams to deliver high-quality data solutions.

Your responsibilities will include developing and designing complex SQL queries, stored procedures, triggers, functions, and views. You will design, implement, and optimize relational database schemas, create and maintain database objects, and ensure data integrity and optimization. Performance tuning and optimization will be key: you will analyze query performance, optimize SQL statements, and use techniques like indexing and partitioning to improve database performance. You will also lead data integration efforts, support data migration projects, collaborate with various stakeholders, provide technical guidance, participate in code reviews, troubleshoot database issues, and document database structures and procedures.

Key qualifications include a Bachelor's degree in Computer Science or a related field, 5-7 years of experience in SQL development, and expertise in SQL, SQL Server, T-SQL, database performance tuning, troubleshooting, and security best practices. Experience with large-scale SQL databases, cloud databases, data integration, and ETL processes is preferred. Strong analytical, problem-solving, communication, and teamwork skills are essential.

This is a full-time position with flexible work hours; however, only work from the office is available, and occasional on-call support for production database systems may be required. You will benefit from a competitive salary and benefits package, opportunities for career growth, a dynamic work environment, and exposure to cutting-edge database technologies and large-scale systems.
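As a small illustration of the indexing work described above, here is a self-contained Python sketch using SQLite's EXPLAIN QUERY PLAN to show how an index changes a query from a full table scan to an index search. SQLite stands in for the production RDBMS here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 500, i * 1.5) for i in range(10_000)])

QUERY = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

def plan(sql):
    # The fourth column of EXPLAIN QUERY PLAN output is the plan detail
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

print("before index:", plan(QUERY))   # expect a full table SCAN

# The tuning step: add an index on the filtered column
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

print("after index: ", plan(QUERY))   # expect SEARCH ... USING INDEX
```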

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As a Senior Business Intelligence (BI) Analyst at Amherst, you will be a key player in turning raw data into actionable insights that drive business decisions. You will collaborate with various business units to gather reporting requirements and offer analytical support. Your responsibilities will include writing and optimizing SQL queries, designing and improving Tableau dashboards, working on data ETL processes, and translating business logic into data-driven solutions. You will also develop new metrics, drive process improvements, conduct in-depth analyses, and document data definitions and business rules.

The ideal candidate has a minimum of 4 years of hands-on experience in a BI Analyst or similar position, expertise in SQL and Tableau, and prior experience in the Banking or Financial Services industry. A strong analytical mindset, excellent communication skills, and stakeholder management abilities are essential, as is a Bachelor's degree in Computer Science, Statistics, Finance, or a related field.

In addition to technical skills, soft skills such as problem-solving, critical thinking, communication, teamwork, attention to detail, and the ability to work with complex data sets are highly valued. Proficiency in SQL, Python, VBA, and JavaScript, along with experience in Tableau and other BI tools like Power BI or Looker, is preferred. Exposure to Databricks for building and managing data pipelines and performing advanced analytics is a plus.

At Amherst, core values such as positive attitude, integrity, client-centricity, business acumen, communication, collaboration, execution, agility, and community are highly emphasized. The working shift for this role is the US shift (1:30 PM - 10:30 PM IST) with a flexible hybrid working model. If you are passionate about working with data, solving complex problems, and driving business decisions through insights, this role as a Senior BI Analyst at Amherst might be the perfect fit for you.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

We are seeking a highly skilled Tableau Lead Developer with over 8 years of experience in Tableau and more than 5 years in SQL and data modeling. Yavda Analytics is known for its Analytics & BI services in the US market and for a SaaS BI product, PBIVizEdit, with 400+ licensed customers globally, ranging from SMBs to Fortune 500 companies. With over 20 years of data and analytics experience, we offer top-notch analytics insights to our clients.

As a Tableau Lead Developer, your responsibilities will include designing, developing, and maintaining interactive dashboards and reports in Tableau to meet business requirements, and optimizing those dashboards for performance, scalability, and usability. You will develop complex calculations, parameters, LOD expressions, and custom visualizations, and collaborate with business analysts, data engineers, and stakeholders to translate business needs into meaningful visual analytics. You will also create and manage data models, data blending, and relationships within Tableau, implement governance and security protocols for Tableau Cloud/Server environments, and provide training and support to business users on self-service analytics.

The ideal candidate has at least 8 years of experience in Tableau development and BI reporting, strong proficiency in SQL (including writing complex queries, stored procedures, and performance tuning), and experience with databases such as SQL Server, Snowflake, Redshift, or BigQuery. A solid understanding of data warehousing, dimensional modeling, and ETL processes is crucial, along with familiarity with Tableau Cloud/Server administration, data visualization best practices, and UX/UI principles.

Preferred qualifications include Tableau Desktop & Server certification (such as Tableau Certified Data Analyst or Tableau Desktop Specialist), familiarity with cloud-based data platforms (AWS, Azure, GCP), experience with Alteryx, Power BI, or other BI tools, exposure to Python, R, or scripting languages for advanced analytics, and a background in marketing analytics, e-commerce, retail, or CPG.

This position offers remote, hybrid, or on-site work modes based on company policy. Join us at Yavda for career growth, mentorship from data leaders with over 20 years of industry experience, and 22 paid annual leave days compared to the industry standard of 15. If you are ready to bring your creativity, passion, and expertise to a company that is making a difference, apply today and tell us why you are the perfect fit for Yavda.
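
The LOD expressions mentioned above are often the trickiest part of Tableau development. As a rough analogy (an illustration we have chosen, not anything specified in the posting), a FIXED LOD expression such as { FIXED [Region] : AVG([Sales]) } computes an aggregate at a chosen level of detail and repeats it on every row, much like pandas' groupby().transform(); the data below is invented.

```python
import pandas as pd

# Hypothetical analogy for a Tableau FIXED LOD expression:
#   { FIXED [Region] : AVG([Sales]) }
# The per-region average is computed once and broadcast to each row.
sales = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "sales":  [100.0, 140.0, 90.0, 110.0, 130.0],
})
sales["region_avg_sales"] = sales.groupby("region")["sales"].transform("mean")
sales["vs_region_avg"] = sales["sales"] - sales["region_avg_sales"]
print(sales)
```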

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

telangana

On-site

As a Senior Tableau Developer, you will play a crucial role in designing, developing, and implementing interactive and visually compelling dashboards and reports using Tableau. You will work closely with stakeholders to understand their data needs and deliver high-quality solutions that enable data-driven decision-making.

Key Responsibilities:

Design and Develop Dashboards: Create interactive and visually appealing dashboards and reports in Tableau that meet business requirements and provide actionable insights. You should have a thorough understanding of the Analytics pane; joins, blends, and relationships; the logical and physical layers; BI data modeling; selecting the ideal chart for a given scenario; calculated fields; data types in Tableau; dimensions and measures; discrete and continuous data; LOD expressions; groups, bins, sets, parameters, and actions; advanced conditional formatting; live versus extract connections; and the types of filters in order of hierarchy along with the elements each affects.

Data Integration: Connect to various data sources, perform data blending, and ensure data accuracy and consistency.

Optimization: Optimize Tableau workbooks for performance and scalability, ensuring efficient data retrieval and dashboard responsiveness (see the sketch after this posting).

Collaboration: Work closely with business analysts, data engineers, and other stakeholders to gather requirements and translate them into Tableau solutions.

Best Practices: Follow and promote best practices in Tableau development, including data visualization principles, data governance, and user experience.

Troubleshooting: Identify and resolve issues related to Tableau dashboards, data sources, and performance.

Documentation: Create and maintain documentation for Tableau solutions, including design specifications, user guides, and maintenance procedures.

Training and Support: Provide training and support to end-users on Tableau tools and functionalities, helping them effectively use the dashboards and reports.

Requirements:

Experience: Minimum of 4 years of experience as a Tableau Developer, with a strong portfolio of Tableau dashboards and reports.

Technical Skills: Proficiency in Tableau Desktop, Tableau Server, and Tableau Prep Builder. Strong understanding of SQL and data modeling.

Data Visualization: In-depth knowledge of data visualization best practices and principles, with the ability to create intuitive and impactful dashboards.

Analytical Skills: Strong analytical and problem-solving skills, with the ability to interpret complex data and provide actionable insights.

Communication: Excellent verbal and written communication skills, with the ability to effectively interact with stakeholders and present findings.

Education: Master's degree in Computer Science, Information Systems, Data Analytics, or a related field. An advanced degree or relevant certifications (e.g., Tableau Desktop Certified Professional) are a plus.

Soft Skills: Ability to work independently and as part of a team, manage multiple priorities, and meet deadlines.

Preferred Qualifications: Experience with other BI tools and technologies. Knowledge of data warehousing concepts and ETL processes. Familiarity with programming languages such as Python or R.
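
A common way to act on the optimization responsibility above is to pre-aggregate a large fact table before publishing it as a Tableau extract. The following hypothetical Python sketch shows the idea with pandas; the file names, columns, and aggregation grain are placeholders, not details from the posting.

```python
import pandas as pd

# Hypothetical sketch: pre-aggregating a wide fact table before it is
# published as a Tableau extract source, a common performance tactic.
# File names and columns are placeholders.
detail = pd.read_csv("fact_sales_detail.csv", parse_dates=["order_date"])

extract_ready = (
    detail.assign(order_month=detail["order_date"].dt.to_period("M").astype(str))
          .groupby(["order_month", "product_category"], as_index=False)
          .agg(units=("quantity", "sum"), revenue=("net_amount", "sum"))
)

# A smaller, pre-aggregated source keeps dashboards responsive.
extract_ready.to_csv("fact_sales_monthly.csv", index=False)
```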

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

We are seeking candidates with over 7 years of experience for the role of Senior Data Engineer & Lead specializing in Snowflake and ETL. Your responsibilities will revolve around querying and analyzing data stored in Snowflake databases to extract insights that drive business decision-making. You will develop and maintain data models and schema designs within Snowflake to facilitate efficient data analysis, collaborate with business stakeholders to understand data requirements and translate them into effective analytical solutions, and carry out data validation, quality assurance, and cleansing within Snowflake databases.

You will provide technical leadership and guidance to the data team, ensuring the adoption of best practices and the delivery of high-quality outcomes, and communicate effectively with customers to gather requirements, provide updates, and align on priorities. The role requires flexibility in managing shifting priorities, cross-functional collaboration, and adapting to evolving business needs. Leading large data teams in delivering complex data solutions across multiple workstreams, and performance tuning of Snowflake queries, ETL jobs, and data models for efficient processing and faster insights, are key aspects of this role.

Primary Skills:
- Proficiency in querying and analyzing data using Snowflake SQL (see the sketch following this posting).
- Strong understanding of data modeling and schema design within Snowflake environments.
- Experience with data visualization and reporting tools (e.g., Power BI, Tableau, Looker) for analyzing and presenting insights derived from Snowflake.
- Familiarity with ETL processes and data pipeline development.
- Proven track record of using Snowflake for complex data analysis and reporting tasks.
- Expertise in dimensional modeling.
- Strong problem-solving and analytical skills, with the ability to derive actionable insights from data.
- Experience with programming languages (e.g., Python, R) for data manipulation and analysis.

Secondary Skills:
- Excellent communication and presentation skills.
- Strong attention to detail and a proactive approach to problem-solving.
- Ability to work collaboratively in a team environment.

If you possess the necessary skills and experience and are looking for a challenging opportunity, we encourage you to apply for the Senior Data Engineer & Lead position focusing on Snowflake and ETL. The location for this role is Bangalore, Trivandrum, or Kochi, and applications close on 30-05-2025.
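
To make the first primary skill concrete, here is a minimal, hypothetical sketch of querying Snowflake from Python using the official snowflake-connector-python package; the connection parameters, table, and query are placeholders and assume nothing about the employer's environment.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical sketch: ad hoc analysis against Snowflake from Python.
# All connection parameters and object names are placeholders.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT region,
               DATE_TRUNC('month', order_date) AS order_month,
               SUM(net_amount) AS revenue
        FROM fact_orders
        GROUP BY region, order_month
        ORDER BY order_month, region
        """
    )
    for region, month, revenue in cur.fetchall():
        print(region, month, revenue)
finally:
    conn.close()
```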

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
