12.0 - 15.0 years
5 - 9 Lacs
Ahmedabad
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that meet the evolving needs of the business.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Mentor junior team members to enhance their skills and knowledge in data modeling.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality frameworks.
- Ability to work with various database technologies and data storage solutions.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Ahmedabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
12.0 - 15.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Mentor junior professionals in best practices for data engineering.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and data governance practices.
- Knowledge of cloud platforms and services related to data analytics.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Oracle Data Integrator (ODI)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Oracle Data Integrator (ODI).
- Strong understanding of data integration processes and ETL methodologies.
- Experience with database management systems and SQL.
- Familiarity with data warehousing concepts and practices.
- Ability to troubleshoot and optimize data integration workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle Data Integrator (ODI).
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Good-to-Have Skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms such as AWS or Azure.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As part of a Data Transformation programme, you will join the Data Marketplace team. In this team you will be responsible for the architecture and design of automating data management compliance validation, monitoring, and reporting through rule-based and AI-driven mechanisms; integrating with metadata repositories and governance tools for real-time policy enforcement; and delivering design specifications for real-time metadata integration, enhanced automation, audit logging, monitoring capabilities, and lifecycle management (including version control, decommissioning, and rollback). Experience with the implementation and adaptation of data management and data governance controls around Data Product implementations, preferably on AWS, is an advantage. Experience with AI is appreciated. Example skills: Data Architecture, Data Marketplace, Data Governance, Data Engineering, AWS DataZone, AWS SageMaker Unified Studio.

As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS Glue.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data warehousing concepts and best practices.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and deliver high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure best practices and quality standards are maintained.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with application development lifecycle methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, and PySpark.
- Strong understanding of data integration techniques and ETL processes.
- Experience with cloud-based application development and deployment.
- Familiarity with agile development methodologies and practices.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
12.0 - 15.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding the team in implementing effective solutions. You will also engage in strategic planning and decision-making processes, ensuring that the applications align with organizational objectives and user needs. Your role will require you to balance technical expertise with leadership skills, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate training and knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and ETL processes.
- Experience with data quality management and data profiling.
- Familiarity with database technologies and SQL.
- Ability to troubleshoot and resolve technical issues effectively.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: AWS Glue
Good-to-have skills: Python (Programming Language), Amazon Web Services (AWS), Machine Learning
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows to ensure clarity and consistency.
- Engage in code reviews and provide constructive feedback to peers to foster a culture of continuous improvement.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS Glue.
- Good-to-Have Skills: Experience with Python (Programming Language), Amazon Web Services (AWS), and Machine Learning.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based application development and deployment.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions to meet business needs.
- Conduct code reviews and ensure compliance with coding standards.
- Troubleshoot and debug applications to optimize performance.
- Stay updated on emerging technologies and trends to suggest improvements.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and maintaining data pipelines.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
5.0 - 8.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners
KPI Partners is a leading provider of business intelligence and analytics solutions, focused on helping businesses harness their data for actionable insights. We pride ourselves on a culture of innovation, collaboration, and excellence. Our team is dedicated to empowering organizations through advanced analytics, data visualization, and strategic decision-making.

Job Overview
We are seeking a talented and experienced Senior Power BI Developer to join our dynamic team. In this role, you will be responsible for transforming data into insightful and actionable business intelligence solutions using Microsoft Power BI. The ideal candidate will possess a strong analytical mindset, excellent problem-solving skills, and a passion for data visualization.

Key Responsibilities:
- Design, develop, and deploy interactive Power BI dashboards and reports that meet the needs of various stakeholders across the organization.
- Collaborate with business analysts and clients to understand reporting requirements and translate them into technical specifications.
- Optimize the performance of Power BI reports and dashboards using best practices in data modeling and visualization.
- Integrate data from multiple sources, ensuring data accuracy and consistency for reporting purposes.
- Conduct data analysis to identify trends, patterns, and insights, and provide recommendations to enhance business performance.
- Provide training and support to end users on Power BI tools and features, ensuring they maximize the benefits of the BI solutions.
- Stay updated with the latest Power BI features and industry trends to enhance the team's capabilities.
- Document processes and provide ongoing maintenance and support for Power BI applications.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in BI reporting and data analysis, with a strong focus on Power BI.
- Proficiency in DAX, Power Query, and the M language for data transformation.
- Experience in building and optimizing data models in Power BI.
- Strong understanding of data warehousing concepts and ETL processes.
- Excellent analytical and problem-solving skills with attention to detail.
- Experience with SQL and relational databases is highly preferred.
- Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment.
- Relevant certifications (e.g., Microsoft Certified: Data Analyst Associate) are a plus.

Why Join KPI Partners?
- Work with a talented and passionate team committed to excellence in the field of analytics.
- Opportunities for professional growth and skill development through training and mentorship.
- Competitive salary and benefits package.
- A dynamic and inclusive work environment that fosters collaboration and innovation.

If you are a motivated and skilled Power BI developer looking to take your career to the next level, we would love to hear from you! Please submit your resume and cover letter detailing your relevant experience and why you are a great fit for KPI Partners.
Posted 4 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Pune
Work from Office
What You'll Do
We are seeking an experienced Lead Data Engineer with a background in ETL processes, data warehousing, and data modeling, and hands-on expertise in SQL and Python. The ideal candidate will have exposure to cloud technologies and will play a key role in designing and managing scalable, high-performance data systems that support marketing and sales insights. You will report to the Manager, Data Engineering.

What Your Responsibilities Will Be
- Design, develop, and maintain efficient ETL pipelines using DBT and Airflow to move and transform data from multiple sources into a data warehouse.
- Lead the development and optimization of data models (e.g., star and snowflake schemas) and data structures to support reporting.
- Leverage cloud platforms (e.g., AWS, Azure, Google Cloud) to manage and scale data storage, processing, and transformation processes.
- Work with business teams, marketing, and sales departments to understand data requirements and translate them into actionable insights and efficient data structures.
- Use advanced SQL and Python skills to query, manipulate, and transform data for multiple use cases and reporting needs.
- Implement data quality checks and ensure that the data adheres to governance best practices, maintaining consistency and integrity across datasets.
- Use Git for version control and collaborate on data engineering projects.

What You'll Need to Be Successful
- Bachelor's degree with 6+ years of experience in Data Engineering.
- ETL/ELT expertise: experience in building and improving ETL/ELT processes.
- Data modeling: experience designing and implementing data models such as star and snowflake schemas, and working with denormalized tables to optimize reporting performance.
- Experience with cloud-based data platforms (AWS, Azure, Google Cloud).
- SQL and Python proficiency: advanced SQL skills for querying large datasets and Python for automation, data processing, and integration tasks.
- DBT experience: hands-on experience with DBT (Data Build Tool) for transforming and managing data models.

Good-to-Have Skills:
- Familiarity with AI concepts such as machine learning (ML), natural language processing (NLP), and generative AI.
- Experience working with AI-driven tools and models for data analysis, reporting, and automation.
- Ability to oversee and implement DBT models to improve the data transformation process.
- Experience in the marketing and sales domain, with lead management, marketing analytics, and sales data integration.
- Familiarity with business intelligence reporting tools, such as Power BI, for building data models and generating insights.
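The star-schema modeling this role calls for can be illustrated with a minimal sketch. All table and column names below are invented for illustration, and SQLite stands in for a real warehouse:

```python
import sqlite3

# A tiny star schema: one fact table with foreign keys into two dimension
# tables. Names and data are hypothetical, not from any real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_region  (region_id  INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product(product_id),
        region_id  INTEGER REFERENCES dim_region(region_id),
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO dim_region  VALUES (10, 'North'), (20, 'South');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# A typical reporting query: aggregate the fact table, joining out to a
# dimension for readable labels.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```

The point of the shape is that reporting queries only ever join the central fact table outward to small dimensions, which keeps them simple and fast.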
Posted 4 weeks ago
5.0 - 7.0 years
20 - 27 Lacs
Bengaluru
Work from Office
We are seeking a MicroStrategy Developer to design and develop reports, dashboards, and analytical solutions. Responsibilities include collaborating with stakeholders, data modeling, writing SQL, performance tuning, and providing technical support within the MicroStrategy environment.
Posted 4 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Description: Hiring a Data Engineer with AWS or GCP Cloud experience.

Role Summary: The Data Engineer will be responsible for designing, implementing, and maintaining the data infrastructure and pipelines necessary for AI/ML model training and deployment. They will work closely with data scientists and engineers to ensure data is clean, accessible, and efficiently processed.

Required Experience:
- 6-8 years of experience in data engineering, ideally in financial services.
- Strong proficiency in SQL, Python, and big data technologies (e.g., Hadoop, Spark).
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions.
- Familiarity with ETL processes and tools.
- Knowledge of data governance, security, and compliance best practices.

Key Responsibilities:
- Build and maintain scalable data pipelines for data collection, processing, and analysis.
- Ensure data quality and consistency for training and testing AI models.
- Collaborate with data scientists and AI engineers to provide the required data for model development.
- Optimize data storage and retrieval to support AI-driven applications.
- Implement data governance practices to ensure compliance and security.

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), a stress management program, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, and offer food at subsidized rates and corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can drink coffee or tea with your colleagues over a game, and we offer discounts for popular stores and restaurants!
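The "ensure data quality and consistency" responsibility above usually amounts to running rule-based checks before data reaches model training. A minimal sketch, with invented field names and rules:

```python
# Sketch of pipeline-side data-quality validation: split incoming records
# into clean rows and rejects, with a reason attached to each reject.
# Field names and rules are hypothetical, for illustration only.
def validate_records(records, required_fields=("id", "amount")):
    clean, rejects = [], []
    seen_ids = set()
    for rec in records:
        if any(rec.get(f) is None for f in required_fields):
            rejects.append((rec, "missing required field"))
        elif rec["id"] in seen_ids:
            rejects.append((rec, "duplicate id"))
        else:
            seen_ids.add(rec["id"])
            clean.append(rec)
    return clean, rejects

clean, rejects = validate_records([
    {"id": 1, "amount": 9.5},
    {"id": 1, "amount": 3.0},   # duplicate id -> rejected
    {"id": 2, "amount": None},  # missing value -> rejected
])
print(len(clean), len(rejects))  # 1 2
```

In a real pipeline the rejects would be routed to a quarantine table with their reasons, so data scientists can audit what was dropped rather than silently losing rows.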
Posted 1 month ago
3.0 - 7.0 years
25 - 27 Lacs
Bengaluru
Work from Office
We are seeking a skilled SAP BODS Developer to design, develop, and maintain data integration and ETL processes using SAP BusinessObjects Data Services.
Posted 1 month ago
3.0 - 6.0 years
7 - 11 Lacs
Noida, Hyderabad, Gurugram
Work from Office
Data Engineer (Full Time) - Gurgaon/Noida/Hyderabad

Role and Responsibilities
- Develops and operationalizes data pipelines to make data available for consumption (BI, advanced analytics, services).
- Works in tandem with data architects and data/BI engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT processes using Azure services such as Azure Databricks, Azure Synapse, ADLS, etc. to improve and speed up delivery of our data products and services.
- Implements solutions by developing scalable data processing platforms to drive high-value insights to the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Communicates technical concepts to non-technical audiences in both written and verbal form.
- If Lead: performs peer reviews of other data engineers' work.

Skills
- Good understanding of data integration: onboarding and integration of data from external and internal data sources through API management, sftp processes, and others using Synapse pipelines.
- Deep expertise in core data platforms: Azure, Data Lakehouse design, big data concepts using the Spark architecture.
- Strong knowledge of integration technologies: pySpark, Python, ADF, Databricks.
- Strong knowledge of conceptual, logical, and physical database modeling.
- T-SQL knowledge and experience working with relational databases, query authoring, and stored procedure development; able to debug and optimize SQL queries.
- Proven success as a technical lead and individual contributor.
- Familiarity with project management methodologies: Agile, DevOps.

Qualification
- Bachelor's degree (or equivalent) in computer science, information technology, engineering, or a related discipline.
- Experience in building or maintaining ETL processes.
- Professional certification.
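Setting the Azure-specific tooling aside, the extract-transform-load pattern this role centers on can be sketched in plain Python. The CSV source, schema, and derived column are invented for illustration; an in-memory file and SQLite stand in for a real source and warehouse:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (in-memory here; in practice
# this would be an API, sftp drop, or data-lake file).
raw = io.StringIO("order_id,qty,unit_price\n1,2,10.0\n2,1,5.5\n3,4,2.25\n")
rows = list(csv.DictReader(raw))

# Transform: cast string fields to numeric types and derive a per-order total.
transformed = [
    (int(r["order_id"]), int(r["qty"]) * float(r["unit_price"]))
    for r in rows
]

# Load: write the transformed rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE order_totals (order_id INTEGER, total REAL)")
conn.executemany("INSERT INTO order_totals VALUES (?, ?)", transformed)

total = conn.execute("SELECT SUM(total) FROM order_totals").fetchone()[0]
print(total)  # 34.5
```

Tools like ADF or Databricks orchestrate and scale these same three steps; the pattern itself does not change.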
Posted 1 month ago
6.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Diverse Lynx is looking for an ETL Test Engineer to join our dynamic team and embark on a rewarding career journey.
- Developing ETL test cases and test plans that ensure data quality, accuracy, and completeness.
- Conducting functional and non-functional testing of ETL processes to validate the integrity of the data being transferred.
- Identifying and documenting defects, issues, and potential improvements in the ETL process and sharing them with the development team.
- Creating and maintaining ETL test environments that simulate production environments for testing purposes.
- Conducting load testing to measure the scalability and performance of ETL processes under different workloads.
- Conducting regression testing to ensure that changes made to ETL processes do not introduce new defects or issues.
- Developing and maintaining test automation scripts to improve the efficiency of ETL testing.
To perform the role of an ETL Test Engineer effectively, candidates should possess strong analytical, problem-solving, and communication skills, as well as experience with ETL testing tools and technologies such as SQL, ETL testing frameworks, and test automation tools.
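The data-quality checks this role describes (completeness, accuracy, integrity) usually boil down to a handful of reconciliation assertions. A minimal sketch of such an automated check, assuming illustrative column names and a dict-based row format:

```python
def validate_load(source_rows, target_rows, key="id", not_null=("amount",)):
    """Basic ETL reconciliation: completeness, key uniqueness, and null checks."""
    errors = []
    # Completeness: every source row should have landed in the target.
    if len(source_rows) != len(target_rows):
        errors.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Integrity: the business key must stay unique after the load.
    keys = [r[key] for r in target_rows]
    if len(keys) != len(set(keys)):
        errors.append("duplicate keys in target")
    # Accuracy: mandatory columns must not have been nulled by the transform.
    for col in not_null:
        if any(r.get(col) is None for r in target_rows):
            errors.append(f"nulls found in {col}")
    return errors

src = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
ok = validate_load(src, src)                            # clean load: no errors
bad = validate_load(src, [{"id": 1, "amount": None}])   # short load with a null
```

In practice the same assertions run as SQL against source and target tables; the structure of the checks is identical.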
Posted 1 month ago
2.0 - 6.0 years
5 - 6 Lacs
Noida
Work from Office
Diverse Lynx is looking for a Power BI Developer to join our dynamic team and embark on a rewarding career journey.
- Responsible for designing, developing, and implementing business intelligence solutions using Power BI, a data visualization and reporting tool from Microsoft.
- Connecting to and integrating data from various sources, including databases, spreadsheets, and cloud services.
- Designing and creating data models, dashboards, reports, and other data visualizations.
- Enhancing existing Power BI solutions to meet evolving business requirements.
- Collaborating with stakeholders to understand their data needs and requirements.
- Building and maintaining data pipelines and ETL processes to ensure data quality and accuracy.
- Developing and implementing security and access control measures to ensure the protection of sensitive data.
- Troubleshooting and resolving issues with Power BI solutions.
- Documenting and communicating solutions to stakeholders.
- Excellent communication, analytical, and problem-solving skills.
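The data-modeling work listed above typically means conforming raw feeds into a star schema before they reach the BI tool. A minimal sketch of the dimension-key lookup step, assuming hypothetical table and column names:

```python
# Conform a raw feed into a star-schema fact row by resolving dimension keys.
dim_product = {"P-100": 1, "P-200": 2}  # business key -> surrogate key (illustrative)

def to_fact(raw_rows, dim):
    fact = []
    for r in raw_rows:
        sk = dim.get(r["product_code"])
        if sk is None:
            # Rows with no matching dimension member are dropped here;
            # a real pipeline would route them to a reject/late-arriving queue.
            continue
        fact.append({"product_sk": sk, "qty": r["qty"], "amount": r["qty"] * r["price"]})
    return fact

raw = [
    {"product_code": "P-100", "qty": 2, "price": 50.0},
    {"product_code": "P-999", "qty": 1, "price": 10.0},  # no matching dimension row
]
fact_sales = to_fact(raw, dim_product)
```

In a Power BI context this shaping is normally done in Power Query or upstream SQL; the lookup-and-reject logic is the same regardless of tool.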
Posted 1 month ago
3.0 - 7.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title: Senior ODI Developer
Company: KPI Partners
Location: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Pune, Maharashtra, India

Job Description: KPI Partners is seeking a highly skilled Senior ODI Developer to join our dynamic team. The ideal candidate will play a crucial role in the design, development, and implementation of Oracle Data Integrator (ODI) solutions that meet our clients' requirements. This position offers an exciting opportunity to work on diverse projects and contribute to our clients' success in data integration and management.

Key Responsibilities:
- Design, develop, and maintain data integration processes using Oracle Data Integrator.
- Collaborate with cross-functional teams to gather requirements and translate them into effective data integration solutions.
- Optimize ODI mappings and processes for enhanced performance and efficiency.
- Troubleshoot and resolve issues related to data extraction, transformation, and loading (ETL) processes.
- Create and maintain documentation for data integration workflows, including design specifications, technical documentation, and user guides.
- Ensure data quality and integrity throughout the integration process.
- Mentor and provide guidance to junior team members on best practices in ODI development.
- Stay updated with the latest trends and advancements in data integration technologies and ODI features.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience working as an ODI Developer with a strong background in ETL processes.
- Proficiency in Oracle Data Integrator and related tools.
- Strong SQL skills and experience with relational databases.
- Good understanding of data warehousing concepts and methodologies.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to collaborate effectively with team members and stakeholders.

Preferred Qualifications:
- Experience with Oracle databases and other ETL tools.
- Familiarity with cloud-based data integration solutions.
- Good to have: experience in OBIA/BI Apps.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional development and career advancement.
- A collaborative and innovative work environment.

If you are a motivated and experienced Senior ODI Developer looking to make a significant impact in a growing company, we encourage you to apply. Join KPI Partners and be part of our journey to deliver exceptional data integration solutions.
Posted 1 month ago
5.0 - 7.0 years
15 - 20 Lacs
Thiruvananthapuram
Work from Office
Role Proficiency: JD for SAP BODS Data Engineer
- Strong proficiency in designing, developing, and implementing robust ETL solutions using SAP BusinessObjects Data Services (BODS), with strong EDW experience.
- Strong proficiency in SAP BODS development, including job design, data flow creation, scripting, and debugging.
- Design and develop ETL processes using SAP BODS to extract, transform, and load data from various sources.
- Create and maintain data integration workflows, ensuring optimal performance and scalability.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying and manipulation.
- Familiarity with data modeling concepts and database systems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills for effective collaboration.
- Ability to work independently and manage multiple tasks simultaneously.
- 3+ years of relevant ETL development experience (SAP BODS).
Required Skills: Data Warehousing, SAP BODS, ETL, EDW
Posted 1 month ago
4.0 - 9.0 years
10 - 18 Lacs
Bengaluru
Work from Office
SUMMARY
Job Role: Snowflake Data Warehouse
Location: Bengaluru
Experience: 4+ years
Must-Have: The candidate should have a minimum of 3 years of relevant experience in Snowflake Data Warehouse.

Responsibilities
- Analyze, design, code, and test multiple components of application code across one or more clients.
- Perform maintenance, enhancements, and/or development work, contributing to the overall success of the projects.
- Perform independently and become an SME.
- Actively participate and contribute in team discussions, and provide solutions to work-related problems.
- Collaborate with team members to analyze, design, and develop software solutions.
- Write clean, maintainable, and efficient code following best practices.
- Participate in code reviews and provide constructive feedback to peers.
- Troubleshoot, debug, and resolve technical issues.
- Stay updated on emerging technologies and apply them to projects.

Professional & Technical Skills
- Proficiency in Snowflake Data Warehouse.
- Strong understanding of ETL processes and data modeling.
- Experience with cloud-based data platforms such as AWS or Azure.
- Hands-on experience with SQL and database management systems.
- Knowledge of data warehousing concepts and best practices.

Additional Information
This position is based at our Bengaluru office. 15 years of full-time education is required.
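A core Snowflake warehousing pattern behind the ETL responsibilities above is the MERGE-style upsert of staged rows into a dimension table. A minimal sketch of that pattern, using the stdlib sqlite3 as a stand-in for the warehouse (Snowflake itself would use a `MERGE INTO` statement; table and column names are illustrative):

```python
import sqlite3

# In-memory SQLite stands in for the warehouse; the upsert semantics mirror
# what Snowflake's MERGE (matched -> update, not matched -> insert) achieves.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
con.execute("INSERT INTO dim_customer VALUES (1, 'Asha', 'Pune')")

# Staged batch: one changed existing row and one brand-new row.
staged = [(1, "Asha", "Bengaluru"), (2, "Ravi", "Noida")]
con.executemany(
    """INSERT INTO dim_customer (id, name, city) VALUES (?, ?, ?)
       ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city""",
    staged,
)
rows = con.execute("SELECT id, city FROM dim_customer ORDER BY id").fetchall()
# rows -> [(1, 'Bengaluru'), (2, 'Noida')]: the update and the insert both applied.
```

Running the same batch again is a no-op in effect, which is what makes this load pattern safely re-runnable.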
Posted 1 month ago
6.0 - 11.0 years
7 - 10 Lacs
Noida, New Delhi, Delhi / NCR
Work from Office
Job Title: Senior Executive Business Analyst
Location: Noida
Experience: Minimum 6 years
Shift Timing: 8:00 AM to 5:00 PM

Key Responsibilities:
- Develop Reporting Suites: Build and maintain standardized and customized reporting frameworks aligned with business goals and client needs.
- Standardize Reporting Formats: Define consistent metrics, data sources, and visualization styles across brands.
- Customize Reports: Collaborate with stakeholders to deliver tailored reports that support decision-making.
- Establish Reporting Infrastructure: Implement tools and platforms for self-service reporting and data access.
- Ensure Data Accuracy: Develop validation and QA processes to maintain data integrity.
- Cross-functional Collaboration: Work with marketing, sales, analytics, and IT to streamline data inputs and reporting workflows.
- User Support & Training: Provide training and ongoing support to users for effective utilization of reporting tools.
- Monitor & Optimize Performance: Track usage and feedback to enhance report quality and relevance.
- Upgrade Reporting Tools: Identify and implement improvements in reporting platforms and technologies.
- Client Reporting Standards: Create standardized, visually impactful reports capturing key metrics across all brands.
- Drive Innovation: Promote new ideas, explore advanced visualization methods, and adopt emerging technologies.
- Data Analysis & Insights: Translate data into actionable insights to support strategic business decisions.
- Ensure Compliance: Align reporting practices with ISO standards and health, safety, and environmental guidelines.

Qualifications:
- Bachelor's degree in Business Administration, Data Analytics, Computer Science, or a related field.
- Minimum 6 years of experience in constructing and managing reporting suites, preferably in a multi-brand or agency environment.
- Proficiency in reporting and data visualization tools such as Tableau, Power BI, or equivalent.
- Strong understanding of data structures, data governance, and data quality management.
- Excellent analytical, problem-solving, and communication skills.
- Ability to manage multiple stakeholders and deliver accurate, timely reports in a fast-paced environment.

Interested candidates may share their updated resumes at pooja.thapa@manpowergroup.com.
Posted 1 month ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Summary: We are seeking a highly skilled and experienced Tableau Developer to join our Development department. The ideal candidate will have 6-10 years of experience in developing Tableau solutions. As a Tableau Developer, you will be responsible for designing, developing, and maintaining Tableau dashboards and reports that provide valuable insights to our organization.

Roles and Responsibilities:
- Collaborate with business stakeholders to understand their reporting and data visualization requirements.
- Design and develop Tableau dashboards and reports that effectively present complex data in a visually appealing and user-friendly manner.
- Create and maintain data models, data sources, and data connections within Tableau.
- Perform data analysis and validation to ensure the accuracy and integrity of the data used in Tableau dashboards and reports.
- Optimize Tableau performance by identifying and implementing improvements in data processing and visualization techniques.
- Troubleshoot and resolve issues related to Tableau dashboards and reports.
- Stay up to date with the latest Tableau features and functionalities and propose innovative solutions to enhance our reporting capabilities.
- Collaborate with cross-functional teams to integrate Tableau dashboards and reports with other business systems and applications.
- Provide training and support to end users on Tableau functionality and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-10 years of experience as a Tableau Developer or in a similar role.
- Experience in connecting Tableau to various databases such as SQL Server, Oracle, MySQL, and cloud-based data sources.
- Strong proficiency in Tableau Desktop and Tableau Server.
- In-depth knowledge of data visualization best practices and principles.
- Proficiency in SQL and data modeling.
- Experience with ETL processes and data warehousing concepts.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Kolkata
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your role
- Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards.
- Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models.
- Leverage Python libraries for data handling, enhancing processing efficiency and robustness.
- Ensure SQL workflows meet client performance standards and handle large data volumes effectively.
- Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability in ETL processes.
- Implement CI/CD pipelines for automated deployment and testing of data solutions.
- Optimize and tune data workflows and processes to ensure high performance and reliability.
- Monitor, troubleshoot, and optimize data processes for performance and reliability.
- Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud technology.

Your profile
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions.
- Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients.
- Good interpersonal communication skills.
- Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments.
- Proficient in SQL and PL/SQL for complex data processing and client-specific analytics.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 1 month ago
8.0 - 13.0 years
3 - 6 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
- SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
- Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database Management: Manage and maintain SQL Server and PostgreSQL databases.
- ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: Identify and resolve data-related issues and discrepancies.
- Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.

Technical Skills:
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Positive attitude and ability to work well in a team environment.

Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
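The SQL-optimization responsibility above often starts with checking whether a filtered query can use an index instead of a full table scan. A small sketch of that workflow, using stdlib sqlite3 as a stand-in for SQL Server/PostgreSQL (table, index, and column names are illustrative; each engine has its own EXPLAIN syntax):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

# Without an index, the customer_id filter is a full scan;
# the index lets the planner turn it into an indexed seek.
con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchall()
total = con.execute(
    "SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchone()[0]
# The plan's detail text should now mention idx_orders_customer.
```

The same loop of inspect plan, add or adjust an index, re-check applies in SQL Server (`SET SHOWPLAN`), PostgreSQL (`EXPLAIN ANALYZE`), and Snowflake (query profile), just with different tooling.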
Posted 1 month ago