2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a Data Analyst specializing in Power BI and Python, you will be an integral part of our dynamic data analytics team in Bangalore. With 2-4 years of experience, your role will involve analyzing complex data sets, creating interactive visualizations, and generating actionable insights to support data-driven decision-making.

Your responsibilities will include analyzing data to uncover trends and patterns, utilizing Python for data cleaning and advanced analysis, and developing and maintaining Power BI dashboards to visualize key performance indicators (KPIs) and metrics. You will collaborate with business units to understand their data requirements and deliver tailored data solutions, ensuring data accuracy and integrity through regular quality checks.

In addition to your technical skills in Power BI, Python, SQL, and database management, you will need strong analytical and problem-solving abilities. Effective communication and teamwork skills are essential as you work closely with cross-functional teams to provide data-driven solutions. Continuous improvement and staying updated on the latest trends in data analytics and visualization will be key to your success in this role.

To qualify for this position, you should have a Bachelor's degree in Data Science, Computer Science, Statistics, or a related field, along with 2-4 years of relevant experience. Certifications in data analytics are a plus. Your proven track record of working with large data sets and your ability to manage multiple tasks in a fast-paced environment will be highly valued.

If you are detail-oriented, proactive, and passionate about leveraging data to drive business outcomes, we invite you to join our team and contribute to the development of data-driven strategies that will shape our future success.
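For a flavor of the Python-based cleaning work this role describes, here is a minimal pandas sketch; the file name, column names, and KPI are hypothetical stand-ins for whatever sources and metrics the team actually uses:

```python
import pandas as pd

# Load a raw sales extract (hypothetical file and column names)
df = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Typical cleaning steps before publishing to a Power BI dataset
df = df.drop_duplicates(subset=["order_id"])          # remove duplicate orders
df["region"] = df["region"].str.strip().str.title()   # normalize text categories
df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce")
df = df.dropna(subset=["order_id", "revenue"])        # drop unusable rows

# Aggregate a KPI the dashboard might visualize
monthly_kpi = (
    df.groupby(df["order_date"].dt.to_period("M"))["revenue"]
      .sum()
      .reset_index(name="monthly_revenue")
)
monthly_kpi.to_csv("monthly_revenue_kpi.csv", index=False)  # feeds the Power BI model
```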
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team.

As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. You will lead the development of robust data models to ensure data integrity and consistency, and oversee the implementation of ETL processes to populate data marts with accurate and timely data. You will optimize data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL (Extract, Transform, Load) processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office in the evening shift, 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.
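To make the dimensional-modeling concepts above concrete, here is a minimal sketch of a star schema (one fact table joined to dimensions) using SQLite purely for illustration; the table and column names are invented, and a production Finance data mart on Databricks, Snowflake, or Oracle would be far richer:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: one fact table keyed to two dimension tables
cur.executescript("""
CREATE TABLE dim_account (
    account_key  INTEGER PRIMARY KEY,
    account_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20240131
    fiscal_month TEXT NOT NULL
);
CREATE TABLE fact_balance (
    account_key INTEGER REFERENCES dim_account(account_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    balance_amt REAL NOT NULL
);
""")

# ETL populates dimensions first, then facts
cur.execute("INSERT INTO dim_account VALUES (1, 'Cash')")
cur.execute("INSERT INTO dim_date VALUES (20240131, '2024-01')")
cur.execute("INSERT INTO fact_balance VALUES (1, 20240131, 150000.0)")

# A reporting query joins facts back to dimensions
for row in cur.execute("""
    SELECT a.account_name, d.fiscal_month, SUM(f.balance_amt)
    FROM fact_balance f
    JOIN dim_account a USING (account_key)
    JOIN dim_date d USING (date_key)
    GROUP BY a.account_name, d.fiscal_month
"""):
    print(row)
```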
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a leading consulting firm, we are currently seeking professionals in our OneStream practice to join our dynamic team. This role is ideal for an experienced professional who is eager to make a significant impact by enhancing and optimizing financial planning, forecasting, and business processes through the power of OneStream. You will play a key role in OneStream model solutioning and implementations, optimizing business planning processes, and collaborating with stakeholders to deliver effective planning solutions. This position offers hands-on experience and opportunities for professional growth in the enterprise performance management (EPM) and planning ecosystem.

Location: PAN India

Responsibilities:
- Implement OneStream solutions covering requirements and design, development, testing, training, and support.
- Assist in pre-sales meetings with prospective clients, including supporting client demos and proof-of-concept projects.
- Collaborate seamlessly with internal and client-side resources and communicate effectively across various audiences.
- Approach problems creatively and utilize technology to solve business challenges.
- Adhere to clients' delivery methodology and project standards, ensuring timely completion of project deliverables.
- Thrive in a fast-paced, dynamic environment and navigate ambiguity.
- Embrace the culture of "All Business is personal" and take full ownership of tasks by adopting an outcome-driven strategy.

Qualifications:
- Educational background: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field, or Chartered Accountant / MBA Finance.
- 3+ years of OneStream experience and a total of 5+ years of EPM implementations.
- Certified OneStream Professional.
- Proficiency in OneStream and understanding of multi-dimensional modeling; basic knowledge of Excel, data integration tools, or ETL processes is a plus.
- Good understanding of financial and accounting processes (account reconciliations, intercompany eliminations, currency translation, allocations, and top-side adjustments), including proficient experience with financial close, consolidations, financial reporting, and FP&A.
- Experience with data integration between different systems/sources; REST API experience is an added advantage.

Preferred Skills:
- Strong client-facing skills; organized and detail-oriented.
- Excellent communication and interpersonal skills.
- Proven ability to work in a demanding, fast-paced environment and manage a high workload.
- Experience with data visualization tools such as Tableau or Power BI; familiarity with Oracle tools is a plus.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help the retail business make data-driven decisions. You will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights.

Key Responsibilities:
- Leverage retail knowledge: Utilize your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs.
- Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
- Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models.
- Use AI-driven techniques for personalization, demand forecasting, and fraud detection.
- Utilize advanced statistical methods to optimize existing use cases and build new products to serve new challenges and use cases.
- Stay updated on the latest trends in data science and retail technology.
- Collaborate with executives, product managers, and marketing teams to translate insights into business actions.

Professional & Technical Skills:
- Strong analytical and statistical skills.
- Expertise in machine learning and AI.
- Experience with retail-specific datasets and KPIs.
- Proficiency in data visualization and reporting tools.
- Ability to work with large datasets and complex data structures.
- Strong communication skills to interact with both technical and non-technical stakeholders.
- A solid understanding of the retail business and consumer behavior.
- Programming languages: Python, R, SQL, Scala
- Data analysis tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras
- Visualization tools: Tableau, Power BI, Matplotlib, Seaborn
- Big data technologies: Hadoop, Spark, AWS, Google Cloud
- Databases: SQL, NoSQL (MongoDB, Cassandra)

Additional Information:
- Job Title: Retail Specialized Data Scientist
- Management Level: 09 - Consultant
- Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata
- Company: Accenture

This position requires a solid understanding of retail industry dynamics, strong communication skills, proficiency in Python for data manipulation, statistical analysis, and machine learning, as well as familiarity with big data processing platforms and ETL processes. The Retail Specialized Data Scientist will be responsible for gathering, cleaning, and analyzing data to provide valuable insights for business decision-making and optimization of pricing strategies based on market demand and customer behavior.
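As one illustration of the clustering work mentioned above, here is a small scikit-learn sketch that segments customers on simulated RFM-style features; the data is synthetic and the feature choices are assumptions, not a prescribed approach:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical RFM-style features per customer: recency (days since
# last order), frequency (order count), monetary (total spend)
rng = np.random.default_rng(42)
X = np.column_stack([
    rng.integers(1, 365, 500),    # recency
    rng.poisson(8, 500),          # frequency
    rng.gamma(2.0, 150.0, 500),   # monetary
]).astype(float)

# Scale features so no single unit dominates the distance metric
X_scaled = StandardScaler().fit_transform(X)

# Cluster customers into segments for targeted marketing
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X_scaled)
print("segment sizes:", np.bincount(kmeans.labels_))
```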
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Punjab
On-site
We are looking for an experienced Data Analytics Engineer specializing in Snowflake with a minimum of 4 years of experience. As an integral member of our data engineering team, you will be responsible for the design, development, and maintenance of data pipelines, working with large datasets in Snowflake. The ideal candidate will also possess skills in Power BI for data visualization and reporting. You will assist in building, optimizing, and scaling data workflows, contributing to the improvement of data architecture across the organization.

Key Responsibilities:
- Design & develop data warehouses: Architect and maintain Snowflake data warehouses, ensuring optimal performance, scalability, and security.
- ETL development: Build, implement, and manage ETL pipelines to extract, transform, and load data from multiple sources into Snowflake environments.
- Schema design: Create and maintain Snowflake schemas, tables, views, and materialized views to support a wide range of business use cases.
- Optimization & tuning: Perform query optimization, data partitioning, and other techniques to enhance performance and reduce costs.
- Collaboration: Work closely with data engineers, data scientists, and business analysts to understand data needs, share insights, and improve data flows.
- Data integration & automation: Automate data integration workflows, repetitive tasks, and report generation, improving operational efficiency.
- Data governance & security: Ensure data integrity, security, and compliance with established data governance policies and industry best practices.
- Troubleshooting & issue resolution: Identify and resolve performance issues or anomalies within the Snowflake environment, ensuring smooth operations.
- Platform upgrades & migration: Assist in upgrading Snowflake platforms and data migration efforts, maintaining minimal disruptions.

Required Skills:
- 4+ years of experience with Snowflake data warehouse development and management.
- Proficiency in SQL and SnowSQL, with a deep understanding of ETL processes and data integration techniques.
- Strong expertise in data modeling, schema design, and performance optimization within the Snowflake platform.
- Experience working with cloud-based platforms such as AWS, Azure, or GCP.
- Familiarity with data pipeline orchestration tools like dbt, Airflow, or similar.
- Strong problem-solving skills and ability to optimize complex queries and workflows.
- Solid understanding of data governance, security protocols, and best practices in a cloud-based environment.

Preferred Skills:
- Certification in Snowflake or other relevant cloud technologies.
- Familiarity with Power BI for data visualization, reporting, and dashboard creation.
- Knowledge of additional programming languages like Python or Java for enhancing data pipelines.
- Experience working with data visualization platforms such as Tableau, Looker, or others.
- Exposure to data lakes and advanced data integration techniques.

If you're a passionate ETL developer looking for the next step in your career, join us and make an impact on the data landscape of our organization!
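As a rough sketch of the Snowflake load-and-merge pattern this role centers on, the following uses the Snowflake Python connector; the account details, stage, and table names are placeholders, and a real pipeline would add error handling and orchestration (e.g., via Airflow or dbt):

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection details are placeholders; in practice they come from a secrets store
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Load a staged file into a staging table, then merge into the target
cur.execute(
    "COPY INTO staging.orders FROM @raw_stage/orders/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)
cur.execute("""
    MERGE INTO core.orders t
    USING staging.orders s ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
""")
conn.close()
```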
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Elait is a digital data management and cloud solutions provider based in Bengaluru, offering innovative solutions across various industry verticals using leading technologies such as Ab Initio, Microsoft, and Snowflake. Our consultants are experts in data governance, architecture, high-volume data processing, data integration, and more. We have a rich catalog of accelerators and frameworks that guide our customers in data engineering, enabling them to work in agile environments with quicker delivery timelines and cost reductions.

As a Metadata Model Developer at Elait, your responsibilities will include designing, developing, and implementing metadata models within the Ab Initio MD Hub environment. You will configure and customize MD Hub components, build custom extractors for data integration, implement efficient data processing solutions using Ab Initio tools, and ensure data quality and accuracy within the metadata repository. Additionally, you will support data governance initiatives through the implementation of data quality rules, data lineage tracking, and data classification systems, while also troubleshooting and resolving metadata-related issues.

The ideal candidate for this role should have 3-6 years of hands-on experience with Ab Initio tools such as GDE (Graphical Development Environment), Express>It, MDH (Metadata Hub), and PDL (Parallel Data Language). Proficiency in MD Hub features and customization, experience in building custom data extractors, a strong understanding of data modeling and ETL processes, knowledge of data governance frameworks, and familiarity with DPDP compliance and GDPR are preferred. A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is required for this position.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Product Manager at Blackhawk Network, you will be responsible for building, developing, and managing the product roadmap for Core data services and features. Your role will involve leading Blackhawk towards data democratization and self-service capabilities. You will collaborate with internal experts and clients to produce intelligent products, create holistic solutions, and partner with technical program managers and engineers to deliver on the product roadmap. From concept to launch, you will own the process, ensuring proper prioritization of product features in collaboration with BHN business teams. You will write user stories, create features, and maintain a healthy backlog of items to be worked on. By leveraging a deep understanding of the competitive landscape, you will identify distinctive advantages and apply them in our solutions.

Your responsibilities will also include writing detailed feature specifications based on product requirements, translating complex problems into generic product designs, and defining Core Data capabilities at BHN with scalable features supporting enterprise reporting, dashboards, and analytics. Working with development teams, you will ensure that development stays on track, meets requirements, and release artifacts are prepared on time for product release deadlines.

To qualify for this role, you should have a BA/BS in Computer Science or a related technical field, or equivalent practical experience. You should have at least 3 years of Enterprise Data Management and Product Management experience, including managing end-to-end product launches, go-to-market strategies, and competitive landscape analysis. Strong knowledge and significant experience with enterprise BI systems, relational databases, data warehouses, analytics, real-time data processing, big data platforms, ETL processes, SQL, BI and visualization tools, system integrations, APIs, and agile methodology are essential.

Additionally, you should possess excellent problem-solving skills, attention to detail, written and oral communication skills, organizational and analytical skills, and technical abilities. A positive attitude, a collaborative approach, adaptability in a high-growth environment, and experience executing Data Governance best practices are highly valued in this role.

If you are a team player who thrives in a rapidly changing environment, can manage and influence others in a matrixed setting, and have a track record of success in product management, this opportunity at Blackhawk Network might be the perfect fit for you.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
We are looking for an accomplished Lead Data Engineer with 7 to 10 years of experience in data engineering to join our dynamic team in either Ahmedabad or Pune. Your expertise in Databricks will play a crucial role in enhancing our data engineering capabilities and working with advanced technologies, including Generative AI.

Your key responsibilities will include leading the design, development, and optimization of data solutions using Databricks to ensure scalability, efficiency, and security. You will collaborate with cross-functional teams to gather and analyze data requirements, translating them into robust data architectures and solutions. Developing and maintaining ETL pipelines, leveraging Databricks, and integrating with Azure Data Factory when necessary will also be part of your role. Furthermore, you will implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation. Ensuring that data quality, governance, and security practices are adhered to will be essential to maintain the integrity and reliability of data solutions. Providing technical leadership and mentorship to junior engineers to foster an environment of learning and growth will also be a key aspect of your role. It is crucial to stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven expertise in building and optimizing data solutions using Databricks, integrating with Azure Data Factory/AWS Glue, and proficiency in SQL and programming languages like Python or Scala are essential. A strong understanding of data modeling, ETL processes, Data Warehousing/Data Lakehouse concepts, cloud platforms (particularly Azure), and containerization technologies such as Docker is required. Excellent analytical, problem-solving, and communication skills are a must, along with demonstrated leadership ability and experience mentoring junior team members.

Preferred qualifications include experience with Generative AI technologies and applications, familiarity with other cloud platforms like AWS or GCP, and knowledge of data governance frameworks and tools.

In return, we offer flexible timings, a 5-day working week, a healthy environment, celebrations, opportunities to learn and grow, a sense of community, and medical insurance benefits. Join us and be part of a team that values innovation, collaboration, and professional development.
Posted 1 month ago
5.0 - 15.0 years
0 Lacs
Karnataka
On-site
As the Head of Delivery Management in our organization, you will play a crucial role in leading our delivery operations with a focus on Data Engineering and Data Analytics. Your primary responsibility will be to oversee the end-to-end execution of projects related to data pipelines, analytics platforms, and data-driven solutions. Your expertise in managing projects, optimizing delivery processes, and fostering continuous improvement will be essential in working collaboratively with cross-functional teams comprising data scientists, analysts, and engineers.

Your key responsibilities will include leading and overseeing delivery teams, developing strategies for data-centric project delivery, ensuring successful delivery of data solutions, monitoring delivery performance, and collaborating with teams to address challenges in data architecture, integration, and scalability. You will be required to drive continuous improvement in processes, methodologies, and tools tailored to data projects, maintain strong client and stakeholder relationships, and ensure adherence to best practices in data security, privacy, and compliance. Effective resource management and fostering a culture of innovation, collaboration, and accountability within the delivery team will also be important aspects of your role.

To be successful in this position, you should have a minimum of 15 years of experience in delivery management, with at least 5 years specifically in Data Engineering or Data Analytics domains. Your proven track record in delivering large-scale data projects involving ETL processes, cloud platforms, or data warehouses, along with a strong understanding of data architecture, big data technologies, and analytics frameworks, will be highly valuable. Exceptional leadership and team management skills, excellent project management abilities with exposure to agile methodologies, and familiarity with tools like Tableau, Power BI, Snowflake, Hadoop, or similar platforms are essential requirements.

Moreover, your strong analytical and problem-solving skills, experience with financial planning and resource management in data projects, deep understanding of industry trends in data and analytics, and proven ability to drive stakeholder alignment and ensure delivery excellence will set you up for success in this role. If you are passionate about leading teams and delivering excellence in data-driven initiatives, we welcome you to bring your expertise to our team and contribute to our mission of driving innovation and success in the data engineering and analytics space.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As an Analytics Manager specializing in Power BI, Python, Tableau, and SQL within the insurance domain, you will be responsible for designing, developing, and implementing Power BI dashboards. Your expertise in data warehousing, ETL processes, and data governance will be crucial for this role.

Your key responsibilities will include leading, mentoring, and developing a team of data analysts and data scientists. You will provide strategic direction to ensure the timely delivery of analytical projects. Additionally, you will define and implement the company's data analytics strategy, collaborate with stakeholders to understand data needs, conduct complex data analysis, and translate findings into strategic recommendations. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights accessible to technical and non-technical stakeholders. Ensuring data integrity across systems, enforcing data governance policies, and collaborating with cross-functional teams will also be part of your role.

To qualify for this position, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field. With 10+ years of experience in data analysis, including at least 2 years in a managerial role, you should be proficient in SQL, Python, R, Tableau, and Power BI. Strong knowledge of data modeling, ETL processes, and database management is essential, along with exceptional problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders. Experience in managing and growing data professional teams, strong project management skills, and domain knowledge in insurance are desirable. Staying updated with the latest data analytics trends and technologies and leading data-driven projects from initiation to execution will be key aspects of this role.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. Your unique voice and perspective are valued to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.

As a Data Systems Project Manager within The Mercury Support Team (MST), you will be responsible for overseeing the management and delivery of multiple data-oriented projects and enhancements within the organization. Your role will involve planning, executing, monitoring, and controlling data projects to ensure they are completed on time, within budget, and to the specified scope. The ideal candidate will have extensive experience in data management, strong project management skills, and the ability to collaborate effectively with cross-functional teams. You will work closely with the vendor delivery team to manage day-to-day work, including the delivery of analysis, design, build, test, and deploy. Additionally, you will be responsible for tailoring existing methodologies for smaller projects and working with the Core Business Services IT PMO. Your responsibilities will also include managing and tracking development end-to-end, reporting progress, risks, issues, and performance metrics periodically.

Key Responsibilities:
- Responsible for project management, delivery, and governance activities by the MST Delivery team
- Plan and execute Reporting & Analytics support requirements within the agreed schedule, budget, and scope
- Follow the application development lifecycle process and relevant project management processes
- Monitor and control work to ensure the project/development remains on track and in control
- Collaborate with cross-functional teams and teams outside MST to ensure alignment to overall business and technology strategies
- Manage external contracts and suppliers where required
- Develop objectives, phasing, and content of the project/work stream to deliver business case benefits

Experience and Skills Requirements:
- Bachelor's degree in Computer Science, Information Technology, Data Management, or a related field
- 8-10 years of IT project management experience, including 2-4 years in data systems or development management roles
- Proven track record in managing large, complex data projects
- Strong knowledge of data management tools and technologies
- Excellent project management skills, with a track record of delivering projects on time and within budget
- Strong analytical and problem-solving skills
- Excellent verbal and written communication skills in English
- Ability to work across global time zones and manage virtual, cross-cultural teams

Certification Requirements:
- Project Management Practitioner (PMP or PRINCE2) certification desired, or equivalent experience

Supervision Responsibilities: You should meet regularly with the process manager and process SMEs to maintain alignment of purpose and direction. This role requires the ability to think quickly and make sound decisions without constant supervision.

Other Requirements: Flexibility and ability to work virtually across global time zones.

Education: 3-4 year college degree in a related technology field or comparable job experience.

Join EY to contribute to building a better working world, creating long-term value for clients, people, and society, and building trust in the capital markets. Work with diverse teams in over 150 countries to provide trust through assurance and help clients grow, transform, and operate effectively.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
We are looking for a highly skilled Senior Developer with 6 to 9 years of experience to join our team. The ideal candidate should have extensive expertise in Azure Data Factory and will work in a hybrid model during day shifts. This position does not involve any travel. Your main responsibility will be to contribute to the development and optimization of data integration solutions, ensuring seamless data flow and transformation.

Your primary responsibilities will include developing and implementing data integration solutions using Azure Data Factory, overseeing the design, build, and deployment of data pipelines, and ensuring data accuracy and integrity throughout the ETL process. You will collaborate with cross-functional teams to understand data requirements, optimize data workflows for performance and scalability, and monitor and troubleshoot data pipeline issues. Providing technical guidance and support to junior developers, maintaining documentation for data processes and solutions, and implementing best practices for data security and compliance will also be part of your role. You will conduct code reviews to ensure quality and adherence to standards, participate in project planning and estimation activities, and continuously improve processes and solutions based on feedback. Staying updated with the latest trends and technologies in data integration is essential for this role.

To qualify for this position, you should have a strong understanding of Azure Data Factory and its components, proficiency in ETL processes and data pipeline development, experience with data modeling and database design, and the ability to troubleshoot and resolve data-related issues. Strong analytical and problem-solving skills, excellent communication and collaboration abilities, familiarity with data security and compliance best practices, and experience with other Azure services will be beneficial. Knowledge of programming languages such as Python or SQL is desirable. You should be able to work in a hybrid model and adapt to changing requirements, possess strong attention to detail and a commitment to quality, and have a proven track record of delivering successful data integration projects. A Bachelor's degree in Computer Science, Information Technology, or a related field is required.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have strong experience in Python programming and be familiar with Databricks and SQL databases. Your responsibilities will include database performance tuning and optimization, working with the Databricks platform for big data processing and analytics, developing and maintaining ETL processes using Databricks notebooks, and implementing data pipelines for transformation and integration. Additionally, you will design, develop, test, and deploy high-performance and scalable data solutions using Python.
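A minimal illustration of an extract-transform-load pass in a Databricks notebook might look like the following PySpark sketch; the paths, column names, and target table are hypothetical, and `spark` is assumed to be the session Databricks provides:

```python
# In a Databricks notebook, `spark` is provided; names below are hypothetical
from pyspark.sql import functions as F

# Extract: read raw events from a landing zone
raw = spark.read.json("/mnt/landing/events/")

# Transform: clean and reshape for analytics
clean = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write to a Delta table consumed downstream
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.events_clean")
```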
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Tableau Developer, you will be responsible for creating and managing dashboards, reports, and visualizations using Tableau Desktop and Tableau Server. With a total experience of 4-6 years and at least 3 years of relevant experience, you should have strong proficiency in publishing and scheduling reports on Tableau Server. Additionally, your expertise in SQL and data manipulation will be crucial, as you will be required to write complex SQL queries, including joins, subqueries, and window functions, and handle data extraction, transformation, and loading (ETL) processes.

Hands-on experience with Snowflake for data warehousing and analytics is essential for this role. You should be able to integrate Snowflake with other data sources and tools, have familiarity with data warehousing concepts like star and snowflake schemas, and have experience in designing and implementing ETL workflows.

Your analytical and problem-solving skills will be put to the test as you interpret complex data sets and derive actionable insights, troubleshoot data-related issues, and communicate effectively with non-technical stakeholders. It would be beneficial if you have basic knowledge of Python for data analysis and automation, an understanding of statistical analysis and data modeling techniques, and experience with programming languages like Python or R for data analysis and visualization.

Your responsibilities will include developing impactful visualizations using Tableau, utilizing advanced SQL techniques, working with different databases for data integration, writing custom SQL queries, troubleshooting and debugging data-related issues, translating business requirements into workable data models and visualizations, implementing best practices for Tableau report development, performing ad-hoc analysis, providing training and support to end-users, and staying updated with the latest Tableau features and data visualization trends.

If you are looking to excel in a role that combines technical expertise with communication skills, problem-solving abilities, and a continuous learning mindset, this Tableau Developer position offers a challenging yet rewarding opportunity for you to make a significant impact within the organization.
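To illustrate the kind of window-function SQL this role calls for, here is a self-contained example using Python's built-in sqlite3 module (window functions require SQLite 3.25+); in practice the same query shape would run against Snowflake behind a Tableau data source:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite 3.25+ supports window functions
cur = conn.cursor()
cur.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
INSERT INTO sales VALUES
    ('North','2024-01',100), ('North','2024-02',120),
    ('South','2024-01', 90), ('South','2024-02',140);
""")

# Running total per region: the kind of query that backs a trend view
query = """
SELECT region, month, revenue,
       SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM sales
ORDER BY region, month
"""
for row in cur.execute(query):
    print(row)
```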
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Kochi, Kerala
On-site
As a Business Intelligence Developer, your primary responsibility will be designing and implementing SSAS Tabular Models to meet the reporting and analytics requirements of the organization. You will be tasked with developing complex DAX queries aimed at enhancing data models and optimizing performance. Collaboration with stakeholders to gather requirements and convert them into technical specifications will be a crucial aspect of your role.

Ensuring data integrity and accuracy through meticulous testing and validation processes will be essential to maintain the quality of BI solutions. Additionally, providing continuous support and maintenance for existing BI solutions will be part of your ongoing duties. You will be expected to stay abreast of the latest trends and technologies in the field of Business Intelligence to consistently enhance and evolve the solutions provided.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related discipline. A minimum of 3-8 years of experience in business intelligence development is required. Proficiency in SSAS Tabular Models and DAX is essential for this position. A strong grasp of data warehousing concepts and ETL processes, along with experience with BI tools and reporting solutions, will be advantageous in successfully fulfilling the responsibilities of this role.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have a minimum of 8+ years of hands-on experience in designing, developing, and delivering advanced analytics solutions using Power BI. Your role will involve deep technical expertise in data modeling, ETL architecture, and enterprise-grade reporting frameworks. You must possess proven expertise in data warehousing concepts and analytics architecture, with experience in Oracle ADW, Snowflake, and Microsoft Azure.

In addition, you should have strong proficiency in Power BI semantic data modeling, DAX, and visual storytelling, along with experience in working with large-scale datasets, cloud-based multi-source environments, and hybrid architectures. Advanced knowledge of Python for data manipulation and workflow automation is required, and proficiency in PL/SQL is preferred. You should also have a solid understanding of ETL processes, with the ability to produce high-quality specification documents using Oracle ODI, Informatica (IICS), or SSIS. Experience in Power BI report and dashboard building is essential, along with the ability to orchestrate and mentor Power BI developers effectively.

Your responsibilities will include translating complex business requirements into scalable technical solutions using Power BI and related technologies, architecting and implementing semantic data models, leading the development of interactive dashboards and executive-level visualizations, designing and documenting ETL specifications, collaborating with ETL developers, overseeing Power BI development teams, integrating and modeling data from multiple API sources using Python, optimizing performance across Power BI reports and dataflows using advanced DAX expressions, and ensuring alignment between analytics architecture and enterprise data governance standards.
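As a sketch of the Python-based API integration mentioned above, the following pulls a REST endpoint and flattens it for a Power BI model; the URL, payload shape, and columns are hypothetical, and real sources would need authentication and paging:

```python
import requests
import pandas as pd

# Hypothetical REST endpoint; real sources need auth headers and paging
resp = requests.get("https://api.example.com/v1/sales", timeout=30)
resp.raise_for_status()

# Normalize nested JSON into a flat table Power BI can model
df = pd.json_normalize(resp.json()["results"])
df["order_date"] = pd.to_datetime(df["order_date"])

# Persist for ingestion by a Power BI dataflow or semantic model
# (to_parquet requires the pyarrow package)
df.to_parquet("sales_api_extract.parquet", index=False)
```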
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
We're Hiring: Oracle Data Integrator Developer - Night Shift!

We are seeking an experienced Oracle Data Integrator Developer to join our dynamic team for night shift operations. The ideal candidate will have extensive expertise in ODI development, ETL processes, and data integration solutions to support our critical business operations during off-peak hours.

Job Position: Oracle Data Integrator Developer
Experience Required: Minimum 8+ Years
Location: Remote / Work From Home
Job Type: Contract to hire (3 months / renewable)
Shift: Night Shift (9 PM to 6 AM)
Notice Period: Immediate to 1 week max
Mode of Interview: Virtual

Requirements:
- 7+ years of experience building functions and stored procedures using the ODI tool.
- Design data warehouse tables and applicable documentation.
- Develop table creation scripts and execute the scripts for the Database Administrator (DBA).
- Analyze data sources and determine data relationships to pair with the business requirements documentation (BRD).
- Familiarity with data quality issues and Extract, Transform, Load (ETL) processes.
- Design and develop Extract, Load, Transform (ELT) processes and load data from source tables into staging tables and then into target fact and dimension tables utilizing Oracle Data Integrator (ODI).
- Validate data loaded into various tables utilizing Structured Query Language (SQL) and other analysis.
- Knowledge and understanding of the applicable tools and practices used in data governance, including query and data profiling techniques.
- Proven skills in data analysis, including queries to calculate data quality.
- Solid written, verbal, and analytical skills.
- Strong research, problem determination, and solution skills.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Vertelo is an Integrated Fleet Electrification Solution Provider backed by Macquarie and the Green Climate Fund. We aim to accelerate the transition of fleets to electric vehicles and build a robust EV ecosystem by offering bespoke solutions to customers, including leasing and financing, charging infrastructure and energy solutions, fleet management services, and end-of-vehicle-life management. Vertelo has received anchor investment from the Green Climate Fund, which has committed to invest up to US$200 million. Overall, Vertelo plans to invest US$1.5 billion over 10 years with the aim to achieve a potential greenhouse gas emissions reduction of 9.5 MtCO2e. Read more on vertelo.in, https://macq.co/6005bjjdJ

At Vertelo, we are now building up our IT organization and digital infrastructure to support the business growth plans and manage risk. To strengthen our team, we are seeking a versatile and talented Senior AWS Cloud and Data Engineer with a degree in Computer Science or Information Technology and 4-6 years of hands-on experience in cloud computing and data engineering roles.

About the Role:
This role will be responsible for managing our AWS environment, maintaining our data lake infrastructure, and supporting our fleet management and analytics platform. The ideal candidate will have a strong background in both AWS cloud services and data engineering, with the ability to bridge the gap between infrastructure management and data solutions.

Key Responsibilities:

AWS Environment Management:
- Manage and maintain AWS accounts, including creation of new accounts as needed
- Monitor and optimize AWS costs using Cost Explorer and other tools
- Implement and maintain AWS best practices for security, compliance, and performance
- Troubleshoot and resolve AWS-related issues

Data Lake and Pipeline Management:
- Maintain and optimize data pipelines for ingesting fleet telematics and charging data
- Ensure data quality, consistency, and availability in the data lake
- Implement data governance policies and procedures
- Optimize data storage and retrieval processes

Analytics and Visualization Support:
- Make minor modifications to Amazon QuickSight dashboards as per business requirements
- Support data analysts in creating new visualizations and reports
- Troubleshoot issues related to data access and visualization

Customer Onboarding and Support:
- Assist in onboarding new customers to the fleet management solution
- Configure customer-specific data integrations and dashboards
- Provide technical support for customer inquiries related to data and analytics

Continuous Improvement:
- Stay updated with the latest AWS services and data engineering best practices
- Propose and implement improvements to the existing architecture and processes
- Collaborate with the development team to integrate new features and capabilities

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 4-6 years of experience in cloud computing and data engineering roles
- Strong proficiency in AWS services, especially in areas of account management, networking, security, and data services
- Experience with AWS Landing Zone or similar multi-account AWS architectures
- Solid understanding of data lake architectures and related AWS services (e.g., S3, Glue, Athena)
- Proficiency in at least one programming language (preferably Python) for scripting and data manipulation
- Experience with data pipeline tools and ETL processes
- Familiarity with data visualization tools, particularly Amazon QuickSight
- Strong problem-solving skills and ability to work independently
- Excellent communication skills, both written and verbal

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics)
- Experience in the automotive or fleet management industry
- Knowledge of data modeling and database design principles
- Experience with CI/CD pipelines and infrastructure as code (e.g., CloudFormation, Terraform)

If you are a self-starter with a passion for cloud technologies and data engineering, capable of managing complex systems and adapting to evolving business needs in a dynamic environment, please send your updated CV to [HIDDEN TEXT]
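For illustration, querying fleet data in the AWS data lake through Athena might look like this boto3 sketch; the database, table, and S3 output location are hypothetical, and credentials are assumed to be configured in the environment:

```python
import boto3  # assumes AWS credentials are configured in the environment

athena = boto3.client("athena", region_name="ap-south-1")

# Query telematics data in the data lake via Athena; names are hypothetical
resp = athena.start_query_execution(
    QueryString=(
        "SELECT vehicle_id, AVG(state_of_charge) AS avg_soc "
        "FROM fleet.charging_events GROUP BY vehicle_id"
    ),
    QueryExecutionContext={"Database": "fleet"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("query id:", resp["QueryExecutionId"])  # poll this id for results
```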
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
The BI Data Engineer is a key role within the Enterprise Data team. We are looking for an expert Azure data engineer with deep data engineering, ADF integration, and database development experience. This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cutting-edge BI tools, such as cloud-based databases, self-service analytics, and leading visualisation tools, enabling the company's aim to become a fully digital organisation.

Job Description:

Key Responsibilities:
- Build enterprise data engineering and integration solutions using the latest Azure platform: Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Develop enterprise ETL and integration routines using ADF
- Evaluate emerging data engineering technologies, standards, and capabilities
- Partner with business stakeholders, product managers, and data scientists to understand business objectives and translate them into technical solutions
- Work with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure smooth deployment of data engineering solutions

Required Skills and Experience:

Technical Expertise:
- Expertise in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Exposure to Databricks and lakehouse architectures and technologies
- Extensive knowledge of data modeling, ETL processes, and data warehouse design principles
- Experience with machine learning and AI services in Azure

Professional Experience:
- 5+ years of experience in database development using SQL
- 5+ years of integration and data engineering experience
- 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse
- 2+ years of experience using Power BI
- Comprehensive understanding of data modelling
- Relevant certifications in data engineering, machine learning, or AI

Key Competencies:
- Expertise in data engineering and database development
- Familiarity with the Microsoft Fabric technologies, including OneLake, Lakehouse, and Data Factory
- Strong understanding of data governance, compliance, and security frameworks
- Proven ability to drive innovation in data strategy and cloud solutions
- A deep understanding of business intelligence workflows and the ability to align technical solutions
- Strong database design skills, including an understanding of both normalised-form and dimensional-form databases
- In-depth knowledge and experience of data-warehousing strategies and techniques, e.g., Kimball data warehousing
- Experience in cloud-based data integration tools like Azure Data Factory
- Experience in Azure DevOps or JIRA is a plus
- Experience working with finance data is highly desirable
- Familiarity with agile development techniques and objectives

Location: Pune
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description:

Responsibilities:
- Requirements management | Responsible for requirements management and the rough conception of the solutions assigned to them. In this context, also responsible for the evaluation of the content and its functional requirements.
- Implementation responsibility for the application solutions | Responsible for the implementation and fulfillment of requirements of their solutions. Coordination of the corresponding work packages and coordination with the relevant professional and technical roles as well as the requesters. No responsibility for the content of the defined requirements.
- Application of architecture guidelines | Application of the adopted architecture guidelines for the overall architecture and the data lake.
- Support sharing a solution | Submits the completed solution to the Business Architect / board for approval and takes care of any adjustments and improvements.
- Participation in data governance | Consults the relevant data governance specialists when building the application/solution and ensures the implementation of the relevant data security and protection aspects as well as compliance with legal requirements.
- Procurement & commissioning of the infrastructure | Responsible for procuring the relevant infrastructure around the data lake. Also responsible for setting up and commissioning the necessary hardware.
- Support | Specialist for their solution and thus the central point of contact and 2nd-level support for power users of the solution.

Qualifications:
- 8+ years of experience in application development using the following technologies: HTML, CSS, JavaScript, jQuery, Bootstrap; SQL, stored procedures, triggers, functions, views; PowerShell or C# for background process development; the Azure cloud platform; Purview/Collibra or a similar data catalog/governance application
- Experience with low-code/no-code platforms (like MS Power Apps), MS Power Automate, Visual Studio, Git, and Azure DevOps is preferred
- Proficient in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, Azure Databricks, and Power BI
- Good knowledge of data modeling, ETL processes, and data warehouse design principles
- Very good technical know-how, good technology knowledge, and data architecture know-how
- Basic knowledge of relevant data security and protection as well as legal requirements
- Proficient in project management; use of own reports

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
At Capgemini Engineering, the global leader in engineering services, a team of engineers, scientists, and architects collaborates to support the world's most innovative companies in realizing their full potential. From cutting-edge technologies such as autonomous vehicles to life-saving robotics, our digital and software technology experts showcase creativity by offering unique R&D and engineering services across diverse industries. Join us for a rewarding career filled with endless opportunities, where your contributions can truly make a difference and each day brings fresh challenges.

In this role, you will design, develop, and maintain ETL processes utilizing Pentaho Data Integration (Kettle) to extract data from various sources like databases, flat files, APIs, and cloud platforms. You will transform and cleanse data to align with business and technical requirements before loading it into data warehouses, data lakes, or other designated systems, and you will monitor and optimize ETL performance while troubleshooting any arising issues. You will collaborate closely with data architects, analysts, and business stakeholders to comprehend data requirements, and uphold data quality, integrity, and security throughout the ETL lifecycle while documenting ETL processes, data flows, and technical specifications.

For the Industrial Operations Engineering focus, you will develop expertise in the designated area, share knowledge and provide guidance to peers, and interpret clients' needs effectively. You will execute assigned tasks independently or with minimal supervision, identify and resolve problems in straightforward scenarios, contribute actively to teamwork, and engage with customers to deliver value.

Capgemini is a prominent global partner in business and technology transformation, aiding organizations in accelerating their journey towards a digital and sustainable future. With a diverse and responsible workforce of 340,000 members across 50+ countries, Capgemini leverages its 55+ years of experience to help clients harness technology's value comprehensively. Offering end-to-end services and solutions spanning from strategy and design to engineering, Capgemini excels in AI, generative AI, cloud, and data capabilities, supported by deep industry knowledge and a robust partner ecosystem.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
The ideal candidate for the position should have a Bachelor's degree in Computer Science, Information Technology, or a related field and possess a minimum of 2 years of experience in Microsoft Dynamics Navision ERP & BI development and data analysis. It is essential for the candidate to demonstrate strong proficiency in Power BI, including creating dashboards, reports, and data models, along with experience in SQL Server and the MS SQL Server BI Stack tools. The candidate should also have proven ability in analyzing data and extracting meaningful insights, as well as excellent communication and interpersonal skills.

Additionally, the preferred qualifications for the role include experience with other ERP, business process, and Power BI tools, data warehousing and ETL processes, and cloud-based BI solutions such as Azure Power BI or AWS QuickSight. Certification in Microsoft Dynamics Navision ERP would be a plus.

The key responsibilities of the role include working on Microsoft Dynamics Navision ERP processes and customization, creating dashboards and interactive visual reports using Power BI, defining key performance indicators (KPIs) and tracking them regularly, analyzing data for decision-making, creating, testing, and deploying Power BI scripts, running DAX queries and functions, creating data documentation, constructing a data warehouse, optimizing BI systems, and utilizing filters and visualizations for data understanding.

In addition to the technical qualifications, the ideal candidate should have a strong understanding of business processes and a passion for data-driven decision-making. They should be capable of working independently and collaboratively, and be effective in communicating complex technical concepts to diverse audiences.

The ASSA ABLOY Group, a global leader in access solutions, values results over titles or backgrounds. They empower their employees to build their careers around their aspirations and support them with feedback, training, and development opportunities. With a diverse and inclusive team, the company encourages different perspectives and experiences to drive innovation and growth.
Posted 1 month ago
10.0 - 15.0 years
4 - 8 Lacs
Mumbai, Maharashtra, India
On-site
Key Responsibilities:
- Integrate data from various sources to create a comprehensive view of SMS, Flashcall, and Omnichannel performance
- Analyse large datasets to identify trends, patterns, correlations, and anomalies
- Partner with cross-functional teams to deliver automated reporting and dashboards
- Create data visualizations and reports to present data-driven insights
- Apply statistical methods and data visualization techniques to present findings
- Document data analysis processes, methodologies, and results for future reference
- Collaborate with cross-functional teams to integrate data from different sources/platforms and help create a unified view
- Conceptualize and run analyses that generate insights on traffic patterns to help further develop and improve market position (potentially short term, until platform harmonization)
- Supervise a BPO team in charge of suppliers' cost management; coach the team and support automation of the process

Data analysis and modeling:
- Extract, clean, and transform complex pricing data from various sources (CRM, sales data, market research)
- Build statistical models and predictive algorithms to forecast pricing impacts on revenue and profitability
- Analyze customer segmentation data to identify price sensitivity and optimal pricing tiers

Pricing strategy development:
- Identify market trends and competitor pricing to inform pricing strategy adjustments
- Develop and evaluate different pricing models (cost-plus, value-based, competitive) based on market analysis
- Conduct sensitivity analysis to assess the impact of pricing changes on key metrics

Business collaboration:
- Partner with sales teams to identify pricing opportunities and address customer pricing concerns
- Collaborate with marketing to align pricing strategies with product positioning and promotions
- Present data-driven insights and pricing recommendations to senior management

Reporting and visualization:
- Create comprehensive pricing reports and dashboards to monitor performance and identify areas for improvement
- Utilize data visualization tools to effectively communicate complex pricing insights to stakeholders

Skills & Qualifications:

Education: A bachelor's degree in a relevant field like statistics, computer science, or business

Technical skills:
- Advanced proficiency in SQL and data manipulation and analysis tools (e.g., Python, R)
- Experience with statistical modeling techniques (regression analysis, GLM)

Business acumen:
- Strong understanding of pricing theory and market dynamics
- Awareness of business goals and financial metrics relevant to pricing decisions
- Ability to translate data insights into actionable business recommendations

Communication skills:
- Excellent written and verbal communication skills to effectively present complex data analysis to diverse audiences
- Ability to collaborate effectively with cross-functional teams
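As a sketch of the regression/GLM work listed under technical skills, the following fits a log-log Poisson GLM with statsmodels to estimate price elasticity on simulated data; the data generation and model form are illustrative assumptions, not the team's actual methodology:

```python
import numpy as np
import statsmodels.api as sm

# Simulated price-demand data (hypothetical); true elasticity is -1.5
rng = np.random.default_rng(0)
price = rng.uniform(5, 50, 400)
units = rng.poisson(np.exp(6.0 - 1.5 * np.log(price)))

# Log-log Poisson GLM: the log-price coefficient estimates elasticity
X = sm.add_constant(np.log(price))
model = sm.GLM(units, X, family=sm.families.Poisson()).fit()
print("estimated price elasticity:", round(model.params[1], 2))
```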
Posted 1 month ago
3.0 - 8.0 years
3 - 7 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities:
- Design, write, and maintain PL/SQL code, including stored procedures, functions, packages, and triggers.
- Optimize and tune complex SQL queries for high performance.
- Work closely with business analysts and application developers to understand data requirements.
- Develop and maintain ETL processes, data interfaces, and data migration scripts.
- Perform code reviews, troubleshoot database-related issues, and implement best practices.
- Ensure data integrity, security, and performance across systems.
- Document database designs, code logic, and solutions clearly and effectively.
- Participate in unit testing and system integration testing, and support UAT efforts.

Required Skills & Experience:
- Proficient in Oracle PL/SQL, SQL, and Oracle Database (12c/19c).
- Experience writing complex queries, joins, and optimization techniques.
- Strong knowledge of data modeling, normalization, and relational database concepts.
- Experience in performance tuning and query optimization.
- Familiarity with database tools like Toad, SQL Developer, or Oracle Enterprise Manager.
- Solid understanding of SDLC and Agile methodologies.
- Working knowledge of Unix/Linux shell scripting is a plus.

Nice to Have:
- Experience with Oracle APEX, Forms/Reports, or BI tools.
- Familiarity with ETL tools like Informatica, Talend, or Oracle Data Integrator (ODI).
- Exposure to cloud-based databases (e.g., Oracle Cloud, AWS RDS).
- Knowledge of version control systems like Git or SVN.
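As an illustration of how application code might exercise the kind of PL/SQL described above, here is a sketch using the python-oracledb driver; the connection details and the `pkg_orders.load_daily_orders` procedure are hypothetical stand-ins, not a real API:

```python
import oracledb  # pip install oracledb; assumes a reachable Oracle instance

# Connection details are placeholders
conn = oracledb.connect(user="app", password="...", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Invoke a (hypothetical) stored procedure and read back an OUT parameter
status = cur.var(str)
cur.callproc("pkg_orders.load_daily_orders", [20240131, status])
print("load status:", status.getvalue())

conn.commit()
conn.close()
```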
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data Visualization Engineer at Zoetis, Inc., you will be an integral part of the pharmaceutical R&D team, contributing to the development and implementation of cutting-edge visualizations that drive decision-making in drug discovery, development, and clinical research. You will collaborate closely with scientists, analysts, and other stakeholders to transform complex datasets into impactful visual narratives that provide key insights and support strategic initiatives.

Your responsibilities will include designing and developing a variety of visualizations, ranging from interactive dashboards to static reports, to summarize key insights from high-throughput screening, clinical trial data, and other R&D datasets. You will work on implementing visual representations for pathway analysis, pharmacokinetics, omics data, and time-series trends, utilizing advanced visualization techniques and tools to create compelling visuals tailored to technical and non-technical audiences.

Collaboration with cross-functional teams will be a key aspect of your role, as you partner with data scientists, bioinformaticians, pharmacologists, and clinical researchers to identify visualization needs and translate scientific data into actionable insights. Additionally, you will be responsible for maintaining and optimizing visualization tools, building reusable components, and evaluating emerging technologies to support large-scale data analysis.

Staying updated on the latest trends in visualization technology and methods relevant to pharmaceutical research will be essential, as you apply advanced techniques such as 3D molecular visualization, network graphs, and predictive modeling visuals. You will also collaborate across the full spectrum of R&D functions, aligning technology solutions with the diverse needs of scientific disciplines and development pipelines.

In terms of qualifications, you should possess a Bachelor's or Master's degree in Computer Science, Data Science, Bioinformatics, or a related field. Experience in the pharmaceutical or biotech sectors is considered a strong advantage. Proficiency in visualization tools such as Tableau and Power BI and programming languages like Python, R, or JavaScript is required. Familiarity with data handling tools, omics and network tools, as well as dashboarding and 3D visualization tools, will also be beneficial. Soft skills such as strong storytelling ability, effective communication, collaboration with interdisciplinary teams, and analytical thinking are crucial for success in this role.

Travel requirements for this position are minimal, ranging from 0-10%. Join us at Zoetis India Capability Center (ZICC) in Hyderabad, and be part of our journey to pioneer innovation and drive the future of animal healthcare.
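To give a feel for the static-report side of this role, here is a small matplotlib sketch of a concentration-time plot of the kind pharmacokinetics work produces; the curves are simulated with invented parameters, not real study data:

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulated pharmacokinetic-style concentration curves (hypothetical)
t = np.linspace(0, 24, 200)  # hours post-dose
for dose, label in [(1.0, "low dose"), (2.0, "high dose")]:
    conc = dose * (np.exp(-0.1 * t) - np.exp(-1.2 * t))  # toy 1-compartment model
    plt.plot(t, conc, label=label)

plt.xlabel("Time (h)")
plt.ylabel("Plasma concentration (a.u.)")
plt.title("Mean concentration-time profiles")
plt.legend()
plt.savefig("pk_profiles.png", dpi=150)  # static report asset
```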
Posted 1 month ago