0.0 years
0 Lacs
gurgaon, haryana, india
Remote
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion: it's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Sales Analytics Specialist is a seasoned subject matter expert, accountable for driving the success of sales operations through comprehensive data analysis, valuable insights, and strategic decision support. This role operates within a multifaceted environment, encompassing various sales analytics domains, and involves collaborating with cross-functional teams. The primary responsibility of the Sales Analytics Specialist is to provide data-driven support for business planning and strategic decision-making by leveraging a seasoned understanding of the business context.

Key responsibilities:
- Participates in tactical and strategic projects with cross-functional virtual teams to achieve specific business objectives.
- Analyzes complex business problems and issues using internal and external data to provide insights to decision-makers.
- Identifies and interprets trends and patterns in relevant datasets, locates influences, and proactively produces meaningful quantitative analysis for business stakeholders to inform business decisions.
- Creates documented specifications for reports and analysis based on business needs and required or available data elements.
- Defines, develops, enhances, and tracks metrics and dashboard requirements to deliver results and provide insight and recommendations on trends.
- Validates data using advanced data analysis and tools to ensure analytics are valid, meaningful, and provide actionable and comprehensive insights.
- Supports the team in answering strategic questions, making insightful data-driven business decisions, and properly designing new initiatives.
- Provides technical advice, consultation, and knowledge to others within the relevant teams.
- Creates relevant reports and presents on trends to convey actionable insights to stakeholders.
- Performs any other related task as required.

Knowledge, Skills and Attributes:
- Seasoned understanding of advanced data analysis techniques, and the ability to uncover strategic insights from data.
- Seasoned understanding of the business's sales objectives, market dynamics, and industry trends, with the ability to align data analysis with strategic objectives.
- Seasoned collaboration skills, enabling effective teamwork with cross-functional teams and senior management.
- Seasoned ability to translate complex data insights into actionable, understandable strategies for non-technical stakeholders.
- Excellent communication and presentation skills to convey complex data findings in a clear and actionable manner to non-technical stakeholders.
- Seasoned understanding of data analysis tools including advanced Excel, advanced PowerBI, and at least one relevant coding language (for example DAX, R, or Python).
- Seasoned understanding of Structured Query Language (SQL) for managing and querying relational databases.
- Seasoned knowledge of techniques for transforming and structuring data for analysis, and knowledge of ETL processes for data extraction and transformation.
- Advanced understanding of data security and privacy best practices, especially when working with sensitive or confidential data.
- Seasoned understanding of the business domain and industry, and the ability to translate data insights into actionable business recommendations.

Academic qualifications and certifications:
- Bachelor's degree or equivalent in Data Science or a related field.
- Relevant sales analytics certifications desirable.

Required experience:
- Seasoned demonstrated experience in a sales or marketing function as a data analyst.
- Seasoned demonstrated experience in using PowerBI, statistical and quantitative analysis techniques, data visualization, and data analysis.
- Seasoned proven track record in creating and optimizing reports and dashboards that contribute to strategic decision support.
- Seasoned proven experience in providing data-driven support for business planning and strategic decision-making.

Workplace type: Remote Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
hyderabad, telangana, india
On-site
Data Engineer

Position Overview
Role Summary: We are searching for a talented and motivated Data Engineer to join our team. The ideal candidate will have expertise in data modeling, analytical thinking, and developing ETL processes using Python. In this role, you will be pivotal in transforming raw data from landing tables into reliable, curated master tables, ensuring accuracy, accessibility, and integrity within our Snowflake data platform.

Experience: Overall 6+ years of experience, with 4+ years of relevant experience.

Main Responsibilities
- Design, Develop, and Maintain ETL Processes: Build and maintain scalable ETL pipelines in Python to extract, transform, and load data into Snowflake master tables. Automate data mastering, manage incremental updates, and ensure consistency between landing and master tables.
- Data Modeling: Create and optimize logical and physical data models in Snowflake for efficient querying and reporting. Translate business needs into well-structured data models, defining tables, keys, relationships, and constraints.
- Analytical Thinking and Problem Solving: Analyze complex datasets, identify trends, and work with analysts and stakeholders to resolve data challenges. Investigate data quality issues and design robust solutions aligned with business goals.
- Data Quality and Governance: Implement routines for data validation, cleansing, and error handling to ensure accuracy and reliability in Snowflake. Support the creation and application of data governance standards.
- Automation and Optimization: Seek automation opportunities for data engineering tasks, enhance ETL processes for performance, and scale systems as data volumes grow within Snowflake.
- Documentation and Communication: Maintain thorough documentation of data flows, models, transformation logic, and pipeline configurations. Clearly communicate technical concepts to all stakeholders.
- Collaboration: Work closely with data scientists, analysts, and engineers to deliver integrated data solutions, contributing to cross-functional projects with your data engineering expertise.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, IT, Engineering, Mathematics, or a related field
- At least 2 years of experience as a Data Engineer or in a similar role
- Strong Python skills, including experience developing ETL pipelines and automation scripts
- Solid understanding of relational and dimensional data modeling
- Experience with Snowflake for SQL, schema design, and managing pipelines
- Proficient in SQL for querying and data analysis in Snowflake
- Strong analytical and problem-solving skills
- Familiarity with data warehousing and best practices
- Knowledge of data quality, cleansing, and validation techniques
- Experience with version control systems like Git and collaborative workflows
- Excellent communication, both verbal and written

Preferred Qualifications
- In-depth knowledge of Snowflake features like Snowpipe, Streams, Tasks, and Time Travel
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with workflow orchestration tools like Apache Airflow or Luigi
- Understanding of big data tools like Spark, Hadoop, or distributed databases
- Experience with CI/CD pipelines in data engineering
- Background in streaming data and real-time processing
- Experience deploying data pipelines in production

Sample Responsibilities in Practice
- Develop automated ETL pipelines in Python to ingest daily CSVs into a Snowflake landing table, validate data, and merge clean records into a master table, handling duplicates and change tracking.
- Design scalable data models in Snowflake to support business intelligence reporting, ensuring both integrity and query performance.
- Collaborate with business analysts to adapt data models and pipelines to evolving needs.
- Monitor pipeline performance and troubleshoot inconsistencies, documenting causes and solutions.
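The landing-to-master merge described in the sample responsibilities can be sketched in plain Python. The record layout, validation rule, and change-log statuses below are illustrative assumptions, not part of the posting; a production pipeline would typically issue a Snowflake MERGE statement through the Python connector rather than merging in memory.

```python
# Hypothetical sketch: validate landing rows, then upsert them into a
# master table keyed by a business key, skipping exact duplicates and
# recording what happened to each row.

def validate(record):
    """A record is valid if it has a non-empty id and a numeric amount."""
    if not record.get("id"):
        return False
    try:
        float(record["amount"])
        return True
    except (KeyError, ValueError):
        return False

def merge_into_master(master, landing_rows):
    """Upsert validated landing rows into master; return a change log."""
    changes = []
    for row in landing_rows:
        if not validate(row):
            changes.append(("rejected", row.get("id")))
            continue
        key = row["id"]
        if key in master and master[key] == row:
            changes.append(("duplicate", key))   # unchanged duplicate: skip
        elif key in master:
            master[key] = row                    # changed record: update
            changes.append(("updated", key))
        else:
            master[key] = row                    # new record: insert
            changes.append(("inserted", key))
    return changes

master = {"A1": {"id": "A1", "amount": "10.0"}}
landing = [
    {"id": "A1", "amount": "10.0"},   # exact duplicate
    {"id": "A1", "amount": "12.5"},   # changed record
    {"id": "B2", "amount": "7.0"},    # new record
    {"id": "", "amount": "x"},        # fails validation
]
log = merge_into_master(master, landing)
print(log)
```

The change log doubles as the change-tracking record the posting asks for; in Snowflake the same semantics map onto the MATCHED / NOT MATCHED clauses of MERGE.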
Key Skills and Competencies
- Technical Skills: Python (including pandas, SQLAlchemy); Snowflake SQL and management; schema design; ETL process development
- Analytical Thinking: Ability to translate business requirements into technical solutions; strong troubleshooting skills
- Collaboration and Communication: Effective team player; clear technical documentation
- Adaptability: Willingness to adopt new technologies and proactively improve processes

Our Data Environment
Our organization manages diverse data sources, including transactional systems, third-party APIs, and unstructured data. We are dedicated to building a top-tier Snowflake data infrastructure for analytics, reporting, and machine learning. In this role, you will influence our data architecture, implement modern data engineering practices, and contribute to a culture driven by data.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
Agoda is an online travel booking platform that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, along with flights, activities, and more. As part of Booking Holdings and based in Asia, Agoda's diverse team of 7,100+ employees from 95+ nationalities in 27 markets fosters an environment rich in creativity and collaboration. The company's culture of experimentation and ownership enhances the customer experience, allowing them to explore the world. The primary purpose of Agoda is to bridge the world through travel, believing that it enables people to learn, enjoy, and experience the world while fostering empathy, understanding, and happiness. The team at Agoda is driven by a passion to make an impact, leveraging innovative technologies and strong partnerships to make travel easy and rewarding for everyone. The Data department at Agoda is responsible for managing all data-related requirements, aiming to increase the use of data within the company through creative approaches and the implementation of powerful resources. The team hires talented individuals from around the world, equipping them with the knowledge and tools necessary for personal growth and success while upholding the company's culture of diversity and experimentation. The Data team plays a critical role in empowering decision-making processes for various stakeholders within the organization and improving the customer search experience. As a member of the Database Development team at Agoda, you will be involved in database design, data management, and database development integrated with automated Continuous Integration/Continuous Delivery (CI/CD) pipelines. Working closely with product teams, you will provide support and solutions that uphold technical excellence and meet business needs. 
Your responsibilities in this role will include assisting in designing database schema and architecture, developing high-quality database solutions, optimizing database SQL code for performance, collaborating with product and other teams, utilizing automated database CI/CD pipelines, suggesting enhancements for services and processes, staying updated on advancements in database technology, and understanding database architecture complexities. To succeed in this role, you will need at least 4 years of experience in database development with platforms like Microsoft SQL or Oracle, a Bachelor's degree in Computer Science or a related field, proficiency in relational databases and advanced SQL queries, a strong understanding of MSSQL Server with query optimization skills, excellent English communication skills, problem-solving abilities, a collaborative mindset, and the ability to work effectively in a multicultural team environment. Having knowledge of CI/CD frameworks, automation platforms like Gitlab, Agile methodologies, NoSQL databases, data warehousing, ETL processes, and business intelligence tools will be advantageous for this role. Agoda is an equal opportunity employer and values diversity in its workforce. Your application will be kept on file for future vacancies, and you can request to have your details removed at any time. For more information, please refer to our privacy policy.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Architect, you will be responsible for leading data-related projects in the field of Reporting and Analytics. With over 10 years of relevant work experience, you will design, build, and maintain scalable data lakes and data warehouses in the cloud, particularly on Google Cloud Platform (GCP). Your expertise will be crucial in gathering business requirements, analyzing business needs, and defining the BI/DW architecture to deliver technical solutions for complex business and technical requirements. You will create solution prototypes, participate in technology selection, and perform POCs and technical presentations.

In this role, you will architect, develop, and test scalable data warehouse and data pipeline architectures using cloud technologies on GCP. Your experience in SQL and NoSQL DBMS such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB will be essential. You will design and develop scalable ETL processes, including error handling, and demonstrate proficiency in query and programming languages like MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, and R. Additionally, you will prepare data structures for advanced analytics and self-service reporting using tools like MS SQL, SSIS, and SSRS. Experience with cloud-based technologies such as PowerBI, Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake, AWS RedShift, Glue, Athena, AWS Quicksight, and Google Cloud Platform will be beneficial.

In addition, familiarity with an Agile development environment, pairing DevOps with CI/CD pipelines, and an AI/ML background are considered good to have for this role.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Lead Data Analyst at Ascentt, you will be responsible for leading the end-to-end analytics lifecycle for cost-focused projects. This includes defining key objectives, delivering insights, and making recommendations to senior stakeholders. Your role will involve contributing to analytics initiatives with a focus on cost metrics to ensure alignment with business goals.

You will be tasked with defining and implementing robust data models to ensure the scalability and accuracy of cost metrics for reporting and forecasting. Additionally, you will design, measure, and monitor cost-related Key Performance Indicators (KPIs) such as cost-per-unit, cost-of-service, budget utilization, and return on investment (ROI). Your responsibilities will also include creating dashboards and reports that effectively communicate cost metrics and trends to stakeholders, enabling data-driven decisions.

In this role, you will conduct advanced analysis through exploratory data analysis, trend forecasting, and scenario modeling to identify cost-saving opportunities and potential risks. You will collaborate closely with data engineering and governance teams to ensure data quality, integrity, and compliance. Furthermore, your role will involve analyzing datasets to uncover trends, patterns, and insights that help the business better understand cost dynamics.

Collaboration across teams is essential, as you will work closely with Finance, Operations, and other departments to align analytics with organizational needs and goals. You will partner with data engineers and other team members to ensure the accuracy and reliability of data. Additionally, you will share knowledge and insights with team members to contribute to team growth and foster a collaborative and innovative work environment.

To qualify for this role, you should hold a Bachelor's or Master's degree in Data Science, Economics, Finance, Statistics, or a related field.
You should have 8+ years of experience in data analytics, with demonstrated expertise in cost analytics, financial modeling, and cost optimization. Proficiency in data analysis tools and languages such as SQL, Python, or R is required, along with hands-on experience with BI tools like Tableau, Power BI, or Looker. A strong understanding of database systems, data warehousing, and ETL processes is essential. You should possess a strong analytical mindset with the ability to transform complex data into actionable insights. Excellent written and verbal communication skills are necessary, with experience in presenting findings to stakeholders. The ability to manage multiple priorities and deadlines in a fast-paced environment is also crucial for success in this role.
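The cost KPIs this posting names (cost-per-unit, budget utilization, ROI) follow standard definitions, which can be sketched directly; the figures below are made up for the example and are not from the posting.

```python
# Illustrative sketch of the standard cost-KPI formulas named above.

def cost_per_unit(total_cost, units):
    """Total cost divided by units produced or served."""
    return total_cost / units

def budget_utilization(actual_spend, budget):
    """Fraction of the allocated budget that has been consumed."""
    return actual_spend / budget

def roi(gain, cost):
    """Return on investment: net gain relative to the cost incurred."""
    return (gain - cost) / cost

print(cost_per_unit(50000, 2000))         # 25.0 per unit
print(budget_utilization(90000, 120000))  # 0.75, i.e. 75% of budget used
print(roi(180000, 120000))                # 0.5, i.e. 50% return
```

In practice these would be computed per period or per cost center in SQL or a BI tool; the formulas themselves are the same.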
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Visualization Engineer at Zoetis, Inc., you will be part of the pharmaceutical R&D team responsible for creating impactful visualizations that drive decision-making in drug discovery, development, and clinical research. Your role will involve collaborating with scientists, analysts, and other stakeholders to translate complex datasets into clear and actionable visual insights.

Your key responsibilities will include:
- Designing and developing interactive and static visualizations for exploratory, descriptive, comparative, and predictive analyses.
- Creating dashboards and reports summarizing key insights from high-throughput screening, clinical trial data, and other R&D datasets.
- Implementing visual representations for pathway analysis, pharmacokinetics, omics data, and time-series trends.
- Collaborating with cross-functional teams to identify visualization needs and tailor visual insights for technical and non-technical audiences.
- Building reusable visualization components and frameworks to support large-scale data analysis.
- Evaluating and recommending tools and platforms for effective data visualization, including emerging technologies.
- Integrating, cleaning, and structuring datasets for visualization purposes in alignment with pharmaceutical R&D standards, compliance, and security requirements.
- Staying updated on the latest trends in visualization technology and applying advanced techniques like 3D molecular visualization, network graphs, and predictive modeling visuals.

You will work closely with various teams within Zoetis, including Animal Health Research & Development and Zoetis Tech & Digital, to align technology solutions with the diverse needs of scientific disciplines and development pipelines. While you will not have direct reports, you will have matrix leadership responsibilities within each project team and may manage project resources onboarded externally.
The ideal candidate for this role should have a Bachelor's or Master's degree in Computer Science, Data Science, Bioinformatics, or a related field. Experience in the pharmaceutical or biotech sectors is a strong plus. You should possess expertise in visualization tools such as Tableau, Power BI, Plotly, ggplot2, Matplotlib, Seaborn, D3.js, or equivalent, as well as proficiency in programming languages like Python, R, or JavaScript. Experience with SQL, Pandas, NumPy, and ETL processes is also required. Soft skills such as strong storytelling ability, effective communication, collaboration with interdisciplinary teams, and analytical thinking are essential for success in this role. Travel requirements are minimal, ranging from 0-10%. Join Zoetis in advancing care for animals and contribute to pioneering innovation in the field of animal healthcare.
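One small data-preparation step behind the time-series trend views this posting mentions is smoothing noisy measurements before charting them. A minimal sketch, assuming a trailing moving average (window size and readings are illustrative, not from the posting):

```python
# Hypothetical sketch: trailing moving average used to smooth a noisy
# series of measurements before plotting a trend line.

def moving_average(values, window):
    """Trailing moving average; the first window-1 points have no value (None)."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)          # not enough history yet
        else:
            chunk = values[i + 1 - window : i + 1]
            out.append(sum(chunk) / window)
    return out

readings = [2.0, 4.0, 6.0, 8.0, 10.0]
print(moving_average(readings, 3))  # [None, None, 4.0, 6.0, 8.0]
```

The smoothed series would then feed a Matplotlib or Plotly line chart alongside the raw points; in pandas the same step is `Series.rolling(window).mean()`.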
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Tableau Developer, you will be responsible for designing, developing, and optimizing data visualizations and business intelligence (BI) solutions using Tableau. Your role will involve collaborating with business stakeholders, data analysts, and engineers to convert complex datasets into actionable insights.

Your key responsibilities will include designing, developing, and maintaining interactive dashboards and visual reports in Tableau. You will optimize Tableau dashboards for performance and usability, and work with SQL databases and data sources for data extraction, cleaning, and transformation. Additionally, you will collaborate with business teams to gather requirements and translate them into BI solutions, implement best practices in data visualization and reporting, and troubleshoot Tableau performance issues. Furthermore, you will be expected to mentor junior developers, provide technical leadership, and ensure the successful implementation of BI solutions.

Your qualifications should include a minimum of 5 years of experience in Tableau development, strong SQL and data modeling skills, experience in integrating Tableau with cloud and on-premise databases, knowledge of ETL processes and data warehousing, and the ability to work with large datasets and optimize queries. If you possess the requisite skills in Tableau Desktop, Tableau Server, SQL, Data Visualization, Dashboard Development, Python, ETL Processes, and Data Analysis, we encourage you to apply for this exciting opportunity to showcase your expertise in transforming data into valuable insights and driving business decision-making processes.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
andhra pradesh
On-site
As a Database Architect at our organization, you will be responsible for designing and implementing robust, scalable, and secure database systems tailored to meet the business requirements. Your key responsibilities will include developing and optimizing database structures, collaborating with cross-functional teams to understand data requirements, and creating efficient database solutions. It will be essential for you to monitor database performance, troubleshoot issues, and implement performance tuning techniques. In this role, you will define and enforce database standards, policies, and procedures to ensure consistency and reliability across the organization. You will also be involved in data migration, integration, and ensuring data integrity across different platforms. Additionally, you will work on backup, recovery, and disaster recovery strategies for databases to ensure high availability and business continuity. As a Database Architect, you will be expected to research and implement new database technologies and techniques to optimize business processes and support growth. You will review database design and implementation to ensure compliance with best practices, security standards, and regulations such as GDPR. Conducting regular audits of database systems and providing recommendations for improvements will also be part of your responsibilities. To qualify for this role, you should have proven experience as a Database Architect or a similar role, with strong knowledge of database technologies including SQL, NoSQL, relational databases, and distributed databases. Proficiency in database design, performance tuning, troubleshooting, and experience with cloud database solutions and containerized databases will be beneficial. Expertise in data modeling, schema design, and relational database management systems is essential. 
Preferred qualifications include a Bachelor's degree in Computer Science or a related field, experience with big data technologies, familiarity with database automation tools, and knowledge of data governance and compliance standards. Strong analytical, problem-solving, and communication skills are key requirements for this role. If you thrive in a collaborative, fast-paced environment and have 5-9 years of relevant experience, we would like to hear from you. This is a full-time position located in Visakhapatnam. If you meet the requirements and are ready to take on this challenging role, we encourage you to apply for Job ID 1007.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. And EY is counting on your unique voice and perspective to help EY become even better. Join us and build an exceptional experience for yourself while contributing to a better working world for all.

As a PSCM Supervising Associate/Assistant Manager, you will play a critical role in building insightful dashboards that drive decision-making for the Experience Management team, business leadership, and Talent leadership. Your expertise in data engineering and reporting will be essential in transforming business requirements into actionable insights.

EY Global Delivery Services (EY GDS) is a dynamic and global service delivery network of over 75,000 professionals working across borders to provide innovative and strategic business solutions to EY member firms and EY clients globally. The team is integral to enhancing operational efficiency and supporting data-driven decision-making across the organization. We offer a great place to work for every person joining EY. At GDS, you will have the opportunity to develop your professional skills in a truly global environment. You will learn and gain experiences from industry-leading professionals and a supportive leadership team.

**Your Key Responsibilities:**
- Understanding business requirements from stakeholders and translating them into effective Power BI dashboards for reporting.
- Collaborating with the Experience Management team, business leadership, and Talent leadership to gather insights and refine reporting needs.
- Utilizing SQL and data engineering techniques to extract, transform, and load data from various databases into Power BI.
- Designing and developing interactive dashboards and reports that provide actionable insights and enhance decision-making processes.
- Ensuring data accuracy and integrity in reporting by implementing best practices in data management and governance.
- Providing training and support to end-users on how to effectively utilize dashboards and interpret data insights.
- Continuously monitoring and optimizing dashboard performance to ensure a seamless user experience.

**Skills And Attributes For Success:**
The ideal candidate will possess:
- Strong technical skills in Power BI, SQL, and data engineering.
- Experience with database management and data visualization techniques.
- Excellent analytical skills with the ability to interpret complex data sets and provide actionable insights.
- Strong communication and interpersonal skills to collaborate effectively with various stakeholders.
- A proactive approach to problem-solving and a keen attention to detail.

**To Qualify for This Role You Must Have:**
- 5+ years of relevant experience in data analytics, reporting, or a related field.
- Proficiency in Power BI and SQL, with a solid understanding of data modeling and visualization principles.
- Familiarity with ETL processes and data integration from multiple sources.
- A working understanding of database management systems and data warehousing concepts.
- Excellent verbal and written communication skills.

**What We Look For:**
We are looking for committed, self-motivated, and driven professionals with a proven track record in data analytics. The successful candidate will have a strong analytical mindset and the ability to partner with business leaders to understand their reporting needs and provide effective solutions. A focus on continuous improvement and a culture of innovation is essential.

**What We Offer:**
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across multiple global locations and with teams from all EY service lines, playing a vital role in the delivery of the EY growth strategy.
In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching, and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY exists to build a better working world, helping to create long-term value for clients, people, and society while building trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Manager, Data Architecture at McDonald's Corporation in Hyderabad, you will be responsible for designing and managing data architectures that ensure seamless integration, quality, and governance of enterprise data systems to support business objectives.

Your primary responsibilities will include designing, implementing, and overseeing scalable data architectures to support enterprise data systems. You will collaborate with engineers to implement ETL/ELT processes, support data integration from various sources, and work on maintaining data quality and governance to meet business and regulatory requirements. Additionally, you will work on aligning data architecture with business objectives, evaluating and integrating new data technologies, and troubleshooting data issues and performance bottlenecks.

To excel in this role, you should have proficiency in data modelling, database design, and data integration techniques. Experience with data architecture frameworks and tools such as TOGAF, ER/Studio, and Talend is essential. Strong SQL skills, knowledge of cloud data services, and big data concepts are important. You should also have a solid understanding of data governance, quality, and compliance standards, and the ability to communicate technical concepts clearly. Ideally, you should have a background in Computer Science, Information Systems, or a related field with a bachelor's degree or higher. A minimum of 5 years of professional experience in data architecture or a related field is required.

This is a full-time role based in Hyderabad, India, with a hybrid work mode. Join us at McDonald's Corporation to contribute to impactful solutions for the business and customers across the globe through innovative data architecture.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Solution Manager at Entytle, you will play a crucial role in implementing architecture for our esteemed customers, who are leading equipment manufacturers globally. You will leverage AI-driven Installed Base Intelligence to drive growth in the Aftermarket sector. Your responsibilities will include:
- Proficiency in SQL to handle large datasets and write complex queries.
- Experience in utilizing visualization tools like PowerBI for data analysis and reporting.
- Knowledge of ETL processes and data mining techniques.
- Understanding and interpreting customer business requirements to design and architect solutions for Data Analysis projects.
- Deep expertise in database design, modelling, and governance.
- Leading and mentoring teams on technical implementations of solutions across the organization.
- Familiarity with performance modelling techniques and hands-on experience in ETL processes.
- Ideally, 5+ years of experience in the field and a strong grasp of the Manufacturing domain.
In addition to the above responsibilities, you should have:
- Proficiency in data visualization tools, particularly Power BI, with at least 2 years of experience.
- Experience working with databases such as PostgreSQL, SQL Server, and MySQL.
- Knowledge of DataPrep tools will be considered an advantage.
This position requires occasional client interactions in the USA and Europe, with less than 20% travel expected. The ideal candidate should have a minimum of 8 years of experience in similar roles. This is a full-time position based in Pune, Maharashtra, with the workplace type being Work from Office.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
Cprime, a Goldman Sachs and Everstone Capital portfolio company, is not just a full-service consulting firm - we are your strategic partner for driving innovation and agility in your business. Trusted globally, Cprime provides strategic and technical consulting, coaching, and training to businesses at the forefront of digital transformation. With over two decades of experience, we have refined our expertise to assist organizations in adapting to rapidly changing market dynamics. In a world where every business relies heavily on software, embracing change is essential to avoid being left behind. We are more than mere consultants; we are passionate problem solvers dedicated to helping your organization thrive in a technology-driven world. Our environment fosters innovation, encourages growth, and celebrates diversity. We challenge each other continuously to work smarter and embrace new ideas, offering our team members flexibility, collaborative opportunities, and a fun work atmosphere.
We are currently looking for a skilled PowerBI Senior Developer Analyst to join our team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence solutions utilizing Microsoft PowerBI. This role demands a strong analytical mindset, exceptional problem-solving abilities, and the capacity to work collaboratively with cross-functional teams to deliver data-driven insights.
**What You Will Do:**
- Design, develop, and deploy PowerBI reports and dashboards to provide actionable insights to stakeholders.
- Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications.
- Optimize PowerBI solutions for performance and scalability.
- Ensure data accuracy and integrity by implementing data validation and quality checks.
- Provide training and support to end-users on PowerBI tools and functionalities.
- Stay updated with the latest PowerBI features and industry trends to enhance reporting capabilities.
- Troubleshoot and resolve issues related to PowerBI reports and dashboards.
- Document processes, methodologies, and best practices for PowerBI development.
**Qualifications And Skills:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Overall 8-10 years of experience, with a minimum of 5 years as a PowerBI Developer or in a similar role.
- Relevant MS PowerBI Developer certifications.
- Strong proficiency in PowerBI, including DAX and Power Query.
- Experience with data modeling, data warehousing, and ETL processes.
- Familiarity with SQL and database management systems is preferred.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Experience with other BI tools (e.g., Tableau, Qlik) is a plus.
**What We Believe In:**
At Cprime, we believe in promoting social justice action internally, within the industry, and in our communities. We view part of our mission as expanding the minds, hearts, and opportunities of our Cprime teammates and the broader community to include those who have been historically marginalized.
**Equal Employment Opportunity Statement**
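Several of the listings on this page call for "implementing data validation and quality checks" before data reaches a report. As a minimal, hypothetical sketch in Python (the column names and rules below are illustrative, not taken from any listing):

```python
# Minimal data-quality check: split records into valid and rejected rows,
# recording a reason for each rejection. Column names are illustrative.
def validate_rows(rows):
    valid, rejected = [], []
    for row in rows:
        if not row.get("order_id"):
            rejected.append((row, "missing order_id"))
        elif not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            rejected.append((row, "invalid amount"))
        else:
            valid.append(row)
    return valid, rejected

records = [
    {"order_id": "A1", "amount": 120.5},
    {"order_id": "",   "amount": 40.0},   # fails: empty order_id
    {"order_id": "A3", "amount": -7},     # fails: negative amount
]
valid, rejected = validate_rows(records)
print(len(valid), len(rejected))  # 1 2
```

In practice the rejected rows and their reasons would be logged or routed to a quarantine table rather than silently dropped.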
Posted 2 weeks ago
3.0 - 10.0 years
0 Lacs
maharashtra
On-site
As a Spark Developer, you will play a crucial role in designing, developing, and optimizing large-scale data processing pipelines using Apache Spark. Your responsibilities will include designing and implementing scalable, high-performance data pipelines, optimizing Spark applications for efficient processing of large datasets, collaborating with cross-functional teams to understand data requirements, integrating Spark jobs with various data sources, monitoring and troubleshooting Spark jobs, writing clean and well-documented code, implementing best practices for data processing, security, and storage, and staying up to date with the latest trends in big data technologies.
To excel in this role, you should have 6 to 10 years of experience as a Senior Developer or 3 to 6 years of experience as a Junior Developer in big data development, with hands-on experience in Apache Spark. Proficiency in programming languages like Python, Java, or Scala is essential, along with a strong understanding of distributed computing concepts and frameworks. Experience with Hadoop, Hive, or related big data technologies, familiarity with cloud platforms such as AWS, Azure, or GCP, knowledge of ETL processes and tools, and experience in performance tuning and optimization of Spark jobs are also required.
Preferred qualifications for this position include experience with real-time data processing using Spark Streaming or Apache Kafka, knowledge of SQL and database design principles, familiarity with machine learning workflows and libraries, certification in big data technologies or cloud platforms, and a location preference for Mumbai.
If you possess strong problem-solving skills, analytical thinking, and a passion for working with big data technologies, we encourage you to apply for this exciting opportunity.
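The Spark pipelines described above typically chain filter, map, and aggregate stages. As a rough local sketch of that shape in plain Python (this is not Spark code; the Spark stage names filter / map / reduceByKey are mirrored only conceptually, and the event data is invented):

```python
from collections import defaultdict

# Local sketch of a filter -> map -> aggregate pipeline, mirroring the shape
# of a typical Spark job without any distribution. Data is illustrative.
events = [
    ("click", "home", 1), ("view", "home", 1),
    ("click", "cart", 1), ("click", "home", 1),
]

clicks = (e for e in events if e[0] == "click")   # filter stage
pairs = ((page, n) for _, page, n in clicks)      # map stage
totals = defaultdict(int)
for page, n in pairs:                             # aggregate stage
    totals[page] += n

print(dict(totals))  # {'home': 2, 'cart': 1}
```

In Spark the same three stages would run partitioned across executors, which is why keeping each stage a pure transformation matters for scalability.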
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a skilled Tableau Dashboard Developer with expertise in BigQuery and Business Intelligence (BI), you will be responsible for developing and maintaining interactive dashboards and reports using Tableau. Your role will involve working with BigQuery to design and optimize data models for efficient data extraction and visualization. Collaborating with business stakeholders to understand their requirements and translate them into effective BI solutions will be a key aspect of your responsibilities.
You must have 3-6 years of hands-on experience in Tableau development and Business Intelligence, along with strong proficiency in SQL and BigQuery. Your ability to optimize SQL queries for performance and implement data visualization best practices will be crucial in ensuring clarity, accuracy, and user engagement. Experience in ETL processes, working with large datasets in cloud environments, and integrating Tableau with various data sources will also be required.
In addition, you will leverage Google Connected Sheets and other BI tools for enhanced data accessibility and collaboration. Ensuring data governance, accuracy, and compliance by implementing best practices in data management will be essential. Your analytical and problem-solving skills, attention to detail, and excellent communication skills will enable you to collaborate effectively with business teams and translate data insights into actionable strategies.
Preferred qualifications include experience in Python for data manipulation, exposure to other BI tools such as Looker, Power BI, or Redash, and an understanding of data security and governance best practices. If you are a proactive individual with a passion for developing high-quality visualizations and driving data-driven decision-making, we invite you to join our team for a 6-month contract role as a Tableau Dashboard Developer with expertise in BigQuery and BI.
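SQL work against BigQuery of the kind described above leans heavily on window functions. The pattern can be sketched with the `sqlite3` module from the Python standard library (the table and columns are hypothetical; BigQuery's window syntax is essentially the same):

```python
import sqlite3

# Rank each region's orders by revenue with a window function -- the same
# PARTITION BY / ORDER BY pattern used in BigQuery. Schema is hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, revenue REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("north", 100), ("north", 250), ("south", 80)])
rows = con.execute("""
    SELECT region, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
    FROM orders
""").fetchall()
print(rows)
```

The top-ranked row per partition (`rnk = 1`) is the usual building block for "best per group" dashboards.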
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
bhopal, madhya pradesh
On-site
Our company is currently seeking a skilled PostgreSQL Database Developer to join our team. As a PostgreSQL Database Developer, you will be responsible for designing, implementing, and maintaining our database systems. You should possess a solid background in database development, performance optimization, and data modeling.
Your responsibilities will include designing, implementing, and maintaining database schemas in PostgreSQL, ensuring efficiency, reliability, and scalability. You will optimize SQL queries for enhanced performance, identify and resolve performance bottlenecks, and manage data migration and integration processes while maintaining data consistency and integrity.
In addition, you will be developing and maintaining stored procedures, functions, and triggers to support application requirements, implementing database security policies, managing user roles and permissions, and overseeing database backup and recovery processes to ensure data availability and reliability. Collaboration with cross-functional teams such as application developers, system administrators, and business analysts is essential to understand database requirements.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Database Developer focusing on PostgreSQL.
- Thorough knowledge of database design principles, normalization, and data modeling.
- Proficiency in writing and optimizing SQL queries.
- Experience with performance tuning and query optimization techniques.
- Familiarity with database security best practices and access control.
- Hands-on experience with data migration, integration, and ETL processes.
- Proficiency in scripting languages (e.g., Python, Bash) for automation.
- Knowledge of backup and recovery processes.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
Preferred Skills:
- Experience with PostgreSQL replication and clustering.
- Familiarity with NoSQL databases.
- Knowledge of cloud database solutions (e.g., AWS RDS, Azure Database for PostgreSQL).
- Understanding of DevOps practices and tools.
If you meet the qualifications and are looking to contribute your expertise to our dynamic team, we invite you to apply for the PostgreSQL Database Developer position with us.
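The listing above asks for triggers that support application requirements; audit logging is the classic case. A PostgreSQL trigger would be written in PL/pgSQL, but the same idea can be shown runnably with the standard library's SQLite (the schema below is invented for illustration):

```python
import sqlite3

# Audit-trigger sketch: every balance update is recorded automatically.
# In PostgreSQL this would be a PL/pgSQL trigger function; SQLite's trigger
# syntax is close enough to show the mechanism. Schema is hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);
    CREATE TRIGGER log_balance_change
    AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
""")
con.execute("INSERT INTO accounts VALUES (1, 100.0)")
con.execute("UPDATE accounts SET balance = 150.0 WHERE id = 1")
log = con.execute("SELECT * FROM audit_log").fetchall()
print(log)  # [(1, 100.0, 150.0)]
```

Because the trigger fires inside the same transaction as the update, the audit row cannot be lost or skipped by application code.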
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a skilled and motivated Data Engineer to become a part of our dynamic team. The ideal candidate should possess experience in designing, developing, and maintaining scalable data pipelines and architectures utilizing technologies such as Hadoop, PySpark, ETL processes, and Cloud platforms. As a Senior Data Engineer with 4-8 years of experience, you will be responsible for designing, developing, and maintaining data pipelines to process large-scale datasets. You will play a crucial role in building efficient ETL workflows to transform and integrate data from various sources. Additionally, you will be tasked with developing and optimizing Hadoop and PySpark applications for data processing while ensuring data quality, governance, and security standards are adhered to across systems. In this role, you will also be required to implement and manage Cloud-based data solutions using platforms such as AWS, Azure, or GCP. Collaboration with data scientists and analysts to support business intelligence initiatives will be an integral part of your responsibilities. Troubleshooting performance issues and optimizing query executions in big data environments will be another key aspect of your role. It is essential to stay updated with industry trends and advancements in big data and cloud technologies to contribute effectively to the team. The ideal candidate should possess strong programming skills in Python, Scala, or Java. Hands-on experience with the Hadoop ecosystem (HDFS, Hive, Spark, etc.) and expertise in PySpark for distributed data processing are required. Proficiency in ETL tools and workflows such as SSIS, Apache Nifi, or custom pipelines is essential. Experience with Cloud platforms (AWS, Azure, GCP) and their data-related services is a must. Knowledge of SQL and NoSQL databases, along with familiarity with data warehousing concepts and data modeling techniques, is desired. 
Strong analytical and problem-solving skills will be beneficial in this role. If you are interested in this opportunity, please contact us at +91 7305206696 or email us at saranyadevib@talentien.com.
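The ETL workflows this listing names follow the extract, transform, load shape. A toy end-to-end sketch using only the standard library (the CSV content, field names, and target table are made up):

```python
import csv, io, sqlite3

# Toy ETL: extract rows from CSV text, transform (normalize names, cast
# types), load into a SQLite table. Source data and schema are illustrative.
raw = "name,revenue\n alice ,100\nBOB,250\n"

rows = list(csv.DictReader(io.StringIO(raw)))               # extract
clean = [(r["name"].strip().title(), float(r["revenue"]))   # transform
         for r in rows]

con = sqlite3.connect(":memory:")                           # load
con.execute("CREATE TABLE sales (name TEXT, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(revenue) FROM sales").fetchone()[0]
print(total)  # 350.0
```

Tools like SSIS or Apache NiFi mentioned in the listing orchestrate exactly these three stages at scale, adding scheduling, retries, and lineage on top.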
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for fostering effective collaboration with diverse teams across various functions and regions. Your primary tasks will include collecting and analyzing large datasets to identify patterns, trends, and insights that will inform business strategies and decision-making processes. Additionally, you will be developing and maintaining reports, dashboards, and other tools to monitor and track supply chain performance. You will also play a key role in assisting in the development and implementation of supply chain strategies that align with business objectives, as well as identifying and suggesting solutions to potential supply chain risks and challenges. Furthermore, you will be expected to build data models, perform data mining to discover new opportunities and areas of improvement, and conduct data quality checks to ensure accuracy, completeness, and consistency of data sets. Your background should ideally include 5+ years of hands-on experience in Data Engineering within the supply chain domain. Proficiency in Azure Data Engineering technologies is a must, including ETL processes, Azure Data Warehouse (DW), Azure Databricks, and MS SQL. You should also possess strong expertise in developing and maintaining scalable data pipelines, data models, and integrations to support analytics and decision-making. Experience in optimizing data workflows for performance, scalability, and reliability will be highly valued. In terms of competencies, you should embody TE Connectivity's values of Integrity, Accountability, Inclusion, Innovation, and Teamwork. ABOUT TE CONNECTIVITY: TE Connectivity plc (NYSE: TEL) is a global industrial technology leader that focuses on creating a safer, sustainable, productive, and connected future. Their broad range of connectivity and sensor solutions enable the distribution of power, signal, and data to advance next-generation transportation, energy networks, automated factories, data centers, medical technology, and more. 
With a workforce of more than 85,000 employees, including 9,000 engineers, collaborating with customers in approximately 130 countries, TE ensures that EVERY CONNECTION COUNTS.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
You will be working for a leading housing finance institution, Mahindra Home Finance, dedicated to empowering semi-urban and rural communities with tailored credit solutions. The company focuses on inclusive growth, leveraging technology-driven processes to simplify lending with transparency and responsible financial support. Trusted for its customer-centric approach, it plays a vital role in making home ownership accessible to underserved markets. As a Database Developer, your key responsibilities will include designing, developing, and optimizing databases and related ETL processes. You will also be responsible for maintaining stored procedures, views, triggers, indexes, and constraints to support business needs. Performance tuning of SQL queries and SSIS packages for optimal efficiency is an essential part of your role. You will work closely with Loan Management Systems to support business operations and ensure data integrity. Collaboration with cross-functional teams to understand requirements and implement database solutions is also a key aspect of your job. Regular reviews and improvements on existing database processes and SQL code will be part of your responsibilities. To excel in this role, you must have technical expertise in Oracle 19c/21c, PL/SQL, SQL Server, and related RDBMS concepts. Your experience should include database design and development, ETL processes, and stored procedure creation. Proficiency in performance tuning for SQL queries and SSIS packages to optimize database performance is crucial. A strong understanding of RDBMS principles, including views, triggers, stored procedures, indexes, and constraints, is necessary. Prior experience in the BFSI domain with exposure to Loan Management Systems is preferred. Team lead experience of at least 2 years with a team size of 5+ is a requirement, and experience with LMS Module/Application is preferred. 
The ideal candidate for this role will have a Bachelor's degree in Computer Science, Information Technology, or a related field. The position is based in Kurla West, Mumbai, and requires 8-12 years of relevant experience.
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Project Role: Product Owner
Project Role Description: Drives the vision for the product by being the voice of the customer, following a human-centered design approach. Shapes and manages the product roadmap and product backlog and ensures the product team consistently delivers on the client's needs and wants. Validates and tests ideas through recurrent feedback loops to ensure knowledge discovery informs timely direction changes.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Product Owner, you will drive the vision for the product by being the voice of the customer, following a human-centered design approach. Your typical day will involve shaping and managing the product roadmap and backlog, ensuring that the product team consistently delivers on client needs and wants. You will validate and test ideas through recurrent feedback loops, ensuring that knowledge discovery informs timely direction changes. This role requires a proactive approach to understanding customer requirements and translating them into actionable tasks for the team, fostering collaboration and innovation throughout the product development process.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate regular meetings to ensure alignment and progress towards product goals.
- Analyze market trends and customer feedback to inform product strategy.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with agile methodologies and product management tools.
- Ability to translate complex technical concepts into clear business requirements.
- Familiarity with data governance and compliance standards.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data architecture, ensuring that the solutions you develop are scalable and efficient.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data models that meet business needs.
- Monitor and optimize data pipelines for performance and reliability.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and best practices.
- Familiarity with data quality frameworks and data governance principles.
- Knowledge of database management systems and SQL.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
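Data modeling roles like the one above usually mean dimensional design: a fact table of measures joined to dimension tables (a star schema). A tiny hypothetical star schema, runnable with the standard library's `sqlite3`:

```python
import sqlite3

# Hypothetical star schema: a sales fact table joined to a product dimension,
# then aggregated by a dimension attribute -- the core reporting query shape.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
    INSERT INTO fact_sales VALUES (1, 10), (1, 5), (2, 3);
""")
result = con.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(result)  # [('gadgets', 3), ('widgets', 15)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets BI tools slice the same facts by any attribute without reshaping the data.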
Posted 2 weeks ago
4.0 - 10.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Basic Qualifications
- Bachelor's in Computer Science, Statistics, Math, Engineering, or a related field.
- 4-10 years of experience in Business Intelligence/Digital Transformation, supporting a Financial Operations function.
- Ability to communicate clearly with end users, development managers, and other stakeholders.
- Proficient in Business Process Model and Notation (BPMN).
- Proficient in data visualization with tools such as Tableau.
- Proficient in database query languages such as SQL.
- Experience in gathering and documenting requirements with full testing traceability.
- Experience using project tracking tools such as JIRA or Confluence.
- Experience in predictive analytics using tools such as Alteryx or Python.
- Ability to create and maintain ETL processes using a vast array of tools.
- Analytical, self-motivated, and detail-oriented, with strong problem-solving skills.
- Ability to work within a high-risk environment and meet challenging deadlines and targets.
Posted 2 weeks ago
4.0 - 9.0 years
0 - 0 Lacs
hyderabad
Work from Office
- Minimum of 5-7 years of experience in data migration, data engineering, or ETL development, with a proven track record of successful migrations and leadership skills.
- Strong experience in using data tools and platforms such as SQL, Python, R, SSIS, Azure, or similar.
- Experience in working with large datasets and various cloud platforms.
- Previous experience in a leadership role or mentoring junior engineers is highly preferred.
- Advanced proficiency in SQL for data querying and management.
- Proven ability to troubleshoot migration issues, optimize performance, and work with large datasets.
- Experience with code versioning (Git, Bitbucket, etc.).
- Familiarity with data warehousing and ETL processes is a plus.
- Excellent problem-solving skills with the ability to approach complex business challenges analytically.
- Strong communication and presentation skills, with the ability to translate technical information to non-technical stakeholders.
- Detail-oriented with a high degree of accuracy in analysis and execution.
- Ability to work independently and as part of a team in a fast-paced environment.
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Job Description Job Title: Database Developer with Database Administration Exposure We are seeking a skilled SQL Developer who also brings working knowledge of Database Administration . This hybrid role involves developing and optimizing SQL-based solutions while contributing to the stability, performance, and security of enterprise database systems. The ideal candidate will be comfortable writing complex queries and supporting essential database operations in both on-premises and cloud environments. About the Team: The CRM India team provides a fast-paced, dynamic work environment where engineers work closely to provide software services to patients with Cardiac conditions and to physicians treating such patients. You will collaborate with other engineers in Architecture, Development, Product, Infrastructure and DevOps teams to support ongoing and prospective product features. We're passionate about building products that improve the quality of life for patients. We like to learn along the way and depend on everyone's input to help us grow as a team. Employees act on their strong desire to make a difference, partner with others and put ideas into action. Employees are engaged by a work culture that is team oriented, fast-paced and progressive. 
Key Responsibilities:
- Develop, optimize, and maintain complex SQL queries, stored procedures, views, and functions
- Design and maintain secure, scalable, and efficient data structures
- Analyze and validate logical and physical database designs
- Support ETL processes and data integration across systems
- Monitor and tune database performance to ensure responsiveness and reliability
- Assist in implementing and managing backup and recovery strategies
- Participate in database deployments, upgrades, and configuration changes
- Troubleshoot and resolve database-related issues, including performance bottlenecks and outages
- Enforce data access controls and security policies
- Collaborate with development and infrastructure teams to ensure efficient data access and system stability
- Contribute to the development of database standards, documentation, and best practices
- Work with Oracle, PostgreSQL, and SQL Server environments
Preferred Qualifications:
- Bachelor's degree in Computer Engineering, Computer Science, or a related field
- 5-7 years of experience in SQL development, with exposure to database operations
- Proficiency in Oracle, PostgreSQL, and SQL Server database services
- Strong command of SQL (DDL, DML, DCL), indexing, query optimization, and performance tuning
- Experience with database monitoring, diagnostics, and capacity planning
- Knowledge of database design, maintenance, and storage in Azure a plus
- Understanding of the software development life cycle (SDLC) and DevOps practices
- Strong analytical and communication skills
- Ability to manage multiple priorities in a cross-functional team environment
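Indexing and query optimization, named in the qualifications above, are easy to see concretely: the query plan for the same lookup changes once an index exists. A runnable sketch with the standard library's SQLite (table and column names are hypothetical; Oracle, PostgreSQL, and SQL Server expose the same idea through their own EXPLAIN facilities):

```python
import sqlite3

# Compare SQLite's query plan for the same lookup before and after adding
# an index: a full table scan becomes an index search. Schema is invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, email TEXT)")

def plan(sql):
    # Column 3 of EXPLAIN QUERY PLAN output holds the human-readable detail.
    return " ".join(r[3] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT id FROM customers WHERE email = 'a@example.com'"
before = plan(query)   # reports a SCAN of the whole table
con.execute("CREATE INDEX idx_email ON customers(email)")
after = plan(query)    # reports a SEARCH using idx_email
print(before)
print(after)
```

The exact wording varies across SQLite versions, but the scan-to-search transition is the signal a DBA looks for when tuning a slow query.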
Posted 2 weeks ago
7.0 - 12.0 years
12 - 16 Lacs
gurugram
Work from Office
Job Title: Senior Manager Data Analytics (Tableau & Database Management)
Location: Gurgaon, India
About the Role
We are seeking a highly skilled Senior Manager Data Analytics with strong expertise in Tableau, database management, data optimization, and dashboarding. The ideal candidate will have 5+ years of hands-on experience, excellent problem-solving skills, and a strong mathematical background. This role requires both technical proficiency and team management experience, as the person will lead a team of analysts to drive data insights and business impact.
Key Responsibilities
- Lead the end-to-end development of interactive Tableau dashboards and data visualization solutions.
- Manage and optimize databases, data flows, and ETL processes to ensure data accuracy, performance, and scalability.
- Partner with business stakeholders to gather requirements, design KPIs, and deliver actionable insights.
- Ensure data quality, governance, and optimization across all reporting layers.
- Provide leadership and mentorship to a team of analysts, fostering a culture of continuous learning and collaboration.
- Solve complex business problems using data-driven approaches and advanced mathematical/statistical techniques.
- Drive process improvements by identifying gaps, optimizing workflows, and implementing automation where possible.
Required Skills & Experience
- 5+ years of experience in data analytics with strong expertise in Tableau (dashboarding, visualization, calculations, advanced charts), with a minimum of 7 years of overall experience.
- Proven experience in managing large databases, SQL queries, ETL pipelines, and data optimization.
- Strong background in mathematics, statistics, and analytical problem solving.
- Prior team management experience with the ability to lead, coach, and motivate analysts.
- Hands-on experience with data modeling, performance tuning, and automation.
- Excellent communication skills with the ability to translate data insights into business impact.
Good to Have
- Experience with Python/R for analytics and automation.
- Contact centre reporting experience.
- Knowledge of cloud platforms (AWS, GCP, or Azure) for data management.
- Exposure to BI tools apart from Tableau (Power BI, QlikView, etc.).
Education
Bachelor's or Master's degree in Mathematics, Statistics, Computer Science, Engineering, or a related field.
Contact Person
HR Supriya - 9289327281
Posted 2 weeks ago
7.0 - 10.0 years
6 - 9 Lacs
surat, gujarat, india
On-site
Job Summary
As a Power BI Developer, you will leverage your expertise in data visualization, DAX/SQL, and UI/UX design principles using Figma to build intuitive and insightful dashboards. You will work closely with business stakeholders to gather requirements, prototype user interfaces, design scalable data models, and deliver actionable reporting solutions. Your role will be critical in transforming complex data into user-friendly, performance-optimized, and secure visualizations within the Power BI ecosystem.
Key Responsibilities
- Dashboard Design and Development: Design, develop, and deploy interactive dashboards and visual reports using Power BI Desktop and Power BI Service.
- UI/UX Prototyping with Figma: Collaborate with users to translate reporting needs into wireframes, mockups, and prototypes using Figma. Convert Figma designs into production-ready Power BI dashboards that adhere to modern design and UX standards.
- DAX Development: Develop and optimize complex DAX calculations, including measures, calculated columns, and time intelligence functions.
- SQL Querying and Optimization: Write and optimize complex SQL queries using joins, window functions, CTEs, and stored procedures across various databases.
- Data Modeling: Design efficient and scalable data models using dimensional modeling concepts (star and snowflake schemas) and best practices.
- Security Implementation: Implement row-level security (RLS) and other access controls to ensure data protection within Power BI solutions.
- Performance Tuning: Optimize Power BI reports and data models for speed, responsiveness, and usability.
- Data Source Integration: Connect Power BI to diverse data sources including SQL Server, Azure Synapse, APIs, and cloud databases.
- Stakeholder Communication: Present reports and insights to business users clearly and effectively, bridging technical and business understanding.
- Requirements Gathering: Lead and participate in requirement gathering sessions through interviews, workshops, and documentation analysis.
- Agile Collaboration: Contribute to Agile/Scrum teams by participating in sprint planning, daily standups, retrospectives, and timely task delivery.
- Documentation: Maintain comprehensive technical documentation for dashboards, data models, and processes.
- Continuous Improvement: Stay updated with new Power BI and Figma features, and implement improvements to enhance dashboard functionality and aesthetics.
Required Skills
- Minimum of 7 years of experience in Business Intelligence and Power BI development.
- Strong hands-on experience with Power BI Desktop and Service.
- Proficiency in DAX with advanced calculation and time intelligence functions.
- Proficient in Figma for creating wireframes, mockups, and design-to-dashboard translation.
- Advanced SQL skills with deep experience in joins, CTEs, window functions, and stored procedures.
- Solid understanding of data warehousing principles and dimensional modeling (star/snowflake schemas).
- Experience integrating Power BI with cloud-based platforms (e.g., Azure Synapse, SQL Server on Azure) is preferred.
- Skilled in business requirements elicitation and solution design.
- Excellent written and verbal communication skills.
- Strong problem-solving capabilities and analytical thinking.
- Proven experience working in Agile/Scrum environments.
- Ability to work collaboratively in cross-functional teams.
Education
Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
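Row-level security, mentioned in the responsibilities above, boils down to filtering every read by the caller's identity. In Power BI this is expressed as a DAX filter attached to a role; the underlying idea can be sketched in plain Python (the role table and sales data below are invented for illustration):

```python
# Sketch of row-level security: each read is filtered by the caller's
# allowed regions. In Power BI this would be a DAX role filter; the role
# mapping and sales rows here are hypothetical.
ROLE_REGIONS = {"emea_analyst": {"EMEA"}, "global_admin": {"EMEA", "APAC"}}

SALES = [
    {"region": "EMEA", "amount": 100},
    {"region": "APAC", "amount": 250},
]

def visible_rows(user_role):
    allowed = ROLE_REGIONS.get(user_role, set())
    return [row for row in SALES if row["region"] in allowed]

print(len(visible_rows("emea_analyst")))  # 1
print(len(visible_rows("global_admin")))  # 2
```

The key design point is that the filter is applied centrally at the data layer, so no report or user query can bypass it.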
Posted 2 weeks ago