
8586 Data Modeling Jobs - Page 26

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Punjab

On-site

About the Opportunity:
Join a dynamic leader in the education sector that leverages advanced data analytics to drive institutional excellence. Operating in a technology-driven environment, this high-growth organization empowers its teams with a comprehensive data strategy. As a critical part of the on-site team based in India, you will help transform data into actionable insights that support strategic decision-making.

Role & Responsibilities:
- Develop, optimize, and maintain complex SQL queries to support data extraction and reporting.
- Analyze large datasets to identify trends, correlations, and actionable insights that drive business decisions.
- Design and implement data models and warehousing solutions to improve reporting accuracy and performance.
- Collaborate with cross-functional teams to understand data requirements and translate business needs into technical solutions.
- Create and maintain dashboards and visual reports to present insights to stakeholders.
- Ensure data integrity and implement best practices for data cleaning and transformation.

Skills & Qualifications:
- Must-have: Proficiency in SQL, with proven experience writing efficient queries and managing large datasets.
- Must-have: 1-3 years of hands-on experience in data analysis and developing data models in a high-paced environment.
- Must-have: Strong analytical skills.

Benefits & Culture Highlights:
- Work in a collaborative and innovative on-site environment with opportunities for professional growth.
- Be part of a mission-driven team that values data-driven insights and continuous learning.
- Health insurance.
- Provident fund.

If you are a detail-oriented SQL Data Analyst ready to leverage your analytical expertise in a vibrant, on-site setting in India, we encourage you to apply and join our transformative team.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact at ZS.

At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems, the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What You'll Do:
- Collaborate with client-facing teams to understand solution context and contribute to technical requirement gathering and analysis.
- Design and implement technical features leveraging best practices for the technology stack being used.
- Work with technical architects on the team to validate the design and implementation approach.
- Write production-ready code that is easily testable, understood by other developers, and accounts for edge cases and errors.
- Ensure the highest quality of deliverables by following architecture/design guidelines and coding best practices, and by participating in periodic design/code reviews.
- Write unit tests as well as higher-level tests that handle expected edge cases and errors gracefully, as well as happy paths.
- Use bug tracking, code review, version control, and other tools to organize and deliver work.
- Participate in scrum calls and agile ceremonies, and effectively communicate work progress, issues, and dependencies.
- Consistently contribute to researching and evaluating the latest technologies through rapid learning, conducting proof-of-concepts, and creating prototype solutions.

What You'll Bring:
- 2+ years of relevant hands-on experience; a CS foundation is a must.
- Strong command of distributed computing frameworks such as Spark (preferred) or others.
- Strong analytical and problem-solving skills.
- Ability to quickly learn and become hands-on with new technology, and to be innovative in creating solutions.
- Strength in at least one programming language (Python, Java, Scala, etc.) and in programming basics such as data structures.
- Hands-on experience building modules for data management solutions such as data pipelines, orchestration, and ingestion patterns (batch, real-time).
- Experience designing and implementing solutions on distributed computing and cloud services platforms such as (but not limited to) AWS, Azure, and GCP.
- Good understanding of RDBMS; some ETL experience is preferred.

Additional Skills:
- Understanding of DevOps, CI/CD, and data security, and experience designing on a cloud platform.
- AWS Solutions Architect certification with an understanding of the broader AWS stack.
- Knowledge of data modeling and data warehouse concepts.
- Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel:
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering Applying:
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com.

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

Maharashtra

On-site

As a global leader in assurance, tax, transaction, and advisory services, EY hires and develops passionate individuals to contribute towards building a better working world. The culture at EY focuses on providing training, opportunities, and creative freedom to help individuals reach their full potential. EY believes in nurturing your career growth and offers limitless potential, with motivating and fulfilling experiences throughout your career journey to support you in becoming your best professional self.

The opportunity available is for the role of Associate-National-Business Consulting Risk-CBS - RM - Proj & Operations, based in Mumbai. EY Consulting aims to transform businesses through the power of people, technology, and innovation with a client-centric approach. The Consulting division comprises Business Consulting, Technology Consulting, and People Advisory Services, all working towards creating long-term value for clients by addressing their strategic challenges. The key focus areas within the Consulting division include Enterprise Risk, Technology Risk, and Financial Services Risk; these areas help clients identify and manage risks effectively to support their business strategies and objectives. The role involves working in a risk management CoE with an emphasis on technical excellence.

To qualify for this role, candidates must possess a qualification in Risk Management. Prior experience is not mandatory, and freshers are encouraged to apply. EY is looking for individuals who can collaborate effectively across client departments, adhere to commercial and legal requirements, and offer practical solutions to complex problems. The ideal candidates should be agile, curious, mindful, and able to maintain positive energy while being adaptable and creative in their approach.

EY, with a large global presence and a strong brand reputation, offers a personalized Career Journey to its employees. The organization is dedicated to investing in the skills and learning of its workforce. EY promotes inclusivity and strives to create a balance that allows employees to excel in client service while focusing on their career development and well-being. If you believe you meet the criteria and possess the necessary skills, EY encourages you to apply and be a part of building a better working world. Apply now to join a dynamic team and contribute to EY's mission of driving positive change in the business landscape.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Salesforce Data Cloud Analyst will be instrumental in leveraging Salesforce Data Cloud to revolutionize how customer data is used within the organization. This role is part of the Data Cloud Business Enablement Team and is dedicated to building, overseeing, and enhancing the data unification strategy that drives business intelligence, marketing automation, and customer experience initiatives.

Responsibilities:
- Manage data models within Salesforce Data Cloud to ensure seamless data harmonization across diverse sources.
- Maintain data streams from different platforms into Data Cloud, such as CRM, SFMC, MCP, Snowflake, and third-party applications.
- Develop and refine SQL queries to convert raw data into actionable insights.
- Create and manage data tables, calculated insights, and segments for organizational use.
- Collaborate with marketing teams to translate business requirements into effective data solutions.
- Monitor data quality to uphold accuracy and reliability.
- Provide training and assistance to business users on leveraging Data Cloud capabilities.
- Create documentation for data models, processes, and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience working with Salesforce platforms.
- Salesforce Data Cloud certification and prior exposure to Customer Data Platforms (CDPs) are preferred.
- Proficiency in Tableau CRM or other visualization tools, a background in marketing technology or customer experience initiatives, and Salesforce Administrator or Developer certification are essential.

Desired qualifications:
- Advanced knowledge of Salesforce Data Cloud architecture and strong SQL skills.
- Experience with ETL processes and data integration patterns, and an understanding of data modeling principles and best practices.
- Familiarity with Salesforce Marketing Cloud, MCI & MCP, APIs, and data integration techniques.
- Knowledge of data privacy regulations and compliance requirements (GDPR, CCPA, etc.).
- Demonstrated experience with data analysis and business intelligence tools, problem-solving abilities, and excellent communication skills.
- Adaptability to new-generation technologies and trends, such as Gen AI and agentic AI, is considered an added advantage.

Novartis is dedicated to fostering an inclusive work environment and diverse teams that reflect the patients and communities served. The company is committed to collaborating with individuals with disabilities and providing reasonable accommodations during the recruitment process. If you require an accommodation due to a medical condition or disability, please contact diversityandincl.india@novartis.com with your request details and contact information, and remember to include the job requisition number in your message.

Novartis offers a rewarding work environment where you can make a difference in the lives of people with diseases and their families. By joining Novartis, you will be part of a community of dedicated individuals working together to achieve breakthroughs that positively impact patients' lives. If you are ready to contribute to creating a brighter future, visit https://www.novartis.com/about/strategy/people-and-culture to learn more about opportunities at Novartis. If this role is not suitable for you, you can join the Novartis talent community to stay informed about relevant career opportunities as they arise by signing up at https://talentnetwork.novartis.com/network. You can also explore Novartis' handbook to discover the various ways the company supports personal and professional growth at https://www.novartis.com/careers/benefits-rewards.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Gandhinagar, Gujarat

On-site

As a Data Analyst specializing in Tableau and Snowflake, you will be responsible for creating and maintaining interactive Tableau dashboards and reports for key business stakeholders. Your role will involve developing, optimizing, and managing Snowflake data warehouse solutions to support analytics and reporting needs. Collaboration with data analysts, business users, and development teams will be essential to gather requirements and deliver effective data solutions.

Your expertise in writing and maintaining complex SQL queries for data extraction, transformation, and analysis will be crucial in ensuring data accuracy, quality, and performance across all reporting and visualization platforms. Applying data governance and security best practices to safeguard sensitive information will also be part of your responsibilities. Active participation in Agile development processes, including sprint planning, daily stand-ups, and reviews, will be expected so that you contribute effectively to the team's goals.

Requirements:
- 2-4 years of hands-on experience with the Snowflake cloud data warehouse and with Tableau for dashboard creation and report publishing.
- Strong proficiency in SQL for data querying, transformation, and analysis.
- A solid understanding of data modeling, warehousing concepts, and performance tuning.
- Knowledge of data governance, security, and compliance standards.
- A bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience with cloud platforms such as AWS, Azure, or GCP, a basic understanding of Python or other scripting languages, and familiarity with Agile/Scrum methodologies and development practices will be advantageous.

This Data Analyst (Tableau and Snowflake) position is based in Gandhinagar, with a flexible schedule and shift timings of either 3:30 PM - 12:30 AM or 4:30 PM - 1:30 AM, as per business needs.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a Lead Data Engineer with over 8 years of experience in data engineering and software development. The ideal candidate should possess strong expertise in Python, PySpark, Airflow (batch jobs), HPCC, and ECL, and will be responsible for driving complex data solutions across multi-functional teams. The role requires hands-on experience in data modeling and test-driven development, along with familiarity with Agile/Waterfall methodologies.

As a Lead Data Engineer, you will lead initiatives, collaborate with various teams, and convert business requirements into scalable data solutions using industry best practices in managed services or staff augmentation environments. If you meet the above qualifications and are passionate about working with data to solve complex problems, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 - 0 Lacs

Chennai, Tamil Nadu

On-site

As a Salesforce Data Cloud Developer, you will be responsible for creating scalable and impactful data solutions on the Salesforce platform. Your role will involve working closely with various teams to convert data into actionable insights that contribute to business growth.

Key responsibilities:
- Design, develop, and deploy solutions using Salesforce Data Cloud.
- Configure user settings and data security.
- Build Data Streams, Data Transforms, and Automation Workflows.
- Create and manage Segments, Activations, and Calculated Insights.
- Translate business requirements into technical solutions.
- Stay updated with Salesforce releases and best practices.

To excel in this role, you should have hands-on experience with Salesforce Data Cloud and the core Salesforce platform; a strong understanding of data modeling, integration patterns, and declarative tools; and expertise in Data Streams, Data Lake, Data Models, and advanced data analysis. Your communication skills should be excellent, enabling you to engage effectively with technical and business stakeholders.

If you are passionate about leveraging real-time data, enhancing customer experiences, and implementing intelligent automation, this opportunity is tailored for you. Join us in harnessing the power of data to drive business success.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Join our Offshore Service Engineering team at Siemens Gamesa and utilize your analytical skills to enhance the performance of wind turbines. You will be responsible for analyzing power generation data, supporting turbine upgrades, and investigating underperformance cases at both turbine and wind farm levels. Working closely with engineers, operations, and customer teams globally, you will provide technical insights and explore market opportunities for aftermarket solutions. Your proficiency in Python and MATLAB will be crucial in creating dynamic Power BI dashboards, automating analytics, and shaping future upgrade strategies. Your inquisitiveness, entrepreneurial spirit, and dedication to continuous improvement will be key to making a significant impact in the global renewable energy sector.

Your responsibilities will include:
- Leading analytics to optimize energy production from turbines and offshore wind farms.
- Conducting performance testing on upgrade solutions with Python and MATLAB.
- Suggesting new test methods for aftermarket products.
- Analyzing operational data to troubleshoot performance issues.
- Developing automation tools and simulation models.
- Collaborating with various teams to drive improvements and implement new tools and procedures.

We are looking for candidates with:
- A bachelor's degree in Engineering, preferably in Mechanical or Aerodynamics.
- 3-5 years of technical experience in data analysis using Python, MATLAB, and Minitab.
- Proficiency in automatic control systems, electromechanical systems, and industrial instrumentation/testing.
- Knowledge of databases, SCADA systems, controller systems, and industrial communications.
- Strong skills in Power BI dashboards and data modeling, along with experience in technical support roles within the wind or electromechanical industries (highly desired).
- A collaborative, proactive, and persistent approach, which is essential for success in a multicultural and matrixed environment.

As part of the Engineering Performance team, you will collaborate with various departments to deliver actionable analytics and drive innovation to enhance the performance of offshore wind assets. The team values open knowledge sharing, multicultural collaboration, and a shared commitment to enabling the global energy transition.

Siemens Gamesa, a part of Siemens Energy, is dedicated to sustainable, reliable, and affordable energy solutions. With a focus on driving the energy transition and providing innovative wind turbine solutions, Siemens Gamesa is committed to making a difference in the energy sector. At Siemens Gamesa, diversity is celebrated and inclusion is valued; the company promotes a culture of inclusion and creativity, where individuals from diverse backgrounds come together to generate power and energize society. Employees at Siemens Gamesa enjoy benefits such as medical insurance coverage, a family floater cover, meal card options, and tax-saving measures as per company policies.

Join our team at Siemens Gamesa and be a part of the energy transformation. More information on careers at Siemens Gamesa can be found at https://www.siemens-energy.com/global/en/home/careers/working-with-us.html.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Business Analyst specializing in Product & Pricing Master Data, you will play a crucial role in the large-scale Finance Transformation at a FTSE20 company resulting from the acquisition of Refinitiv by LSEG in 2021. The finance organization has become complex by operating on two Enterprise Resource Planning (ERP) systems, Oracle EBS (LSEG) and SAP ECC6 (Refinitiv), leading to dual business processes, control environments, and data structures. The organization has chosen Oracle Fusion as the single ERPM platform and is currently working on Global Design with PwC, the implementation partner.

Your responsibilities will include:
- Eliciting and analyzing business requirements from product owners and functions, identifying data needs across various product functions to enhance reusability and flexibility.
- Leading the conceptual design of comprehensive data models, integration methodology, and taxonomies to support product & pricing in Oracle Product Hub.
- Collaborating with multi-functional teams, including finance, technology, operations, regulatory, sales, marketing, and product groups, to ensure successful design and implementation of the Product Data model in the new platform.
- Documenting data flows, entity relationships, and data dictionaries.
- Identifying data integrity issues and proposing solutions.
- Creating and maintaining data models using industry-standard tools and methodologies.
- Organizing design workshops, and establishing testing scope and test scripts for Product Hub.
- Validating data models against business requirements and use cases.
- Supporting implementation teams with guidance on data architecture.
- Ensuring data models comply with privacy and security requirements, and ensuring scalability and reusability of the Product data model designs.

To be successful in this role, you should have:
- 5+ years of demonstrable experience in product modelling in Oracle Product Hub.
- 10+ years of experience in successful implementation of Product and Pricing Master Data in large and complex programs.
- Proven expertise in simplifying and standardizing complex product structures.
- Experience in integrating product & pricing data with Q2C processes.
- Good stakeholder management skills, an open-minded approach to driving solutions, and a collaborative and positive attitude.

Joining LSEG means being part of a diverse and inclusive organization of over 25,000 people across 70 countries. The company values individual perspectives and believes that a diverse workforce is a strength that fosters collaboration, creativity, and new ideas. LSEG offers a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.

Please note that this job description is subject to the privacy notice of the London Stock Exchange Group (LSEG), which outlines the use of personal information and your data protection rights. If you are a Recruitment Agency Partner, it is your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a Sr. Data Engineer with over 7 years of experience, specializing in data engineering, Python, and SQL. You will be part of the Data Engineering team in the Enterprise Data Insights organization, responsible for building data solutions, designing ETL/ELT processes, and managing the data platform to support various stakeholders across the organization. Your role is crucial in driving technology- and data-led solutions to foster growth and innovation at scale.

Your responsibilities as a Senior Data Engineer include collaborating with cross-functional stakeholders to prioritize requests, identify areas for improvement, and provide recommendations. You will lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes. Furthermore, you will foster collaboration with corporate engineering, product teams, and other engineering groups, while also leading and mentoring engineering discussions and advocating for best practices.

To excel in this role, you should have:
- A degree in Computer Science or a related technical field, and a proven track record of over 5 years in Data Engineering.
- Expertise in designing and constructing ETL/ELT processes, managing data solutions in an SLA-driven environment, and developing data products and APIs.
- Proficiency in SQL/NoSQL databases, particularly Snowflake, Redshift, or MongoDB, along with strong programming skills in Python.
- Experience with columnar OLAP databases and data modeling, and with tools such as dbt, Airflow, Fivetran, GitHub, and Tableau reporting.
- Good communication and interpersonal skills for collaborating effectively with business stakeholders and translating requirements into actionable insights.

An added advantage would be a good understanding of Salesforce and NetSuite systems, experience in SaaS environments, experience designing and deploying ML models, and familiarity with events and streaming data. Join us in driving data-driven solutions and experiences to shape the future of technology and innovation.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are an exceptional, innovative, and passionate individual looking to grow with NTT DATA. If you want to be part of an inclusive, adaptable, and forward-thinking organization, this opportunity is for you.

As a Salesforce Data Cloud & Agentforce Solution Architect, you will be responsible for designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform. Your role will involve utilizing Agentforce capabilities to create automated customer interactions across various channels, leveraging strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents.

Your core responsibilities will include architecting and building data integration solutions using Salesforce Data Cloud, unifying customer data from diverse sources. You will implement data cleansing, matching, and enrichment processes to enhance data quality; design and manage data pipelines for efficient data ingestion, transformation, and loading; and collaborate with cross-functional teams to translate business requirements into effective data solutions. Monitoring data quality, identifying discrepancies, and enforcing data governance policies will also be key aspects of your role.

Minimum Skills Required:
- Expertise in Salesforce Data Cloud features such as data matching, cleansing, enrichment, and data quality rules
- Understanding of data modeling concepts and the ability to design data models within Salesforce Data Cloud
- Proficiency in utilizing Salesforce Data Cloud APIs and tools for data integration from various sources
- Knowledge of data warehousing concepts and data pipeline development

Relevant Experience:
- Implementing Salesforce Data Cloud for Customer 360 initiatives
- Designing and developing data integration solutions
- Managing data quality issues and collaborating with business stakeholders
- Building and customizing Agentforce conversational flows
- Training and refining natural language processing models
- Monitoring Agentforce performance and analyzing customer interaction data
- Seamlessly integrating Agentforce with other Salesforce components
- Thoroughly testing Agentforce interactions before deployment

Skills to Highlight:
- Expertise in Salesforce administration, development, and architecture
- Deep knowledge of Agentforce features and configuration options
- Familiarity with NLP concepts
- Proven ability in conversational design and data analysis
- Experience in designing, developing, and deploying solutions on the Salesforce Data Cloud platform
- Collaboration with stakeholders and building custom applications and integrations
- Development and optimization of data models
- Implementation of data governance and security best practices
- Troubleshooting, debugging, and performance tuning

Join NTT DATA, a trusted global innovator in business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team and a strong partner ecosystem, NTT DATA offers a range of services including business and technology consulting, data and artificial intelligence solutions, and application and infrastructure management. Be part of a leading provider of digital and AI infrastructure, transforming organizations and society for a digital future. Visit us at us.nttdata.com.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You will be joining IntelliDash as a Data Engineering Architect in Coimbatore on a full-time, on-site basis. Your primary responsibility will be to design and manage data architectures, develop data models, and ensure data governance. You will oversee Extract, Transform, Load (ETL) processes, maintain data warehouses, and collaborate with analytics and development teams to uphold data integrity and efficiency.

To excel in this role, you should have a strong background in data governance and data architecture, along with proficiency in data modeling and ETL processes. Expertise in data warehousing is essential, coupled with excellent analytical and problem-solving skills. Your communication and collaboration abilities will be crucial for working both independently and as part of a team. Prior experience in the manufacturing analytics industry would be advantageous. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is required.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

Armada is an edge computing startup dedicated to providing computing infrastructure to remote areas with limited connectivity and cloud infrastructure, as well as locations requiring local data processing for real-time analytics and AI at the edge. We are on a mission to bridge the digital divide by deploying advanced technology infrastructure rapidly. To further this mission, we are seeking talented individuals to join our team. As a BI Engineer at Armada, you will play a crucial role in designing, building, and maintaining robust data pipelines and visualization tools. Your focus will be on empowering data-driven decision-making throughout the organization by collaborating closely with stakeholders to translate business requirements into actionable insights through the development and optimization of BI solutions.

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines to facilitate data integration from multiple sources.
- Construct and optimize data models and data warehouses to support business reporting and analysis.
- Create dashboards, reports, and data visualizations using BI tools such as Power BI, Tableau, or Looker.
- Collaborate with data analysts, data scientists, and business stakeholders to understand reporting needs and deliver effective solutions.
- Ensure data accuracy, consistency, and integrity within reporting systems.
- Perform data validation, cleansing, and transformation as needed.
- Identify opportunities for process automation and enhance reporting efficiency.
- Monitor BI tools and infrastructure performance, troubleshooting issues when necessary.
- Stay updated on emerging BI technologies and best practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 2-4 years of experience as a BI Engineer, Data Engineer, or similar role.
- Proficiency in SQL with experience in data modeling and data warehousing (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with BI and data visualization tools (e.g., Power BI, Tableau, Qlik, Looker).
- Strong understanding of ETL processes and data pipeline design.
- Excellent problem-solving skills and attention to detail.

Preferred Skills:
- Experience with Python, R, or other scripting languages for data manipulation.
- Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud Platform).
- Understanding of version control (e.g., Git) and CI/CD practices.
- Familiarity with APIs, data governance, and data cataloging tools.

At Armada, we offer a competitive base salary and equity options, providing you with the opportunity to share in our success and growth. If you are intellectually curious, possess strong business acumen, and thrive in a fast-paced, collaborative environment, we encourage you to apply. Join us in making a difference and contributing to the success of Armada.
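The validation and cleansing step mentioned above can be sketched in a few lines of plain Python. This is an illustrative sketch only, not Armada's pipeline; the field names ("order_id", "amount") and the rejection rules are assumptions.

```python
# Minimal sketch of a validate-and-clean ETL step (illustrative only;
# field names and rules are assumptions, not any specific employer's spec).

def clean_records(rows, required=("order_id", "amount")):
    """Drop rows missing required fields; normalise the rest."""
    cleaned = []
    for row in rows:
        if any(row.get(field) in (None, "") for field in required):
            continue  # reject incomplete rows rather than guessing values
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),   # trim stray whitespace
            "amount": round(float(row["amount"]), 2),   # normalise to 2 d.p.
        })
    return cleaned

raw = [
    {"order_id": " A-1 ", "amount": "19.999"},
    {"order_id": "", "amount": "5.00"},        # rejected: empty key
    {"order_id": "A-2", "amount": None},       # rejected: missing amount
]
print(clean_records(raw))  # [{'order_id': 'A-1', 'amount': 20.0}]
```

In a production pipeline the rejected rows would typically be routed to a quarantine table for review rather than silently dropped.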

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chandigarh

On-site

We are seeking a Data Scientist with over 3 years of experience in Machine Learning, Deep Learning, and Large Language Models (LLMs) to join our team at SparkBrains Private Limited in Chandigarh. As a Data Scientist, you will leverage your analytical skills and expertise in data modeling to develop and deploy AI-driven solutions that provide value to our business and clients.

Your key responsibilities will include:
- Gathering, cleaning, and preparing data for model training.
- Designing and optimizing machine learning and deep learning models.
- Integrating Large Language Models (LLMs) for NLP tasks.
- Identifying relevant features to improve model accuracy.
- Conducting rigorous model evaluation and optimization.
- Creating data visualizations and insights for stakeholder communication.
- Developing and deploying APIs.
- Collaborating with cross-functional teams and documenting processes effectively.

To qualify for this role, you must hold a Bachelor's or Master's degree in Computer Science, Data Science, AI, Machine Learning, or a related field, along with a minimum of 3 years of experience as a Data Scientist or AI Engineer. You should also possess proficiency in Python and relevant ML/AI libraries, hands-on experience with LLMs, a strong understanding of NLP, neural networks, and deep learning architectures, knowledge of data wrangling and visualization techniques, experience with APIs and cloud platforms, analytical and problem-solving skills, and excellent communication skills for effective collaboration.

By joining our team, you will have the opportunity to work on cutting-edge AI/ML projects, be part of a collaborative work environment that emphasizes continuous learning, gain exposure to diverse industries and domains, and benefit from a competitive salary and growth opportunities. This is a full-time, permanent position with a day shift schedule from Monday to Friday, requiring in-person work at our Chandigarh office.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior Data Analyst based in Hyderabad, IN, with over 7 years of experience, you will play a crucial role in leveraging your expertise in SQL, data architecture, and the Google Cloud Platform (GCP) ecosystem. Your primary responsibility will be to transform complex business queries into actionable insights, shaping strategic decisions and contributing to the future direction of our Product/Operations team. A combination of technical proficiency, analytical rigor, and exceptional communication skills is essential for collaborating effectively with engineering, product, and business stakeholders.

Your key responsibilities will include:
- Performing advanced data analysis, using SQL to query, analyze, and manipulate large datasets.
- Creating and maintaining scalable dashboards and reports.
- Managing source code effectively using platforms like GitHub.
- Partnering with product managers to address critical business questions.
- Collaborating with data engineers on data architecture and pipelines.
- Translating complex data findings into compelling narratives for various audiences.
- Leading analytical projects from inception to delivery.
- Mentoring junior analysts to foster a data-driven, problem-solving culture.

The ideal candidate should hold a Bachelor's degree in a quantitative field such as Computer Science or Statistics, possess a minimum of 5 years of experience in data analysis or business intelligence, demonstrate expert-level proficiency in SQL, have a solid understanding of data architecture and data modeling principles, excel in communication and stakeholder influencing, be familiar with business intelligence tools like Tableau or Looker, have experience with Google data tools like BigQuery, and exhibit a strong sense of curiosity and passion for data insights.

In this role, you will also lead a team of data scientists and analysts, oversee the development of data models and algorithms for new product initiatives, provide strategic direction for data science projects aligned with business objectives, collaborate with cross-functional teams to integrate data science solutions, analyze complex datasets for trends and patterns, utilize generative AI techniques, ensure adherence to ITIL V4 practices, mentor team members, monitor project progress, drive continuous improvement in data science methodologies, and foster a culture of innovation and collaboration within the team. To qualify, you should have a solid background in business analysis and data analysis, expertise in generative AI, an understanding of ITIL V4 practices, excellent communication and collaboration skills, proficiency in team management, and a commitment to working from the office during day shifts.
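The kind of aggregate SQL this role centers on can be sketched with Python's built-in sqlite3 module. This is illustrative only: the table and column names ("events", "region", "revenue") are assumptions, and in the role itself the query would run on BigQuery rather than SQLite.

```python
import sqlite3

# Tiny in-memory table standing in for a product events dataset
# (names are assumptions, not any employer's schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 50.0)],
)

# Aggregate revenue per region, highest first -- the shape of query
# a dashboard or stakeholder report is typically built on.
rows = conn.execute(
    "SELECT region, SUM(revenue) AS total "
    "FROM events GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('north', 200.0), ('south', 50.0)]
```

The GROUP BY / ORDER BY shape carries over unchanged to BigQuery's Standard SQL; only the connection layer differs.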

Posted 1 week ago

Apply

5.0 - 9.0 years

0 - 0 Lacs

karnataka

On-site

Overview of the Company: 66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we assist the world's leading brands in transforming their business challenges into opportunities and shaping the future of work. At 66degrees, the values of embracing challenges and winning together guide us not only in achieving company goals but also in creating a significant impact for our employees. We are dedicated to fostering a culture that sparks innovation and supports professional and personal growth.

Overview of Role: We are looking for an experienced Data Architect to design, develop, and maintain our Google Cloud data architecture. The ideal candidate will possess a strong background in data architecture, data engineering, and cloud technologies, with expertise in managing data across Google Cloud platforms.

Responsibilities:
- GCP Cloud Architecture: Design, implement, and manage robust, scalable, and cost-effective cloud-based data architectures on Google Cloud Platform (GCP), utilizing services like BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud DataProc, Cloud Run, and Cloud Composer. Experience in designing cloud architectures on Oracle Cloud is advantageous.
- Data Modeling: Develop and maintain conceptual, logical, and physical data models to support various business needs.
- Big Data Processing: Design and implement solutions for processing large datasets using technologies such as Spark and Hadoop.
- Data Governance: Establish and enforce data governance policies, including data quality, security, compliance, and metadata management.
- Data Pipelines: Build and optimize data pipelines for efficient data ingestion, transformation, and loading.
- Performance Optimization: Monitor and tune data systems to ensure high performance and availability.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and provide architectural guidance.
- Innovation: Stay updated with the latest technologies and trends in data architecture and cloud computing.

Qualifications:
- GCP Core Services: In-depth knowledge of GCP data services, including BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud DataProc, Cloud Run, and Cloud Composer.
- Data Modeling: Expertise in data modeling techniques and best practices.
- Big Data Technologies: Hands-on experience with Spark and Hadoop.
- Cloud Architecture: Proven ability to design scalable, reliable, and cost-effective cloud architectures.
- Data Governance: Understanding of data quality, security, compliance, and metadata management.
- Programming: Proficiency in SQL, Python, and DBT (Data Build Tool).
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent written and verbal communication skills.
- A Bachelor's degree in Computer Science, Computer Engineering, Data, or a related field, or equivalent work experience.
- GCP Professional Data Engineer or Cloud Architect certification is a plus.
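The conceptual-to-physical data modeling work described above often lands on a star schema: a central fact table joined to dimension tables. A minimal sketch using Python's sqlite3 module follows; the table names and columns are illustrative assumptions, not a 66degrees design, and on GCP the same shape would live in BigQuery.

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to one dimension
# (illustrative names; a real warehouse has many dimensions).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product(product_id),
        qty        INTEGER
    );
    INSERT INTO dim_product VALUES (1, 'shoes'), (2, 'hats');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 5);
""")

# Reporting queries join the fact to its dimensions and aggregate.
result = db.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(result)  # [('hats', 5), ('shoes', 5)]
```

Keeping descriptive attributes in dimensions and measures in the fact table is what lets BI tools aggregate along any attribute without restructuring the data.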

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

noida, uttar pradesh

On-site

You will be responsible for utilizing Apex to execute flow and transaction control statements on Salesforce servers, in coordination with calls to the API. With over 8 years of relevant experience, you will leverage Lightning Components, Visualforce, and JavaScript UI frameworks to develop single-page applications for both desktop and mobile platforms within the Salesforce application. Additionally, you will use various web services, such as the SOAP API, REST API, Bulk API, and Metadata API, to integrate Salesforce with external systems, as well as create APIs that can be consumed by external applications. Your proficiency in SOQL and SOSL, the Salesforce database query languages, will be crucial for conducting field-based and text-based searches on Salesforce data.

This role requires a strong understanding of code optimization, design pattern techniques, data modeling, and backend logic using Apex. Expertise in Lightning Web Components, experience with version control software (Git, SVN, etc.), and familiarity with working in an agile environment are also essential. The ability to deliver against multiple initiatives simultaneously, prioritize tasks efficiently, and demonstrate excellent written and verbal communication, analytical, and troubleshooting skills will be key to success in this role. Please note that the project shift timings for this position are from 5 PM to 2 AM IST.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Experience in developing digital marketing/digital analytics solutions using Adobe products is essential for this role. You should have experience in Adobe Experience Cloud products and recent experience with Adobe Experience Platform or a similar CDP. Good knowledge of the Data Science workspace and building intelligent Services on AEP is required. You should also have a strong understanding of datasets in Adobe Experience Platform, including loading data into the Platform through data source connectors, APIs, and streaming ingestion connectors. Furthermore, experience in creating all required Adobe XDM (Experience Data Model) in JSON based on the approved data model for all loading data files is necessary. Knowledge on utilizing Adobe Experience Platform (AEP) UI & POSTMAN to automate all customer schema data lake & profile design setups within each sandbox environment is also expected. Additionally, you should have experience in configuration within Adobe Experience Platform for all necessary identities & privacy settings and creating new segments within AEP to meet customer use cases. It is important to be able to test/validate the segments with the required destinations. Managing customer data by using Real-Time Customer Data Platform (RTCDP) and analyzing customer data by using Customer Journey Analytics (CJA) are key responsibilities of this role. You are required to have experience with creating connections, data views, and dashboards in CJA. 
Hands-on experience in the configuration and integration of Adobe Marketing Cloud modules like Audience Manager, Analytics, Campaign, and Target is also essential. Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable for this position. Proven ability to communicate verbally and in writing in a high-performance, collaborative environment is expected. Experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s) is also a part of this role. At Capgemini, you can shape your career with a range of career paths and internal opportunities within the Capgemini group. Comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work, are provided. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With over 55 years of heritage, Capgemini is trusted by clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives.

Key responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL (Extract, Transform, Load) processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers as needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office during the evening shift, 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 week ago

Apply

5.0 - 10.0 years

35 - 45 Lacs

Bengaluru

Hybrid

We are seeking an expert in the operating model and AJG data governance, including SOPs for Collibra, Collibra Data Catalog KPIs, manual stitching of assets in Collibra, workflow design, and stakeholder management, with hands-on experience in data governance and Collibra.

Required candidate profile: implementation, configuration, and maintenance of the Collibra Data Governance Platform; experience working with stewards, data owners, and stakeholders; knowledge of data governance, quality, and integration principles; Collibra Ranger certified.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

vadodara, gujarat

On-site

As a Backend Developer, you will be responsible for designing and developing complex backend features using the Laravel framework (version 9+) while adhering to SOLID principles and clean architecture patterns. Your role will involve building and maintaining RESTful APIs for mobile applications, web frontends, and external integrations, ensuring proper versioning and backwards compatibility. In addition, you will implement robust queue-based job processing systems using Laravel Queues (Redis/Database) to handle asynchronous operations efficiently. Your responsibilities will also include designing and optimizing complex database schemas with MySQL, focusing on advanced relationship modeling, query optimization, stored procedures, triggers, and materialized view maintenance. It will be crucial for you to create comprehensive automated test suites, including unit tests, integration tests, and feature tests using PHPUnit, to maintain high code coverage and ensure system reliability. Furthermore, you will work on optimizing application performance through caching strategies (Redis), database query optimization, and efficient memory management for high-volume operations. Collaboration with frontend developers to design efficient API contracts and ensure seamless integration between backend services and user interfaces will be an essential part of your role. Troubleshooting and debugging production issues using logging, monitoring tools, and performance profiling to maintain system stability and performance will also be within your scope of work. To succeed in this role, you will need to possess strong analytical thinking skills for analyzing complex business requirements and translating them into scalable technical solutions. Excellent communication skills, both written and verbal, will be necessary for effective collaboration with cross-functional teams, documenting technical decisions, and participating in code reviews. 
Attention to detail, especially regarding code quality, data integrity, and system reliability, will be crucial. You should also have a good understanding of performance optimization techniques, learning agility to adapt to new technologies and frameworks quickly, and a commitment to writing clean, maintainable, well-documented code with comprehensive test coverage. Experience with third-party API integrations, webhook handling, and building resilient systems that can handle external service failures gracefully will be beneficial. An understanding of data modeling, ETL processes, and efficient handling of large datasets with proper validation and transformation will also be required. Moreover, knowledge of web application security best practices and familiarity with deployment processes, environment management, monitoring, logging, and maintaining production systems are essential for this role.
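The queue-based asynchronous processing described above (Laravel Queues backed by Redis or the database) follows a general producer/worker pattern. A minimal sketch of that pattern in Python, purely for illustration since the role itself uses Laravel:

```python
import queue
import threading

# Producer/worker sketch of queue-based job processing (illustrative;
# in Laravel the queue, worker, and backend are provided by the framework).
jobs = queue.Queue()
results = []

def worker():
    while True:
        job = jobs.get()
        if job is None:          # sentinel value shuts the worker down
            jobs.task_done()
            break
        results.append(job * 2)  # stand-in for real work (emails, exports, ...)
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
for n in (1, 2, 3):
    jobs.put(n)                  # producers enqueue and return immediately
jobs.put(None)
jobs.join()                      # block until every queued job is processed
t.join()
print(sorted(results))  # [2, 4, 6]
```

The point of the pattern is the decoupling: the request path only enqueues, so slow work never blocks the caller, and workers can be scaled independently.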

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You are currently recruiting for a Database Engineer to join our software engineering team. As a Database Engineer, you will play a crucial role in developing high-performing, scalable, enterprise-grade data-driven applications. Your responsibilities will include designing and developing high-volume, low-latency applications for mission-critical systems, ensuring high availability and performance. You will contribute to all phases of the development lifecycle, write efficient and testable code, participate in code reviews, and lead team refactoring efforts to enhance processes. To qualify for this position, you should have at least 3 years of experience working as a database engineer or in a related role. You must possess strong SQL expertise and a deep understanding of various database objects such as tables, views, functions, stored procedures, and triggers. Experience in data modeling, data warehousing architecture, SQL server administration, database tuning, ETL processes, and database operations best practices is essential. You should be familiar with troubleshooting potential issues, testing/tracking bugs at the raw data level, and working in an Agile development process using tools like JIRA, Bamboo, and git. Preferred qualifications include a degree in computer science or a related technical field, experience with MySQL and Microsoft SQL Server, and proficiency in Python. Additionally, you should have experience working with stakeholders to gather requirements, handling production systems, and demonstrating a strong desire to learn new technologies. A growth mentality and motivation to become a key member of the team are also important attributes for this role. The job is located in Mumbai and offers free pickup & drop cab and food facilities. If you meet the qualification criteria and are interested in joining our team, please share your updated resume to careers@accesshealthcare.com. For further details, you can contact HR- Rathish at +91-91762-77733. 
Venue: Access Healthcare Services Pvt Ltd, Empire Tower, 14th floor, D wing, Reliable Cloud City, Gut no-31, Thane - Belapur Road, Airoli, Navi Mumbai, Maharashtra 400708.
Employment Type: Full Time
Role: Group Leader - Business Analyst
Industry: BPO, Call Centre, ITES
Salary: Best in the industry
Function: ITES, BPO, KPO, LPO, Customer Service, Operations
Experience: 1 - 4 Years
Please note that the responsibilities and qualifications mentioned in the job description are subject to change based on the requirements of the organization.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

The Microsoft Cloud Data Engineer role is a great opportunity for a talented and motivated individual to design, construct, and manage cloud-based data solutions using Microsoft Azure technologies. Your primary responsibility will be to create strong, scalable, and secure data pipelines and support analytics workloads that drive business insights and data-based decision-making. You will design and deploy ETL/ELT pipelines using Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake Storage. Additionally, you will be responsible for developing and overseeing data integration workflows to bring in data from various sources such as APIs, on-prem systems, and cloud services. It will also be important to optimize and maintain SQL-based data models, views, and stored procedures in Azure SQL, SQL MI, or Synapse SQL Pools. Collaboration with analysts, data scientists, and business teams will be crucial to gather data requirements and provide reliable and high-quality datasets. You will need to ensure data quality, governance, and security by implementing robust validation, monitoring, and encryption mechanisms. Supporting infrastructure automation using Azure DevOps, ARM templates, or Terraform for resource provisioning and deployment will also be part of your responsibilities. You will also play a role in troubleshooting, performance tuning, and the continuous improvement of the data platform. To qualify for this position, you should have a Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. A minimum of 3 years of experience in data engineering with a focus on Microsoft Azure data services is required. Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake is a must. Strong proficiency in SQL and data modeling is essential, along with experience in Python, PySpark, or .NET for data processing. 
An understanding of data warehousing, data lakes, and ETL/ELT best practices is important, as is familiarity with DevOps tools and practices in an Azure environment. Knowledge of Power BI or similar visualization tools is also beneficial. Additionally, holding the Microsoft Certified: Azure Data Engineer Associate certification or its equivalent is preferred.
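A core building block of the incremental ELT loads mentioned above is an idempotent upsert (merge): re-delivered rows update rather than duplicate. A sketch using Python's sqlite3 module follows; the table and columns are illustrative assumptions, and in Azure this step would typically be a Synapse or Data Factory merge rather than SQLite.

```python
import sqlite3

# Incremental-load sketch: an idempotent upsert, so replaying a batch
# never creates duplicates (illustrative schema, not an Azure design).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")

def load(batch):
    db.executemany(
        "INSERT INTO customers VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        batch,
    )

load([(1, "a@x.com"), (2, "b@x.com")])
load([(2, "b@new.com"), (3, "c@x.com")])   # re-delivered + new rows

print(db.execute("SELECT * FROM customers ORDER BY id").fetchall())
# [(1, 'a@x.com'), (2, 'b@new.com'), (3, 'c@x.com')]
```

Because the load is idempotent, a failed pipeline run can simply be retried end to end, which greatly simplifies monitoring and recovery.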

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for an experienced and dedicated Senior Manager of Business Intelligence & Data Engineering to lead a team of engineers. In this role, you will oversee various aspects of the Business Intelligence (BI) ecosystem, including designing and maintaining data pipelines, enabling advanced analytics, and providing actionable insights through BI tools and data visualization.

Your responsibilities will include:
- Leading the design and development of scalable data architectures on AWS.
- Managing data lakes and implementing data modeling and productization.
- Collaborating with business stakeholders to create actionable insights.
- Ensuring thorough documentation of data pipelines and systems.
- Promoting knowledge-sharing within the team.
- Staying updated on industry trends in data engineering and BI.

You should have at least 10 years of experience in Data Engineering or a related field, with a strong track record in designing and implementing large-scale distributed data systems. Additionally, you should possess expertise in BI, data visualization, people management, CI/CD tools, cloud-based data warehousing, AWS services, data lake architectures, Apache Spark, SQL, enterprise BI platforms, and microservices-based architectures. Strong communication skills, a collaborative mindset, and the ability to deliver insights to both technical and executive audiences are essential for this role.

Bonus points will be awarded if you have knowledge of data science and machine learning concepts, experience with Infrastructure as Code practices, familiarity with data governance and security in cloud environments, and domain understanding of Apparel, Retail, Manufacturing, Supply Chain, or Logistics. If you are passionate about leading a high-performing team, driving innovation in data engineering and BI, and contributing to the success of a global sports platform like Fanatics Commerce, we welcome you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

We are seeking an experienced Databricks on AWS and PySpark engineer to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS, and optimizing data processing workflows with PySpark. Collaborating with data scientists and analysts to develop data models, and ensuring data quality, security, and compliance with industry standards, will also be key responsibilities. Your main tasks will include troubleshooting data pipeline issues, optimizing performance, and staying updated on industry trends and emerging data engineering technologies.

You should have at least 3 years of experience in data engineering with a focus on Databricks on AWS and PySpark, possess strong expertise in PySpark and Databricks for data processing, modeling, and warehousing, and have hands-on experience with AWS services such as S3, Glue, and IAM. Proficiency in data engineering principles, data governance, and data security is essential, along with experience in managing data processing workflows and pipelines. Strong problem-solving skills, attention to detail, and effective communication and collaboration abilities are key soft skills for this role, as is the ability to work in a fast-paced, dynamic environment while adapting to changing requirements and priorities.

Posted 1 week ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies