Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
7.0 years
0 Lacs
Kanpur, Uttar Pradesh, India
Remote
Experience: 7.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: dbt, GCP, Affiliate Marketing Data, SQL, ETL, Team Handling, Data Warehouse, Snowflake, BigQuery, Redshift

Job Description: Data Warehouse Lead

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions.

Forbes Marketplace is seeking a highly skilled and experienced Data Warehouse Engineer to lead the design, development, implementation, and maintenance of our enterprise data warehouse. We are looking for someone with a strong background in data warehousing principles, ETL/ELT processes, and database technologies. You will be responsible for guiding a team of data warehouse developers and ensuring the data warehouse is robust, scalable, performant, and meets the evolving analytical needs of the organization.

Responsibilities:
- Lead the design, development, and maintenance of data models optimized for reporting and analysis.
- Ensure data quality, integrity, and consistency throughout the data warehousing process to enable reliable and timely ingestion of data from source systems.
- Troubleshoot and resolve issues related to data pipelines and data integrity.
- Work closely with business analysts and other stakeholders to understand their data needs and provide solutions.
- Communicate technical concepts effectively to non-technical audiences.
- Ensure the data warehouse scales to accommodate growing data volumes and user demands.
- Ensure adherence to data governance and privacy policies and procedures.
- Implement and monitor data quality metrics and processes.
- Lead and mentor a team of data warehouse developers, providing technical guidance and support.
- Stay up to date with the latest trends and technologies in data warehousing and business intelligence.
- Foster a collaborative and high-performing team environment.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7-8 years of progressive experience in data warehousing, with at least 3 years in a lead or senior role.
- Deep understanding of data warehousing concepts, principles, and methodologies.
- Strong proficiency in SQL and experience with various database platforms (e.g., BigQuery, Redshift, Snowflake).
- Good understanding of affiliate marketing data (GA4, paid marketing channels such as Google Ads and Facebook Ads; the more the better).
- Hands-on experience with dbt and other ETL/ELT tools and technologies.
- Experience with data modeling techniques (e.g., dimensional modeling, star schema, snowflake schema).
- Experience with cloud-based data warehousing solutions (e.g., AWS, Azure, GCP); GCP is highly preferred.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication, presentation, and interpersonal skills.
- Ability to thrive in a fast-paced and dynamic environment.
- Familiarity with business intelligence and reporting tools (e.g., Tableau, Power BI, Looker).
- Experience with data governance and data quality frameworks is a plus.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month).
- Monthly Wellness Reimbursement Program to promote health and well-being.
- Monthly Office Commutation Reimbursement Program.
- Paid paternity and maternity leaves.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!
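The dimensional-modeling requirement above (star schema) can be sketched with a minimal, hypothetical example: one fact table surrounded by dimension tables, queried with joins and grouped by dimension attributes. All table and column names here are illustrative, not the client's actual schema; SQLite stands in for BigQuery/Redshift/Snowflake so the sketch is self-contained.

```python
import sqlite3

# Minimal star-schema sketch; names are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, channel_name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_revenue (
        channel_id INTEGER REFERENCES dim_channel(channel_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        revenue    REAL
    );
    INSERT INTO dim_channel VALUES (1, 'Google Ads'), (2, 'Facebook Ads');
    INSERT INTO dim_date    VALUES (10, '2024-01'), (11, '2024-02');
    INSERT INTO fact_revenue VALUES (1, 10, 120.0), (1, 11, 80.0), (2, 10, 50.0);
""")

# A typical analytical query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT c.channel_name, SUM(f.revenue) AS total_revenue
    FROM fact_revenue f
    JOIN dim_channel c ON c.channel_id = f.channel_id
    GROUP BY c.channel_name
    ORDER BY total_revenue DESC
""").fetchall()
print(rows)  # [('Google Ads', 200.0), ('Facebook Ads', 50.0)]
```

The design choice the role calls for is exactly this shape: facts (measures) kept narrow and append-heavy, descriptive attributes pushed into dimensions so reporting queries stay simple joins.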
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
5.0 years
0 Lacs
Kanpur, Uttar Pradesh, India
Remote
Experience: 5.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOP, monitoring tools, Prometheus, ETL tools, data warehouse, Pandas, PySpark, AWS Lambda

Job Description: Data Research - Database Engineer

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Position Overview

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team whose purpose is to manage data from acquisition to presentation, collaborating with other teams while also operating independently.
Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer involves designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL and database design principles, and familiarity with Python programming, are key qualifications for this role.

Responsibilities:
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, from small-scale databases to databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as KNIME for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources such as Stack Overflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
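The query-optimization responsibility above (analyzing execution plans, then adding indexes) can be illustrated with a minimal sketch. SQLite stands in here for PostgreSQL or MySQL so the example is self-contained, and the table and column names are hypothetical:

```python
import sqlite3

# Minimal plan-analysis-and-indexing sketch; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(10_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

# Before indexing: the plan reports a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][-1])  # e.g. 'SCAN orders'

# Add an index on the filter column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][-1])  # e.g. 'SEARCH orders USING INDEX idx_orders_customer (customer_id=?)'
```

On PostgreSQL the analogous workflow uses `EXPLAIN ANALYZE` plus `CREATE INDEX`; the principle is the same: read the plan first, index the filter/join columns the plan shows are being scanned, then confirm the plan changed.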
- Skills in working with APIs for data ingestion or connecting third-party systems, which can streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfort with autonomy and ability to work independently.
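The "import workflows and scripts" this role keeps returning to can be as small as a validated CSV-to-database load. A minimal sketch, with hypothetical file layout, table, and validation rules (a production pipeline would add logging, batching, and error reporting):

```python
import csv
import io
import sqlite3

# Minimal import workflow: parse CSV rows, validate each one, and load
# only the valid rows. Table and column names are hypothetical.
def import_rates(csv_text: str, conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS rates (provider TEXT NOT NULL, apr REAL NOT NULL)")
    loaded = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            provider = row["provider"].strip()
            apr = float(row["apr"])
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed rows; real code would log them
        if provider and 0.0 <= apr <= 100.0:  # basic validation rule
            conn.execute("INSERT INTO rates VALUES (?, ?)", (provider, apr))
            loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
n = import_rates("provider,apr\nAcme Bank,6.5\nBadRow,not_a_number\nZen Credit,8.1\n", conn)
print(n)  # 2 rows loaded; the malformed row is skipped
```

The same shape (read, validate, load, count) carries over when the source is a spreadsheet export and the target is PostgreSQL or BigQuery; only the client library changes.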
Perks:
- Day off on the 3rd Friday of every month (one long weekend each month).
- Monthly Wellness Reimbursement Program to promote health and well-being.
- Monthly Office Commutation Reimbursement Program.
- Paid paternity and maternity leaves.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
5.0 years
0 Lacs
Nagpur, Maharashtra, India
Remote
Experience: 5.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOP, monitoring tools, Prometheus, ETL tools, data warehouse, Pandas, PySpark, AWS Lambda

Job Description: Data Research - Database Engineer

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Position Overview

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team whose purpose is to manage data from acquisition to presentation, collaborating with other teams while also operating independently.
Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer involves designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL and database design principles, and familiarity with Python programming, are key qualifications for this role.

Responsibilities:
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, from small-scale databases to databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as KNIME for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources such as Stack Overflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which can streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfort with autonomy and ability to work independently.
Perks:
- Day off on the 3rd Friday of every month (one long weekend each month).
- Monthly Wellness Reimbursement Program to promote health and well-being.
- Monthly Office Commutation Reimbursement Program.
- Paid paternity and maternity leaves.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
7.0 years
0 Lacs
Nagpur, Maharashtra, India
Remote
Experience: 7.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: dbt, GCP, Affiliate Marketing Data, SQL, ETL, Team Handling, Data Warehouse, Snowflake, BigQuery, Redshift

Job Description: Data Warehouse Lead

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions.

Forbes Marketplace is seeking a highly skilled and experienced Data Warehouse Engineer to lead the design, development, implementation, and maintenance of our enterprise data warehouse. We are looking for someone with a strong background in data warehousing principles, ETL/ELT processes, and database technologies. You will be responsible for guiding a team of data warehouse developers and ensuring the data warehouse is robust, scalable, performant, and meets the evolving analytical needs of the organization.

Responsibilities:
- Lead the design, development, and maintenance of data models optimized for reporting and analysis.
- Ensure data quality, integrity, and consistency throughout the data warehousing process to enable reliable and timely ingestion of data from source systems.
- Troubleshoot and resolve issues related to data pipelines and data integrity.
- Work closely with business analysts and other stakeholders to understand their data needs and provide solutions.
- Communicate technical concepts effectively to non-technical audiences.
- Ensure the data warehouse scales to accommodate growing data volumes and user demands.
- Ensure adherence to data governance and privacy policies and procedures.
- Implement and monitor data quality metrics and processes.
- Lead and mentor a team of data warehouse developers, providing technical guidance and support.
- Stay up to date with the latest trends and technologies in data warehousing and business intelligence.
- Foster a collaborative and high-performing team environment.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7-8 years of progressive experience in data warehousing, with at least 3 years in a lead or senior role.
- Deep understanding of data warehousing concepts, principles, and methodologies.
- Strong proficiency in SQL and experience with various database platforms (e.g., BigQuery, Redshift, Snowflake).
- Good understanding of affiliate marketing data (GA4, paid marketing channels such as Google Ads and Facebook Ads; the more the better).
- Hands-on experience with dbt and other ETL/ELT tools and technologies.
- Experience with data modeling techniques (e.g., dimensional modeling, star schema, snowflake schema).
- Experience with cloud-based data warehousing solutions (e.g., AWS, Azure, GCP); GCP is highly preferred.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication, presentation, and interpersonal skills.
- Ability to thrive in a fast-paced and dynamic environment.
- Familiarity with business intelligence and reporting tools (e.g., Tableau, Power BI, Looker).
- Experience with data governance and data quality frameworks is a plus.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month).
- Monthly Wellness Reimbursement Program to promote health and well-being.
- Monthly Office Commutation Reimbursement Program.
- Paid paternity and maternity leaves.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Summary

Position Summary

AWS DevSecOps Engineer – CL4

Role Overview:
As a DevSecOps Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive DevSecOps engineering craftsmanship and advanced proficiency across multiple programming languages, DevSecOps tools, and modern frameworks, consistently demonstrating a strong track record of delivering high-quality, outcome-focused CI/CD and automation solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions.

Key Responsibilities:

Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop DevSecOps engineering solutions that solve complex automation problems with valuable outcomes, ensuring high-quality, lean, resilient, and secure pipelines with low operating costs that meet platform/technology KPIs.

Technical Leadership and Advocacy: Serve as the technical advocate for modern DevSecOps practices, ensuring integrity, feasibility, and alignment with business and customer goals, NFRs, and applicable automation, integration, and security practices. Be responsible for designing and maintaining code repos, CI/CD pipelines, integrations (code quality, QE automation, security, etc.), and environments (sandbox, dev, test, stage, production) through IaC, for both custom and package solutions, including identifying, assessing, and remediating vulnerabilities.

Engineering Craftsmanship: Maintain accountability for the integrity and design of DevSecOps pipelines and environments while leading the implementation of deployment techniques like Blue-Green and Canary to minimize downtime and enable A/B testing.
Be hands-on at all times and actively engage with engineers to ensure DevSecOps practices are understood and can be implemented throughout the product development life cycle. Resolve technical issues from implementation through production operations (e.g., leading triage and troubleshooting of production issues). Be self-driven to learn new technologies, experiment with engineers, and inspire the team to learn and apply those new technologies.

Customer-Centric Engineering: Develop lean, yet scalable and flexible, DevSecOps automations through rapid, inexpensive experimentation to solve customer needs, enabling version control, security, logging, feedback loops, continuous delivery, etc. Engage with customers and product teams to deliver the right automation, security, and deployment practices.

Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Use a forward-leaning approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions.

Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, engineering, delivery, infrastructure, and security. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Support a collaborative environment that enhances team synergy and innovation.

Advanced Technical Proficiency: Possess intermediate knowledge of modern software engineering practices and principles, including Agile methodologies, DevSecOps, and continuous integration/continuous deployment. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery, ensuring high-quality outcomes with minimal waste.
Demonstrate an intermediate-level understanding of the product development lifecycle, from conceptualization and design to implementation and scaling, with a focus on continuous improvement and learning.

Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business and user needs into technical requirements and automations. Learn to navigate various enterprise functions such as product, experience, engineering, compliance, and security to drive product value and feasibility.

Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating technical concepts clearly and compellingly. Support teammates and product teams through well-structured arguments and trade-offs supported by evidence, evaluations, and research. Learn to create a coherent narrative that aligns technical solutions with business objectives.

Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams, including customers as needed. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum toward achieving product goals. Support diverse perspectives and consensus to create feasible solutions.

The team:
US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value and outcomes by leveraging a progressive and responsive talent structure. As Deloitte's primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte's success. It is the engine that drives Deloitte, serving many of the world's largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market.
Our reputation is built on a tradition of delivering with excellence. Key Qualifications: A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. Strong software engineering foundation with deep understanding of OOP/OOD, functional programming, data structures and algorithms, software design patterns, code instrumentation, etc. 5+ years proven experience with Python, Bash, PowerShell, JavaScript, C#, and Golang (preferred). 5+ years proven experience with CI/CD tools (Azure DevOps and GitHub Enterprise) and Git (version control, branching, merging, handling pull requests) to automate build, test, and deployment processes. 5+ years of hands-on experience in security tools automation: SAST/DAST (SonarQube, Fortify, Mend), monitoring/logging (Prometheus, Grafana, Dynatrace), and other cloud-native tools on AWS, Azure, and GCP. 5+ years of hands-on experience using Infrastructure as Code (IaC) technologies like Terraform, Puppet, Azure Resource Manager (ARM), AWS CloudFormation, and Google Cloud Deployment Manager. 2+ years of hands-on experience with cloud-native services like Data Lakes, CDN, API Gateways, Managed PaaS, Security, etc. on multiple cloud providers like AWS, Azure, and GCP is preferred. Strong understanding of methodologies like XP, Lean, and SAFe to deliver high-quality products rapidly. General understanding of cloud providers’ security practices and of database technologies and maintenance (e.g., RDS, DynamoDB, Redshift, Aurora, Azure SQL, Google Cloud SQL). General knowledge of networking, firewalls, and load balancers. Strong preference will be given to candidates with AI/ML and GenAI experience. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. 
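The security-automation work this role describes (SAST/DAST tooling wired into CI/CD) often comes down to small gate scripts that decide whether a build may proceed. The sketch below is a hypothetical illustration only: real scanners such as SonarQube or Fortify each have their own report formats and APIs, so the finding structure and thresholds here are invented.

```python
# Hypothetical CI/CD security quality gate: count scan findings by
# severity and fail the pipeline when the agreed budget is exceeded.
# (Finding format is invented; real tools have their own schemas.)

def gate_passes(findings, max_high=0, max_medium=5):
    """Return True if the scan findings are within the allowed budget."""
    counts = {"high": 0, "medium": 0, "low": 0}
    for f in findings:
        sev = f.get("severity", "low").lower()
        counts[sev] = counts.get(sev, 0) + 1
    return counts["high"] <= max_high and counts["medium"] <= max_medium

# Example: one high-severity finding fails the gate with the defaults.
sample = [
    {"id": "CWE-89", "severity": "high"},
    {"id": "CWE-327", "severity": "medium"},
]
print(gate_passes(sample))  # a real CI job would exit non-zero instead
```

In a pipeline, a script like this would run after the scan step and call `sys.exit(1)` on failure so Azure DevOps or GitHub Actions marks the stage red.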
How You Will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 211454
Posted 2 weeks ago
130.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description The Opportunity Manager, Data Visualization Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership. We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats. As a Manager in Data Visualization, you will be focused on designing and developing compelling data visualization solutions to enable actionable insights & facilitate intuitive information consumption for internal business stakeholders. 
The ideal candidate will demonstrate competency in building user-centric visuals & dashboards that empower stakeholders with data-driven insights & decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials. What Will You Do In This Role Design & develop user-centric data visualization solutions utilizing complex data sources. Identify & define key business metrics and KPIs in partnership with business stakeholders. Define & develop scalable data models in alignment with, and with support from, data engineering & IT teams. Lead UI/UX workshops to develop user stories, wireframes & intuitive visualizations. Collaborate with data engineering, data science & IT teams to deliver business-friendly dashboard & reporting solutions. Apply best practices in data visualization design & continuously improve upon intuitive user experience for business stakeholders. Provide thought leadership and data visualization best practices to the broader Data & Analytics organization. Identify opportunities to apply data visualization technologies to streamline & enhance manual / legacy reporting deliveries. Provide training & coaching to internal stakeholders to enable a self-service operating model. Co-create information governance & apply data privacy best practices to solutions. Continuously innovate on visualization best practices & technologies by reviewing external resources & marketplace. 
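Defining KPIs in partnership with stakeholders, as the responsibilities above describe, ultimately means pinning each agreed metric to one precise computation that the dashboard layer can reuse. A minimal, self-contained sketch with invented field names and data (the real metrics and sources would come from the business teams):

```python
# Hypothetical KPI: "engagement rate = engaged interactions / total
# interactions, per region". All record fields are invented for
# illustration; a real pipeline would read these from the warehouse.
from collections import defaultdict

def engagement_rate_by_region(records):
    totals = defaultdict(lambda: [0, 0])  # region -> [engaged, total]
    for r in records:
        bucket = totals[r["region"]]
        bucket[0] += 1 if r["engaged"] else 0
        bucket[1] += 1
    return {region: engaged / total
            for region, (engaged, total) in totals.items()}

interactions = [
    {"region": "North", "engaged": True},
    {"region": "North", "engaged": False},
    {"region": "South", "engaged": True},
]
print(engagement_rate_by_region(interactions))  # {'North': 0.5, 'South': 1.0}
```

Encoding the definition once in code (or in the semantic layer of a BI tool) keeps every dashboard that shows the KPI consistent with the definition the stakeholders signed off on.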
What Should You Have 5 years’ relevant experience in data visualization, infographics, and interactive visual storytelling Working experience and knowledge in Power BI / QLIK / Spotfire / Tableau and other data visualization technologies Working experience and knowledge in ETL process, data modeling techniques & platforms (Alteryx, Informatica, Dataiku, etc.) Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) & data processing languages (SQL, Python, R, etc.) Experience in leveraging and managing third-party vendors and contractors. Self-motivation, proactivity, and ability to work independently with minimal direction. Excellent interpersonal and communication skills Excellent organizational skills, with ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively. Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity. Experience in the Pharma and/or Biotech industry is a plus. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Who We Are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. 
You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYDIT2025 Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Not Applicable Shift Valid Driving License Hazardous Material(s) Required Skills Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs Preferred Skills Job Posting End Date 06/15/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R336575
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Specialist - Data Visualization Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients leveraging digital, data and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization to be the premier data-driven company. As a Specialist in Data Visualization, you will be focused on designing and developing compelling data visualization solutions to enable actionable insights & facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals & dashboards that empower stakeholders with data-driven insights & decision-making capability. Responsibilities Develop user-centric scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact. Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations. Partner with data engineering, data science, and IT teams to develop scalable business-friendly reporting solutions. Ensure adherence to data governance, privacy, and security best practices. Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement. Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities. 
Continuously innovate on visualization best practices & technologies by reviewing external resources & marketplace. Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents. Required Experience And Skills 5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python. Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights. Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics. Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently. Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles. Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions. Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. 
We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design Preferred Skills Job Posting End Date 04/30/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R334900
Posted 2 weeks ago
8.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description Job Title: Data Engineer – AWS & PySpark Location: Trivandrum/Kochi/Chennai/Hyderabad/Bangalore/Pune/Noida Experience Required: 5–8 years Employment Type: Full-Time Department: Data Engineering / Technology Job Summary We are seeking a skilled Data Engineer with 5–8 years of experience to design, implement, and maintain robust, scalable data architectures on AWS. The ideal candidate will be highly proficient in PySpark and experienced with AWS cloud data services including S3, Glue, and Redshift. You will work closely with cross-functional teams to enable seamless data flows and ensure efficient ETL pipelines across our cloud infrastructure. Key Responsibilities Design, implement, and maintain scalable and robust data architectures on AWS. Utilize AWS services such as S3, Glue, and Redshift for data storage, processing, and analytics. Develop and implement ETL processes using PySpark to extract, transform, and load data into AWS storage solutions. Collaborate with cross-functional teams to ensure seamless data flow across different AWS services. Ensure high-quality code and adherence to best practices in data engineering and cloud deployment. Must-Have Skills 5–8 years of experience in Data Engineering. Strong hands-on experience with PySpark for ETL and data processing tasks. Proven experience working with AWS data services: S3, Glue, and Redshift. Solid understanding of data modeling, SQL, and database design. Experience with designing and building data pipelines on AWS. Good-to-Have Skills Knowledge of Airflow, Terraform, or other orchestration/IaC tools. Familiarity with DevOps practices in cloud-based data environments. Experience with real-time data processing (e.g., Kinesis, Kafka). Exposure to data governance, security, and compliance frameworks. Working knowledge of other AWS services like Lambda, Athena, or EMR. Educational Qualifications Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field. 
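The ETL responsibilities described above follow the standard extract-transform-load shape. The sketch below illustrates that shape using only the Python standard library so it is self-contained; in the role itself the same steps would be written against PySpark DataFrames, with S3 as the source, Glue for cataloguing, and Redshift as the sink. The file layout and columns here are invented for illustration.

```python
# Self-contained ETL sketch (stdlib only). A PySpark version would use
# spark.read.csv(...) for extract, DataFrame transformations for
# transform, and a Redshift/JDBC write for load.
import csv
import io

# Stand-in for a CSV object in S3; contents are invented.
raw = io.StringIO("order_id,amount,currency\n1,100,usd\n2,,usd\n3,250,eur\n")

def extract(fh):
    """Read raw rows from the source (here: an in-memory CSV)."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Drop rows with missing amounts, cast types, normalize currency,
    mirroring typical cleansing steps in a Spark job."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"].upper()}
        for r in rows if r["amount"]
    ]

def load(rows, sink):
    """Stand-in for a Redshift COPY or JDBC write."""
    sink.extend(rows)

warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse[0])  # {'order_id': 1, 'amount': 100.0, 'currency': 'USD'}
```

Keeping extract, transform, and load as separate functions is the same decomposition Glue jobs and Airflow tasks encourage: each stage can be tested and rerun on its own.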
Skills: AWS, PySpark, AWS Cloud
Posted 2 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Lead Analytics Engineer We are seeking a talented, motivated and self-driven professional to join the HH Digital, Data & Analytics (HHDDA) organization and play an active role in the Human Health transformation journey to become the premier “Data First” commercial biopharma organization. As a Lead Analytics Engineer, you will be part of the HHDDA Commercial Data Solutions team, providing technical/data expertise in the development of analytical data products to enable data science & analytics use cases. In this role, you will create and maintain data assets/domains used in the commercial/marketing analytics space – to develop best-in-class data pipelines and products, working closely with data product owners to translate data product requirements and user stories into development activities throughout all phases of design, planning, execution, testing, deployment and delivery. Your Specific Responsibilities Will Include Design and implementation of last-mile data products using the most up-to-date technologies and software / data / DevOps engineering practices Enable data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts and visualization developers on how to use these data models Develop analytical data products for reusability, governance and compliance by design Align with organization strategy and implement semantic layer for analytics data products Support data stewards and other engineers in maintaining data catalogs, data quality measures and governance frameworks Education B.Tech / B.S., M.Tech / M.S. 
or PhD in Computer Science, Engineering, Pharmaceuticals, Healthcare, Data Science, Business, or a related field. Required Experience 8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets) High proficiency in SQL, Python and AWS Experience creating / adopting data models to meet requirements from Marketing, Data Science, Visualization stakeholders Experience with feature engineering Experience with cloud-based (AWS / GCP / Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.) Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot and low-code tools (e.g., Dataiku) Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders Experience in analytics use cases of pharmaceutical products and vaccines Experience in market analytics and related use cases Preferred Experience Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines Experience with Agile ways of working, leading or working as part of scrum teams Certifications in AWS and/or modern data technologies Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers and business stakeholders Experience with data visualization technologies (e.g., Power BI) Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept 
unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Intelligence (BI), Database Design, Data Engineering, Data Modeling, Data Science, Data Visualization, Machine Learning, Software Development, Stakeholder Relationship Management, Waterfall Model Preferred Skills Job Posting End Date 05/30/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R310032
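The data-modeling and feature-engineering responsibilities in the Lead Analytics Engineer posting above can be illustrated with a small, self-contained sketch. Everything here (the record shape, the feature names, the HCP identifiers) is invented for illustration; in production this logic would run in SQL or Python against platforms like Databricks, Snowflake, or Redshift.

```python
# Illustrative last-mile feature engineering: aggregate raw customer
# interactions into per-customer features a downstream model could use.
# All data and field names are invented for illustration.
from datetime import date

interactions = [
    {"hcp_id": "A1", "channel": "email", "date": date(2024, 1, 5)},
    {"hcp_id": "A1", "channel": "visit", "date": date(2024, 2, 9)},
    {"hcp_id": "B2", "channel": "email", "date": date(2024, 2, 1)},
]

def build_features(rows, as_of):
    """Aggregate raw interactions into per-customer model features."""
    feats = {}
    for r in rows:
        f = feats.setdefault(r["hcp_id"],
                             {"n_touches": 0, "channels": set(), "last": None})
        f["n_touches"] += 1
        f["channels"].add(r["channel"])
        if f["last"] is None or r["date"] > f["last"]:
            f["last"] = r["date"]
    # Derived feature: recency in days, a common input to engagement models.
    return {
        k: {"n_touches": v["n_touches"],
            "n_channels": len(v["channels"]),
            "recency_days": (as_of - v["last"]).days}
        for k, v in feats.items()
    }

print(build_features(interactions, date(2024, 3, 1))["A1"])
```

Publishing such features through a governed, documented data model (rather than ad hoc notebook code) is what lets data scientists, analysts, and visualization developers reuse them consistently.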
Posted 2 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Senior Specialist- Data Visualization Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients leveraging digital, data and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization to be the premier data-driven company. As a Senior Specialist- Data Visualization, you will be leading a team of Data Visualization and Analytics experts who are focused on designing and developing compelling analytical solutions to enable actionable insights & facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in team leadership, stakeholder management, and product management while leading the development of user-centric visualization products that empower stakeholders with data-driven insights & decision-making capability. Responsibilities Lead and manage a team of ~10 visualization experts, designers, data analysts, and insights specialists, fostering a high-performing, inclusive, and engaged work environment. Drive team-building initiatives, mentorship, coaching, and performance management while prioritizing diversity, inclusion, and well-being. Develop user-centric analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact. Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations. Partner with data engineering, data science, and IT teams to develop scalable business-friendly reporting solutions. 
Ensure adherence to data governance, privacy, and security best practices. Foster a culture of innovation and continuous learning, encouraging the adoption of new visualization tools, methodologies, and a product mindset to build scalable and reusable solutions. Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement. Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities. Required Experience And Skills 10+ years of experience in insight generation, business analytics, business intelligence, and interactive visual storytelling, with a strong focus on infographics and data-driven decision-making. Proven leadership and people management skills, including team development, mentoring, and fostering a high-performing, engaged workforce. Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles. Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions. Hands-on expertise in BI and visualization tools such as Qlik, Power BI, MicroStrategy, Looker, and ThoughtSpot, with proficiency in PowerPoint and data storytelling for creating impactful presentations. Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python. Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights. 
Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics. Proven ability to manage third-party vendors and contractors, ensuring high-quality deliverables and cost-effective solutions. Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently. Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. 
Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design Preferred Skills Job Posting End Date 04/30/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R334763
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Specialist- Data Visualization Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients leveraging digital, data and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization to be the premier data-driven company. As a Specialist in Data Visualization, you will be focused on designing and developing compelling data visualization solutions to enable actionable insights & facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals & dashboards that empower stakeholders with data-driven insights & decision-making capability. Responsibilities Develop user-centric scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact. Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations. Partner with data engineering, data science, and IT teams to develop scalable business-friendly reporting solutions. Ensure adherence to data governance, privacy, and security best practices. Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement. Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities. 
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace. Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents. Required Experience And Skills 5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python. Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights. Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics. Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently. Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles. Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions. Our Human Health Division maintains a “patient first, profits later” ideology. The organization comprises sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. 
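A concrete, if simplified, picture of the dashboard groundwork described above is reshaping long-format records into the wide table a BI tool consumes. The sketch below uses only the Python standard library; the data and field names are hypothetical, not taken from the posting.

```python
from collections import defaultdict

# Hypothetical raw records, as they might arrive from an upstream ETL feed.
records = [
    {"month": "2025-01", "region": "North", "sales": 120},
    {"month": "2025-01", "region": "South", "sales": 95},
    {"month": "2025-02", "region": "North", "sales": 150},
    {"month": "2025-02", "region": "South", "sales": 80},
]

def pivot(rows, index, columns, values):
    """Pivot long-format rows into a {index: {column: value}} table,
    summing duplicate cells -- the wide shape most dashboards expect."""
    table = defaultdict(dict)
    for r in rows:
        key, col = r[index], r[columns]
        table[key][col] = table[key].get(col, 0) + r[values]
    return dict(table)

wide = pivot(records, index="month", columns="region", values="sales")
print(wide["2025-02"]["North"])  # 150
```

In practice a tool like Power BI performs this reshaping internally; the point is only to show the long-to-wide transformation the role's modeling work revolves around.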
We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design. Job Posting End Date: 04/30/2025. A job posting is effective until 11:59:59 PM on the day before the listed job posting end date, so please ensure you apply no later than the day before the end date. Requisition ID: R335129
Posted 2 weeks ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Key Responsibilities Design, build, and maintain robust ETL/ELT pipelines and data workflows. Develop and optimize data architectures for performance and scalability. Implement data quality and validation checks to ensure reliable data delivery. Collaborate with data analysts, scientists, and stakeholders to understand data needs. Integrate data from various sources including APIs, databases, cloud platforms, and third-party providers. Maintain and enhance data lake/data warehouse environments (e.g., AWS Redshift, Google BigQuery, Snowflake). Ensure data privacy and security compliance with internal and external regulations (e.g., GDPR, HIPAA). Monitor and troubleshoot data pipeline failures and performance bottlenecks. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field. 8+ years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and data modeling. Experience with data pipeline tools (e.g., Apache Airflow, dbt, Luigi). Proficient in Python, Scala, or Java. Familiarity with cloud platforms such as AWS, GCP, or Azure. Experience with data warehouses and big data technologies (e.g., Spark, Hive, Kafka). Understanding of data governance and best practices for data quality. Skills: SQL, Python, Scala, Java, data modeling, ETL/ELT, dbt, Apache Airflow, Luigi, Spark, Hive, Kafka, AWS Redshift, Google BigQuery, Snowflake, AWS, GCP, Azure, APIs, data lakes, data warehouses, data quality, data validation, data governance, data privacy, data security, big data
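The data-quality and validation checks listed above can be as simple as a row-level gate run before a warehouse load. Below is a minimal illustrative sketch; the column names, rules, and sample rows are hypothetical, not from the posting.

```python
def validate_rows(rows, required, not_null):
    """Return (valid_rows, errors): a minimal data-quality gate of the
    kind run on a batch before loading it into a warehouse."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required if c not in row]          # schema check
        nulls = [c for c in not_null if row.get(c) in (None, "")] # null check
        if missing or nulls:
            errors.append({"row": i, "missing": missing, "null": nulls})
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},      # fails the not-null rule
    {"email": "c@example.com"},    # fails the required-column rule
]
valid, errors = validate_rows(rows, required=["id", "email"], not_null=["email"])
print(len(valid), len(errors))  # 1 2
```

Production pipelines typically express the same idea through a framework's built-in checks (e.g., dbt tests), but the pass/quarantine split is the same.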
Posted 2 weeks ago
8.0 years
0 Lacs
Delhi, India
On-site
Key Responsibilities Design, build, and maintain robust ETL/ELT pipelines and data workflows. Develop and optimize data architectures for performance and scalability. Implement data quality and validation checks to ensure reliable data delivery. Collaborate with data analysts, scientists, and stakeholders to understand data needs. Integrate data from various sources including APIs, databases, cloud platforms, and third-party providers. Maintain and enhance data lake/data warehouse environments (e.g., AWS Redshift, Google BigQuery, Snowflake). Ensure data privacy and security compliance with internal and external regulations (e.g., GDPR, HIPAA). Monitor and troubleshoot data pipeline failures and performance bottlenecks. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field. 8+ years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and data modeling. Experience with data pipeline tools (e.g., Apache Airflow, dbt, Luigi). Proficient in Python, Scala, or Java. Familiarity with cloud platforms such as AWS, GCP, or Azure. Experience with data warehouses and big data technologies (e.g., Spark, Hive, Kafka). Understanding of data governance and best practices for data quality. Skills: SQL, Python, Scala, Java, data modeling, ETL/ELT, dbt, Apache Airflow, Luigi, Spark, Hive, Kafka, AWS Redshift, Google BigQuery, Snowflake, AWS, GCP, Azure, APIs, data lakes, data warehouses, data quality, data validation, data governance, data privacy, data security, big data
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities Design, build, and maintain robust ETL/ELT pipelines and data workflows. Develop and optimize data architectures for performance and scalability. Implement data quality and validation checks to ensure reliable data delivery. Collaborate with data analysts, scientists, and stakeholders to understand data needs. Integrate data from various sources including APIs, databases, cloud platforms, and third-party providers. Maintain and enhance data lake/data warehouse environments (e.g., AWS Redshift, Google BigQuery, Snowflake). Ensure data privacy and security compliance with internal and external regulations (e.g., GDPR, HIPAA). Monitor and troubleshoot data pipeline failures and performance bottlenecks. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field. 5+ years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and data modeling. Experience with data pipeline tools (e.g., Apache Airflow, dbt, Luigi). Proficient in Python, Scala, or Java. Familiarity with cloud platforms such as AWS, GCP, or Azure. Experience with data warehouses and big data technologies (e.g., Spark, Hive, Kafka). Understanding of data governance and best practices for data quality. Skills: SQL, data engineering, Luigi, Spark, GCP, Azure, Apache Airflow, Java, data modeling, Kafka, Scala, dbt, AWS, Python, Hive
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities a. Analyze and solve problems at their root, stepping back to understand the broader context. Establish, meet, and monitor SLAs for support issues in conjunction with the rest of the teams b. Interface with customers, understand their requirements and deliver complete application and data solutions c. Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment d. Address and effectively manage sensitive issues and manage escalations from business teams e. Build and maintain effective internal relationships, specifically with Engineering, Site Reliability Engineering, Client Support Managers, and Ad Operations to help identify, report, and resolve issues quickly f. Learn and understand a broad range of Samsung Ads Platform and applications and know when, how, and which to use and which not to use Qualifications 4+ yrs Ad-Tech domain experience 1+ yrs experience with SQL & databases Strong written and verbal communication skills Customer-centric mindset and structured approach to troubleshooting and issue resolution Demonstrated experience solving technical issues with third-party SSPs, MMPs, SSAI vendors, measurement solutions, fraud prevention companies, etc. Experience with 3P measurement vendor troubleshooting and integration Experience with support platform tools, CRM or ticketing tools Ability to present complex technical information in a clear and concise manner to a variety of audiences, especially non-technical / go-to-market teams Experience using cloud-based tools like S3, Athena, QuickSight, and Kibana is a plus Experience as a Business/Product/Data Analyst is a plus
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
We're Hiring: Data Engineering Intern Remote | Paid Internship Opportunity Duration: 3 Months Start Date: Immediate / As per availability Stipend: Competitive About SolvusAI SolvusAI is a fast-growing AI consulting and solutioning firm helping global enterprises unlock the full potential of Generative AI and Machine Learning. We partner with clients across industries to craft and implement high-impact AI strategies that solve real business problems and drive tangible value. We thrive on curiosity, ownership, and adaptability. Whether it’s rapid prototyping, cloud-native engineering, or cutting-edge LLM deployments — we move fast and build smart. Join us in shaping the future of enterprise AI. What You’ll Do – Role Overview As a Data Engineering Intern , you’ll be part of a dynamic team building scalable data infrastructure to power AI solutions. You’ll work hands-on with real datasets, contribute to robust cloud-native pipelines, and gain exposure to integrating AI/ML into data workflows. This is your chance to turn raw data into meaningful business intelligence. 
Key Responsibilities 🔹 Design and develop ETL pipelines for structured and unstructured data 🔹 Work with big data tools like Apache Spark for data transformation 🔹 Integrate datasets across cloud data warehouses (Snowflake, Redshift, BigQuery) and data lakes (S3, Azure Data Lake) 🔹 Write optimized SQL for data preparation and extraction 🔹 Use Python (Pandas, NumPy) for data wrangling and scripting 🔹 Containerize workflows using Docker 🔹 Interface with RESTful APIs and handle JSON-based data 🔹 Contribute to cloud-based environments (AWS, GCP, Azure) 🔹 Collaborate with cross-functional teams to understand business needs and craft data solutions 🔹 Support AI/ML driven data workflows and contribute to projects involving ML models (Preferred) What We’re Looking For Bachelor of Technology/Engineering in any discipline Strong knowledge of Python, SQL, and data structures Proactive, curious mindset with a problem-solving attitude Eagerness to learn and apply data engineering in business contexts Familiarity with AI/ML concepts and their applications in data pipelines (Preferred) What You’ll Gain - Hands-on experience solving real-world business problems - Exposure to modern data stacks and cloud platforms - Mentorship from seasoned AI engineers and consultants - A chance to see how technical solutions drive business outcomes How to Apply Ready to launch your career in data engineering with a forward-thinking AI team? Apply here on LinkedIn. Additionally, you could email your resume along with a short note on why you’re excited about this role to careers@solvusai.com Let’s build the future of AI & Data, together.
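The RESTful-API and JSON-handling responsibilities above boil down to a small extract-and-clean step over an API payload. The sketch below uses only the standard library; the payload shape and field names are invented for illustration and are not from the posting.

```python
import json

# Hypothetical JSON payload, as it might come back from a REST API.
payload = '''
{"users": [
  {"name": " Alice ", "signup": "2025-03-01", "plan": "pro"},
  {"name": "bob",     "signup": "2025-03-02", "plan": null}
]}
'''

def extract_users(raw):
    """Tiny extract-and-clean step: parse JSON, trim and normalize names,
    and default missing plans -- the shape an intern ETL task often takes."""
    data = json.loads(raw)
    cleaned = []
    for u in data["users"]:
        cleaned.append({
            "name": u["name"].strip().title(),  # " Alice " -> "Alice"
            "signup": u["signup"],
            "plan": u["plan"] or "free",        # null -> default plan
        })
    return cleaned

users = extract_users(payload)
print(users[0]["name"], users[1]["plan"])  # Alice free
```

At scale the same cleaning logic would live inside a Pandas or Spark transformation, but the extract-validate-normalize pattern is identical.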
Posted 2 weeks ago
3.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We’re seeking a skilled Data Scientist with expertise in SQL, Python, AWS SageMaker, and commercial analytics to contribute to the team. You’ll design predictive models, uncover actionable insights, and deploy scalable solutions to recommend optimal customer interactions. This role is ideal for a problem-solver passionate about turning data into strategic value. Key Responsibilities Model Development: Build, validate, and deploy machine learning models (e.g., recommendation engines, propensity models) using Python and AWS SageMaker to drive next-best-action decisions. Data Pipeline Design: Develop efficient SQL queries and ETL pipelines to process large-scale commercial datasets (e.g., customer behavior, transactional data). Commercial Analytics: Analyze customer segmentation, lifetime value (CLV), and campaign performance to identify high-impact next-best-action opportunities. Cross-functional Collaboration: Partner with marketing, sales, and product teams to align models with business objectives and operational workflows. Cloud Integration: Optimize model deployment on AWS, ensuring scalability, monitoring, and performance tuning. Insight Communication: Translate technical outcomes into actionable recommendations for non-technical stakeholders through visualizations and presentations. Continuous Improvement: Stay updated on advancements in AI/ML, cloud technologies, and commercial analytics trends. Qualifications Education: Bachelor’s/Master’s in Data Science, Computer Science, Statistics, or a related field. Experience: 3-4 years in data science, with a focus on commercial/customer analytics (e.g., pharma, retail, healthcare, e-commerce, or B2B sectors). Technical Skills: Proficiency in SQL (complex queries, optimization) and Python (Pandas, NumPy, Scikit-learn). Hands-on experience with AWS SageMaker (model training, deployment) and cloud services (S3, Lambda, EC2). Familiarity with ML frameworks (XGBoost, TensorFlow/PyTorch) and A/B testing methodologies. 
Analytical Mindset: Strong problem-solving skills with the ability to derive insights from ambiguous data. Communication: Ability to articulate technical concepts to business stakeholders. Preferred Qualifications AWS Certified Machine Learning Specialty or similar certifications. Experience with big data tools (Spark, Redshift) or MLOps practices. Knowledge of NLP, reinforcement learning, or real-time recommendation systems. Exposure to BI tools (Tableau, Power BI) for dashboarding.
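The A/B testing methodologies named in the qualifications commonly reduce to a two-proportion z-test on conversion rates. Here is a small illustrative sketch; the traffic and conversion numbers are hypothetical.

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion rates of variants A and B,
    returning the relative lift of B over A and the z statistic."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    lift = (p_b - p_a) / p_a
    return lift, z

# Hypothetical campaign: 120/2400 conversions for A vs 150/2400 for B.
lift, z = ab_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(round(lift, 3), round(z, 2))  # 0.25 1.88
```

A z around 1.88 falls short of the usual 1.96 threshold at the 5% two-sided level, so despite a 25% observed lift the result would not yet be called significant; real analyses would also plan sample size up front.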
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Data Analytics Engineer at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Data Analytics Engineer you should have experience with: Programming experience in Java or Python Knowledge of RDBMS/NoSQL databases and data warehouse concepts Familiarity with AWS services (Glue, Redshift, Lambda etc.) Familiarity with batch processing, stream processing and data integration techniques using the big data framework Spark Some Other Highly Valued Skills May Include Knowledge of data visualization tools like Tableau, SAP BO Experience in containerization using Docker and Kubernetes Familiarity with serverless computing platforms You may be assessed on the key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune. Purpose of the role To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models. 
Analyst Expectations To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team’s operational processing and activities. Escalate breaches of policies / procedures appropriately. Take responsibility for embedding new policies / procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. 
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex / sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role Grade Level (for internal use): 09 The Role: Data Intelligence Engineer The Team: The team is responsible for building, maintaining, and evolving the data intelligence architecture, data pipelines, and visualizations. It collaborates with business partners and senior management, working within multi-functional agile teams to ensure data integrity, lineage, and security. The team values self-service, automation, and leveraging data to drive insights and improvements. The Impact: This role is pivotal in transforming raw data into actionable insights that improve productivity, reduce operational risks, and identify business opportunities. By designing and implementing robust data solutions and visualizations, the Data Intelligence Engineer directly supports data-driven decision-making across various levels of the organization. The position contributes to extracting tangible value from data assets, ultimately enhancing overall service performance and business outcomes. What’s In It For You Opportunity to design, build, and maintain a scalable, flexible, and robust data intelligence architecture, staying current with evolving technology trends. Engage in creative data science and analysis to provide actionable insights that directly influence business productivity and risk reduction strategies. Work in a dynamic environment focused on self-service and automation, with opportunities to utilize and expand knowledge in cloud environments (AWS, Azure, GCP). Collaborate within multi-functional agile teams, contributing to data-driven development and enhancing your skills in a supportive setting. Responsibilities Build and maintain the data intelligence architecture, ensuring it is scalable, flexible, robust, and cost-conscious. Design, build, and maintain efficient Data Pipelines, focusing on loose coupling, data integrity, and lineage. 
Develop Data Visualizations with a focus on data security, self-service capabilities, and intelligible temporal metrics to highlight risks and opportunities. Conduct creative data science and analysis to provide actionable insights aimed at improving productivity and reducing risk. Work with business partners to identify how value can be extracted from data, emphasizing self-service and automation. Define, measure, and maintain key performance metrics, statistics for senior management, customer stats, business trend analysis, and overall service statistics. What We’re Looking For Key Qualifications: Bachelor’s degree required, with an overall experience of 4-8 years, including 3-4 years in Data Intelligence and 2-3 years in Development & Support. Strong experience in Python or other scripting languages (e.g., Shell, PowerShell) and strong SQL skills with experience in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, DynamoDB, Redshift). Minimum 3+ years’ experience in development/automation areas, including automating data ingestion, transformation, and aggregation, and working knowledge of cloud technologies like AWS, Azure, or GCP (including Blob/flat file processing). Experience with Power BI or Tableau, including designing dashboards with trending visuals. Good to have knowledge of DAX, Power BI service, dataset refreshes, and performance optimization tools. Soft Skills Strong communication skills to effectively interact with both technical and non-technical teammates and stakeholders. Proven ability to work independently and collaborate effectively in multi-functional agile teams. Strong problem-solving and analytical skills with an understanding of agile software development processes and data-driven development. A thorough understanding of the software development life cycle and agile techniques is beneficial. 
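The temporal metrics and service statistics described above often start as a simple time-bucketed aggregation over event data. Below is an illustrative sketch with hypothetical event dates, using only the standard library.

```python
from collections import Counter
from datetime import date

# Hypothetical support-ticket events; real inputs would come from a pipeline.
events = [date(2025, 5, d) for d in (1, 1, 2, 8, 9, 9, 9, 15)]

def weekly_counts(dates):
    """Aggregate events into ISO-week buckets -- the kind of temporal
    metric surfaced on the dashboards this role maintains."""
    return Counter(d.isocalendar()[1] for d in dates)

counts = weekly_counts(events)
weeks = sorted(counts)
# Compare last bucket to first to flag the direction of the trend.
trend = "rising" if counts[weeks[-1]] > counts[weeks[0]] else "falling"
print(dict(counts), trend)  # {18: 3, 19: 4, 20: 1} falling
```

In production the same rollup would be a GROUP BY in SQL or a measure in Power BI/Tableau; the stdlib version just makes the bucketing step explicit.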
About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316489 Posted On: 2025-05-31 Location: Gurgaon, Haryana, India
Posted 2 weeks ago
0.0 - 3.0 years
0 Lacs
Gurugram, Haryana
On-site
About the Role: Grade Level (for internal use): 09 The Role: Data Intelligence Engineer The Team: The team is responsible for building, maintaining, and evolving the data intelligence architecture, data pipelines, and visualizations. It collaborates with business partners and senior management, working within multi-functional agile teams to ensure data integrity, lineage, and security. The team values self-service, automation, and leveraging data to drive insights and improvements. The Impact: This role is pivotal in transforming raw data into actionable insights that improve productivity, reduce operational risks, and identify business opportunities. By designing and implementing robust data solutions and visualizations, the Data Intelligence Engineer directly supports data-driven decision-making across various levels of the organization. The position contributes to extracting tangible value from data assets, ultimately enhancing overall service performance and business outcomes. What’s in it for you: Opportunity to design, build, and maintain a scalable, flexible, and robust data intelligence architecture, staying current with evolving technology trends. Engage in creative data science and analysis to provide actionable insights that directly influence business productivity and risk reduction strategies. Work in a dynamic environment focused on self-service and automation, with opportunities to utilize and expand knowledge in cloud environments (AWS, Azure, GCP). Collaborate within multi-functional agile teams, contributing to data-driven development and enhancing your skills in a supportive setting. Responsibilities: Build and maintain the data intelligence architecture, ensuring it is scalable, flexible, robust, and cost-conscious. Design, build, and maintain efficient Data Pipelines, focusing on loose coupling, data integrity, and lineage. 
Develop Data Visualizations with a focus on data security, self-service capabilities, and intelligible temporal metrics to highlight risks and opportunities. Conduct creative data science and analysis to provide actionable insights aimed at improving productivity and reducing risk. Work with business partners to identify how value can be extracted from data, emphasizing self-service and automation. Define, measure, and maintain key performance metrics, statistics for senior management, customer stats, business trend analysis, and overall service statistics. What We’re Looking For: Key Qualifications: Bachelor’s degree required, with an overall experience of 4-8 years, including 3-4 years in Data Intelligence and 2-3 years in Development & Support. Strong experience in Python or other scripting languages (e.g., Shell, PowerShell) and strong SQL skills with experience in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, DynamoDB, Redshift). Minimum 3+ years’ experience in development/automation areas, including automating data ingestion, transformation, and aggregation, and working knowledge of cloud technologies like AWS, Azure, or GCP (including Blob/flat file processing). Experience with Power BI or Tableau, including designing dashboards with trending visuals. Good to have knowledge of DAX, Power BI service, dataset refreshes, and performance optimization tools. Soft Skills: Strong communication skills to effectively interact with both technical and non-technical teammates and stakeholders. Proven ability to work independently and collaborate effectively in multi-functional agile teams. Strong problem-solving and analytical skills with an understanding of agile software development processes and data-driven development. A thorough understanding of the software development life cycle and agile techniques is beneficial. 
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316489 Posted On: 2025-05-31 Location: Gurgaon, Haryana, India
Posted 2 weeks ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Location: PAN India
Duration: 6 Months
Experience Required: 7–8 years
Job Summary: We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well-versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.
Key Responsibilities:
Design, build, and maintain SSAS OLAP cubes and Tabular models
Create complex DAX and MDX queries for analytical use cases
Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF
Collaborate with cross-functional teams to translate business requirements into BI solutions
Optimize SSAS models for scalability and performance
Implement best practices in data modeling, version control, and deployment automation
Support dashboarding and reporting needs via Power BI, Excel, or Tableau
Maintain and troubleshoot data quality, performance, and integration issues
Must-Have Skills:
Hands-on experience with SSAS (Tabular & Multidimensional)
Proficiency in DAX, MDX, and T-SQL
Advanced ETL skills using SSIS / Informatica / Azure Data Factory
Knowledge of dimensional modeling (star & snowflake schema)
Experience with Azure SQL / MS SQL Server
Familiarity with Git and CI/CD pipelines
Nice to Have:
Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift)
Working knowledge of Power BI or similar BI tools
Understanding of Agile/Scrum methodology
Bachelor's degree in Computer Science, Information Systems, or equivalent
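For readers unfamiliar with the dimensional modeling named in the must-have skills, a star schema is one fact table joined to surrounding dimension tables. The sketch below uses SQLite purely for illustration; the table and column names are hypothetical, and a real deployment would sit in SQL Server behind an SSAS cube or Tabular model:

```python
import sqlite3

# Minimal star schema: one fact table plus two dimensions (illustrative names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (1, 2025, 1), (2, 2025, 2);
INSERT INTO dim_product VALUES (10, 'Hardware'), (20, 'Software');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 70.0);
""")

# The kind of aggregate a cube measure would answer: sales by year and category.
rows = cur.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [(2025, 'Hardware', 170.0), (2025, 'Software', 50.0)]
```

A snowflake schema would further normalise the dimensions (e.g., splitting category out of dim_product into its own table); the fact-to-dimension join pattern stays the same.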
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Department: Tech - Software Development
Location: Gurugram, India
Description: Based in Gurugram, you will work in a new and growing Software Development team extending the current Software Development team based in Oxford, UK. As a Software Engineer at Aurora, you will be responsible for turning feature and product ideas that will shape the future of the global energy markets into reality. You will work as part of a team on a cutting-edge microservices architecture using TypeScript/Express/AWS Lambdas, Redshift, MySQL, MongoDB, and micro-frontends built on React. This will allow you to take responsibility for solutions from design to deployment. You will be working with processes and tooling that allow you to release changes to our customers multiple times per day. The successful applicant will combine exceptional problem solving and technical capability with a passion to deliver great solutions for our users.
Key Responsibilities:
Design, develop, test, and operate the new generation of Aurora’s software-as-a-service solutions
Work closely with end users (internal and external) to innovate highly effective solutions
Contribute to continuously improving how the Software Team works
Skills, Knowledge and Expertise:
Three or more years' commercial experience developing complex software solutions with some of the following: Node/TypeScript, Express, Python, SQL, NoSQL, React, cloud infrastructure, unit testing
A proven track record of delivering great software and solving difficult technical problems
Experience building web services/microservices
Exceptional problem-solving skills
Strong interpersonal skills and a great team player
Benefits:
A fun, informal, collaborative and international work culture
A competitive salary package
Access to regular coaching and mentoring sessions and the opportunity to learn from experienced professionals
Access to the Aurora Academy, our training programme offering a range of opportunities to develop your skills within the responsibilities of your role and within the wider context of the industry
Access to our Employee Assistance Programme (EAP), offering a complete support network of expert advice and compassionate guidance 24/7/365, covering a wide range of personal and professional aspects
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary: As an AWS Data Engineer at Deqode, you'll be responsible for designing and building scalable data pipelines and ETL workflows using AWS Glue, Redshift, and other cloud data services. Your role will involve working with large datasets, automating data processes with Python/PySpark, and ensuring data quality and performance across various systems.
Responsibilities:
Develop, optimize, and maintain ETL pipelines using AWS Glue and PySpark
Design and implement robust data workflows and processing systems
Work with large structured and semi-structured datasets to extract insights and support data-driven decision making
Optimize Amazon Redshift queries and manage data storage on AWS
Collaborate with data analysts, data scientists, and other stakeholders to meet data requirements
Automate data workflows using scripts and schedule processes for seamless operations
Ensure data quality, consistency, and system performance while troubleshooting production issues
Skills & Qualifications:
6 to 8 years of experience as a Data Engineer
Strong hands-on experience with Python or PySpark
Proficiency in AWS Glue, Redshift, S3, and other AWS services
Solid expertise in writing optimized SQL queries and managing large datasets
Experience in building and maintaining scalable ETL processes
Excellent problem-solving skills and effective communication
Preferred Skills:
Familiarity with workflow orchestration tools like Airflow or AWS Step Functions
Experience with CI/CD processes for data pipelines
Knowledge of data governance and security best practices on AWS
(ref:hirist.tech)
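The transform stage of an ETL pipeline like the one described can be sketched in plain Python; a production version of this shape would typically run as PySpark inside an AWS Glue job, and the record fields and cleaning rules below are invented for illustration:

```python
# Toy extract-transform step over semi-structured records (illustrative data).
raw_records = [
    {"order_id": "1", "amount": "19.99", "country": "in"},
    {"order_id": "2", "amount": "", "country": "IN"},   # incomplete row
    {"order_id": "3", "amount": "5.00", "country": "us"},
]

def transform(records):
    """Drop rows with no amount, cast types, and normalise country codes."""
    clean = []
    for r in records:
        if not r["amount"]:
            continue  # basic data-quality rule: skip incomplete rows
        clean.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        })
    return clean

print(transform(raw_records))
# [{'order_id': 1, 'amount': 19.99, 'country': 'IN'},
#  {'order_id': 3, 'amount': 5.0, 'country': 'US'}]
```

In Glue/PySpark the same logic would be expressed as DataFrame filters and column expressions so it can scale out, but the shape (filter bad rows, cast, normalise) is the same.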
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Data Analytics Engineer at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
To be successful as a Data Analytics Engineer you should have experience with:
Programming experience in Java or Python
Knowledge of RDBMS/NoSQL databases and data warehouse concepts
Familiarity with AWS services (Glue, Redshift, Lambda, etc.)
Familiarity with batch processing, stream processing, and data integration techniques using the big data framework Spark
Some other highly valued skills may include:
Knowledge of data visualization tools like Tableau, SAP BO
Experience in containerization using Docker and Kubernetes
Familiarity with serverless computing platforms
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.
Accountabilities:
Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
Develop processing and analysis algorithms fit for the intended data complexity and volumes.
Collaborate with data scientists to build and deploy machine learning models.
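The accountability to keep warehouse data "accurate, accessible, and secure" usually translates into data-quality gates in the pipeline before loading. A toy sketch, assuming hypothetical column names and rules (not Barclays practice):

```python
# Minimal pre-load data-quality gate (columns and rules are illustrative).
REQUIRED_COLUMNS = {"account_id", "balance", "as_of_date"}

def validate(rows):
    """Split rows into (good, errors); a row fails on missing columns or a negative balance."""
    good, errors = [], []
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            errors.append((i, f"missing columns: {sorted(missing)}"))
        elif row["balance"] < 0:
            errors.append((i, "negative balance"))
        else:
            good.append(row)
    return good, errors

rows = [
    {"account_id": "A1", "balance": 100.0, "as_of_date": "2025-05-31"},
    {"account_id": "A2", "balance": -5.0, "as_of_date": "2025-05-31"},
    {"account_id": "A3", "balance": 10.0},  # missing as_of_date
]
good, errors = validate(rows)
print(len(good), errors)
# 1 [(1, 'negative balance'), (2, "missing columns: ['as_of_date']")]
```

In a real pipeline the failed rows would be routed to a quarantine table and surfaced on a monitoring dashboard rather than silently dropped.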
Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 2 weeks ago
The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, a powerful data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with expertise in Redshift can find a plethora of opportunities in various industries across the country.
The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.
In the field of Redshift, a typical career path may include roles such as: Junior Developer, Data Engineer, Senior Data Engineer, Tech Lead, and Data Architect.
Apart from expertise in Redshift, proficiency in the following skills can be beneficial: SQL, ETL tools, data modeling, cloud computing (AWS), and Python/R programming.
As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!