Home
Jobs

1802 Redshift Jobs - Page 30

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

• Excellent in writing SQL scripts to validate data in the ODS and Mart layers.
• Able to develop and execute test plans, test cases, and test scripts.
• Strong in validating visual analytics reports built with tools like Power BI and MicroStrategy, and in running the different types of tests within them.
• Identifies, manages, and resolves defects during testing cycles, leveraging a test management tool such as JIRA or HP ALM; supports or leads the UAT process when appropriate.
• Aware of quality- and testing-related best practices and procedures, and adheres to them.
• Good knowledge of an RDBMS such as AWS Redshift, Postgres, Oracle, or SQL Server.
• Nice to have: Python.
• Excellent communication skills, both verbal and written.
• Knowledge of automation and automation frameworks within the ETL space.
Primary Skills (Must Have): Understanding of data models and ETL architecture; excellent grasp of data warehouse concepts.
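Validation of the kind described above (checking that ODS records actually landed in the Mart) often starts with a completeness query. A minimal sketch, using SQLite in place of a real warehouse; the table and column names are hypothetical:

```python
import sqlite3

def missing_in_mart(conn, ods_table, mart_table, key_col):
    """Return key values present in the ODS table but absent from the Mart table.

    Identifiers are interpolated directly, so they must come from trusted config,
    not user input.
    """
    sql = f"""
        SELECT o.{key_col}
        FROM {ods_table} o
        LEFT JOIN {mart_table} m ON o.{key_col} = m.{key_col}
        WHERE m.{key_col} IS NULL
    """
    return [row[0] for row in conn.execute(sql)]

# Stand-in data: ODS holds three orders, the Mart only loaded two.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ods_orders (order_id INTEGER)")
conn.execute("CREATE TABLE mart_orders (order_id INTEGER)")
conn.executemany("INSERT INTO ods_orders VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO mart_orders VALUES (?)", [(1,), (2,)])

missing = missing_in_mart(conn, "ods_orders", "mart_orders", "order_id")
```

In a real test suite each such check would become one parameterized test case, with the defect logged in JIRA or HP ALM when the result set is non-empty.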

Posted 1 week ago

Apply

12.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

Skills: AWS Lead / Solution Architect
Experience: 12 - 22 Years
Location: Kolkata

Job Summary:
• 15+ years of hands-on IT experience in the design and development of complex systems
• Minimum of 5 years in a solution or technical architect role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms
• At least 4 years of hands-on experience in cloud-native architecture design and in implementing distributed, fault-tolerant enterprise applications for the cloud
• Experience in application migration to the AWS cloud using refactoring, rearchitecting, and re-platforming approaches
• 3+ years of proven experience using AWS services to architect PaaS solutions
• AWS Certified Architect

Technical Skills:
• Deep understanding of cloud-native and microservices fundamentals
• Deep understanding of GenAI usage and LLM models; hands-on experience creating agentic flows using AWS Bedrock and using Amazon Q for Dev/Transform
• Deep knowledge and understanding of AWS PaaS and IaaS features
• Hands-on experience with AWS services, i.e. EC2, ECS, S3, Aurora DB, DynamoDB, Lambda, SQS, SNS, RDS, API Gateway, VPC, Route 53, Kinesis, CloudFront, CloudWatch, AWS SDK/CLI, etc.
• Strong experience in designing and implementing core services such as VPC, S3, EC2, RDS, IAM, Route 53, Auto Scaling, CloudWatch, AWS Config, CloudTrail, ELB, AWS Migration services, and VPN/Direct Connect
• Hands-on experience enabling cloud PaaS app and data services such as Lambda, RDS, SQS, MQ, Step Functions, AppFlow, SNS, EMR, Kinesis, Redshift, Elasticsearch, and others
• Experience in automation and provisioning of cloud environments using APIs, the CLI, and scripts
• Experience deploying, managing, and scaling applications using CloudFormation / the AWS CLI
• Good understanding of AWS security best practices and the Well-Architected Framework
• Good knowledge of migrating on-premise applications to AWS IaaS
• Good knowledge of AWS IaaS (AMIs, pricing model, VPCs, subnets, etc.)
• Good to have: experience in cloud data processing and migration, and advanced analytics with AWS Redshift, Glue, AWS EMR, AWS Kinesis, and Step Functions
• Creating, deploying, configuring, and scaling applications on AWS PaaS
• Experience in Java and related frameworks (Spring, Spring Boot, Spring MVC, Spring Security) and multi-threaded programming
• Experience working with Hibernate or other ORM technologies along with JPA
• Experience working with modern web technologies such as Angular, Bootstrap, HTML5, CSS3, and React
• Experience in modernizing legacy applications into modern Java applications
• Experience with DevOps tools: Jenkins/Bamboo, Git, Maven/Gradle, Jira, SonarQube, JUnit, Selenium, automated deployments, and containerization
• Knowledge of relational and NoSQL databases, i.e. MongoDB, Cassandra, etc.
• Hands-on experience with the Linux operating system
• Experience in full life-cycle agile software development
• Strong analytical and troubleshooting skills
• Experience in Python, Node, and Express JS (optional)

Main Duties:
• The AWS architect takes the company’s business strategy and outlines the technology systems architecture needed to support that strategy.
• Responsible for analysis, evaluation, and development of the enterprise's long-term cloud strategic and operating plans, ensuring that the EA objectives are consistent with the enterprise’s long-term business objectives.
• Responsible for the development of architecture blueprints for related systems.
• Responsible for recommendations on cloud architecture strategies, processes, and methodologies.
• Involved in the design and implementation of best-fit solutions with respect to the Azure and multi-cloud ecosystem.
• Recommends and participates in activities related to the design, development, and maintenance of the Enterprise Architecture (EA).
• Conducts and/or actively participates in meetings related to the designated project(s).
• Participates in client pursuits and is responsible for the technical solution.
• Shares best practices and lessons learned, and constantly updates the technical system architecture requirements based on changing technologies and knowledge of recent, current, and upcoming vendor products and solutions.
• Collaborates with all relevant parties to review the objectives and constraints of each solution and determine conformance with the EA; recommends the most suitable technical architecture and defines the solution at a high level.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Senior Data Engineer
Number of Open Roles: 1
Location: Noida
Experience: 5+ years

About the Company & Role:
We are one of India’s foremost political consulting firms, leveraging the power of data to drive impactful, 360-degree election campaigns. Our unique approach brings together ground intelligence, data engineering, and strategic insight to shape electoral narratives and legislative landscapes. As a Senior Data Engineer, you will play a key leadership role in building robust, scalable, and high-performance data architectures that power our analytics and campaign strategies. This is an opportunity to drive large-scale data initiatives, mentor junior engineers, and work closely with cross-functional teams to build systems that influence real-world democratic outcomes.

What You'll Do:
● Architect, design, and manage scalable data pipelines for structured and unstructured data.
● Build and maintain data lakes, data warehouses, and ETL frameworks across cloud and on-prem platforms.
● Lead the modernization of our data infrastructure and migrate legacy systems to scalable cloud-native solutions.
● Collaborate with analysts, developers, and campaign strategists to ensure reliable data availability and quality.
● Drive implementation of best practices for data governance, access control, observability, and documentation.
● Guide and mentor junior data engineers and help foster a culture of excellence and innovation.
● Evaluate and implement cutting-edge tools and technologies to improve system performance and efficiency.
● Own and ensure end-to-end data reliability, availability, and scalability.

Key Requirements:
● 5+ years of experience in Data Engineering with a proven track record of building production-grade data systems.
● Deep expertise in Python and SQL (advanced level).
● Strong experience with Big Data tools such as Airflow, Hadoop, Spark, Hive, Presto, etc.
● Hands-on experience with data lake and warehouse architectures (e.g., Delta Lake, Snowflake, BigQuery, Redshift).
● Proven experience with ETL/ELT design, data modeling (star/snowflake schema), and orchestration tools.
● Proficient in working with cloud platforms like AWS, GCP, or Azure.
● Solid understanding of CI/CD pipelines, Git workflows, and containerization (Docker).
● Excellent knowledge of data security, privacy regulations, and governance practices.
● Exposure to streaming data architectures and real-time processing.

Nice to Have:
● Knowledge of data cataloging tools and metadata management.
● Familiarity with BI tools like Tableau, Power BI, or Looker.
● Experience working within political, social sector, or campaign data environments.
● Prior team lead experience in an agile environment.

Soft Skills:
● Strong problem-solving and decision-making capabilities.
● Excellent communication and stakeholder management skills.
● Ability to work in a fast-paced, mission-driven environment.
● Ownership mindset and ability to manage multiple projects with minimal supervision.
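Orchestration tools like Airflow, named in the requirements above, schedule ETL tasks by dependency order. The core idea can be sketched with Python's standard-library `graphlib`; the task graph and task names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each task maps to the set of tasks it depends on,
# the same shape Airflow expresses with upstream/downstream operators.
pipeline = {
    "extract_raw": set(),
    "clean": {"extract_raw"},
    "build_dim_voter": {"clean"},
    "build_fact_contact": {"clean"},
    "load_warehouse": {"build_dim_voter", "build_fact_contact"},
}

# A valid execution order: every task appears after all of its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
```

A real scheduler adds retries, backfills, and parallel execution of independent branches (here, the two `build_*` tasks), but the dependency resolution is exactly this topological sort.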

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description: R3 Senior Manager – Data and Analytics Architect

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to our other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview
We are seeking a highly motivated and hands-on Data & Analytics Architect to join our Strategy & Architecture team within CDNA. This mid-level role will play a critical part in designing scalable, reusable, and secure data and analytics solutions across the enterprise.
You will work under the guidance of a senior architect and be directly involved in the implementation of architectural patterns, reference solutions, and technical best practices. This is a highly technical role, ideal for someone who enjoys problem-solving, building frameworks, and working in a fast-paced, collaborative environment.

What Will You Do in This Role
Partner with senior architects to define and implement modern data architecture patterns and reusable frameworks.
Design and develop reference implementations for ingestion, transformation, governance, and analytics using tools such as Databricks (must-have), Informatica, AWS Glue, S3, Redshift, and DBT.
Contribute to the development of a consistent and governed semantic layer, ensuring alignment of business logic, definitions, and metrics across the enterprise.
Work closely with product line teams to ensure architectural compliance, scalability, and interoperability.
Build and optimize batch and real-time data pipelines, applying best practices in data modeling, transformation, and metadata management.
Contribute to architecture governance processes, participate in design reviews, and document architectural decisions.
Support mentoring of junior engineers and help foster a strong technical culture within the India-based team.

What Should You Have
Bachelor’s degree in Information Technology, Computer Science, or any technology stream.
5–8 years of experience in data architecture, data engineering, or analytics solution delivery.
Proven hands-on experience with Databricks (must), Informatica, the AWS data ecosystem (S3, Glue, Redshift, etc.), and DBT.
Solid understanding of semantic layer design, including canonical data models and standardized metric logic for enterprise reporting and analytics.
Proficiency in SQL and in Python or Scala.
Strong grasp of data modeling techniques (relational, dimensional, NoSQL), ETL/ELT design, and streaming data frameworks.
Knowledge of data governance, data security, lineage, and compliance best practices.
Strong collaboration and communication skills across global and distributed teams.
Experience with Dataiku or similar data science/analytics platforms is a plus.
Exposure to AI/ML and GenAI use cases is advantageous.
A background in the pharmaceutical, healthcare, or life sciences industries is preferred.
Familiarity with API design, data services, and event-driven architecture is beneficial.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today.
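A governed semantic layer, as described in this role, centralizes metric logic so that every consumer renders the same SQL and definitions cannot drift between dashboards. A minimal sketch of the idea; the metric, column, and table names are all hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    name: str
    expression: str   # the governed SQL aggregate expression
    grain: str        # the fact table the metric is defined against

# Central registry: the single place where a metric's business logic lives.
REGISTRY = {
    "net_sales": Metric("net_sales", "SUM(gross_amount - discount_amount)", "fact_sales"),
    "order_count": Metric("order_count", "COUNT(DISTINCT order_id)", "fact_sales"),
}

def render_query(metric_name: str, group_by: str) -> str:
    """Render the governed SQL for a metric; every caller gets identical logic."""
    m = REGISTRY[metric_name]
    return (f"SELECT {group_by}, {m.expression} AS {m.name} "
            f"FROM {m.grain} GROUP BY {group_by}")

sql = render_query("net_sales", "region")
```

Tools like DBT's metric definitions or a BI tool's semantic model implement the same pattern with richer validation and lineage; this sketch only shows why a single registry prevents divergent metric definitions.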
#HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs
Job Posting End Date: 06/15/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply no later than the day before the job posting end date.
Requisition ID: R341138

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description: Specialist, Data Visualization

Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients by leveraging digital, data, and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization into the premier data-driven company. As a Specialist in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities
Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
Ensure adherence to data governance, privacy, and security best practices.
Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience and Skills
5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
Experience in pharmaceutical commercial analytics, including field force effectiveness, customer engagement, market performance assessment, and web, campaign, and digital engagement analytics.
Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics, and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.
We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design
Job Posting End Date: 06/30/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply no later than the day before the job posting end date.
Requisition ID: R335067

Posted 1 week ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Role Description

Key Responsibilities:
Cloud-Based Development: Design, develop, and deploy scalable solutions using AWS services such as S3, Kinesis, Lambda, Redshift, DynamoDB, Glue, and SageMaker.
Data Processing & Pipelines: Implement efficient data pipelines and optimize data processing using pandas, Spark, and PySpark.
Machine Learning Operations (MLOps): Work with model training, the model registry, model deployment, and monitoring using AWS SageMaker and related services.
Infrastructure-as-Code (IaC): Develop and manage AWS infrastructure using AWS CDK and CloudFormation to enable automated deployments.
CI/CD Automation: Set up and maintain CI/CD pipelines using GitHub, AWS CodePipeline, and CodeBuild for streamlined development workflows.
Logging & Monitoring: Implement robust monitoring and logging solutions using Splunk, DataDog, and AWS CloudWatch to ensure system performance and reliability.
Code Optimization & Best Practices: Write high-quality, scalable, and maintainable Python code while adhering to software engineering best practices.
Collaboration & Mentorship: Work closely with cross-functional teams, providing technical guidance and mentorship to junior developers.

Qualifications & Requirements
7+ years of experience in software development with a strong focus on Python.
Expertise in AWS services, including S3, Kinesis, Lambda, Redshift, DynamoDB, Glue, and SageMaker.
Proficiency in Infrastructure-as-Code (IaC) tools like AWS CDK and CloudFormation.
Experience with data processing frameworks such as pandas, Spark, and PySpark.
Understanding of machine learning concepts, including model training, deployment, and monitoring.
Hands-on experience with CI/CD tools such as GitHub, CodePipeline, and CodeBuild.
Proficiency in monitoring and logging tools like Splunk and DataDog.
Strong problem-solving skills, analytical thinking, and the ability to work in a fast-paced, collaborative environment.

Preferred Skills & Certifications
AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer, AWS Certified Machine Learning).
Experience with containerization (Docker, Kubernetes) and serverless architectures.
Familiarity with big data technologies such as Apache Kafka, Hadoop, or AWS EMR.
Strong understanding of distributed computing and scalable architectures.

Skills: Python, MLOps, AWS
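For the logging and monitoring responsibilities above, a common pattern is to time each pipeline step and emit the duration as a structured log line that an agent ships to CloudWatch, Splunk, or DataDog. A simplified sketch; the JSON shape is illustrative, not the exact CloudWatch Embedded Metric Format:

```python
import functools
import json
import time

def emit_metric(name: str, value: float, unit: str = "Milliseconds") -> str:
    """Emit a metric as one JSON log line (a simplified, EMF-style record)."""
    line = json.dumps({"metric": name, "value": value, "unit": unit})
    print(line)  # in production, stdout is scraped by the log agent
    return line

def timed(fn):
    """Decorator: measure a step's wall-clock duration and emit it as a metric."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        emit_metric(f"{fn.__name__}_duration", elapsed_ms)
        return result
    return wrapper

@timed
def transform(records):
    """A stand-in pipeline step."""
    return [r * 2 for r in records]

out = transform([1, 2, 3])
```

Real deployments would add dimensions (environment, pipeline name) and alarm thresholds on the emitted metric, but the decorator pattern keeps instrumentation out of the business logic.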

Posted 1 week ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

Key Responsibilities:
• Administer and maintain multi-tenant MySQL databases hosted on AWS Cloud.
• Review database-related pull requests from developers for schema modifications and updates.
• Monitor, analyze, and optimize SQL queries and indexes to enhance database performance.
• Partner with DevOps and development teams to support application needs and deployment workflows.
• Ensure the implementation and regular testing of database security, backups, and disaster recovery protocols.
• Design, implement, and maintain data warehousing solutions using AWS Redshift, QuickSight, or similar technologies.
• Enable reporting and analytics capabilities by integrating data pipelines and supporting BI/reporting use cases.
• Communicate effectively, both verbally and in writing, with diverse, globally distributed teams.

Requirements:
• Demonstrated experience (10+ years) as a MySQL Database Administrator in cloud environments, with a strong preference for AWS.
• Hands-on expertise in query optimization, schema design, and performance tuning.
• Proficiency with AWS RDS, Aurora, CloudWatch, and integration with CI/CD pipelines.
• Experience designing and managing data warehousing solutions, preferably with AWS Redshift, QuickSight, or other BI/reporting tools.
• Solid understanding of multi-tenant architectures and database scalability principles.
• Excellent English communication skills, both written and verbal.
• Experience collaborating with remote teams across multiple time zones and continents.
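Index-driven query optimization, central to the DBA role above, can be demonstrated offline with SQLite's EXPLAIN QUERY PLAN standing in for MySQL's EXPLAIN; the table, column, and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (tenant_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i % 10, float(i)) for i in range(1000)])

QUERY = "SELECT total FROM orders WHERE tenant_id = 3"

def plan(conn, sql):
    """Return the planner's description of how a statement will execute."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r[-1]) for r in rows)  # last column holds the detail text

before = plan(conn, QUERY)   # no index: a full table scan
conn.execute("CREATE INDEX idx_orders_tenant ON orders (tenant_id)")
after = plan(conn, QUERY)    # equality predicate now resolved via the index
```

In MySQL the equivalent workflow is `EXPLAIN SELECT ...` before and after `CREATE INDEX`, checking that the access type improves from a full scan to an index lookup; on a multi-tenant schema, `tenant_id` is typically the leading column of most composite indexes.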

Posted 1 week ago

Apply

9.0 - 14.0 years

20 - 30 Lacs

Bengaluru

Hybrid

Source: Naukri

My profile: linkedin.com/in/yashsharma1608
Hiring manager profile: https://www.nyxtech.in/ (on Nyxtech payroll)
Client: Brillio

Role: AWS Architect
Primary skills: AWS (Redshift, Glue, Lambda, ETL, and Aurora), advanced SQL and Python, PySpark
Note: Aurora Database is a mandatory skill
Experience: 9+ yrs
Notice period: immediate joiners
Location: any Brillio location (Bangalore preferred)
Budget: 30 LPA

Job Description:
• 9+ years of IT experience with deep expertise in S3, Redshift, Aurora, Glue, and Lambda services.
• At least one instance of proven experience developing a data platform end to end using AWS.
• Hands-on programming experience with DataFrames and Python, including unit testing of Python and Glue code.
• Experience with orchestration mechanisms such as Airflow, Step Functions, etc.
• Experience working on AWS Redshift is mandatory: must have experience writing stored procedures, an understanding of the Redshift Data API, and experience writing federated queries.
• Experience in Redshift performance tuning.
• Good communication and problem-solving skills; very good stakeholder communication and management.
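The Redshift requirements above mention stored procedures and the Data API. A sketch of both: a hypothetical plpgsql procedure, and the shape of the ExecuteStatement request the Redshift Data API accepts (no cluster is contacted here; in practice the dict would be passed to boto3's `redshift-data` client):

```python
# Hypothetical schema/procedure names; the SQL is only composed, never executed.
CREATE_PROC = """
CREATE OR REPLACE PROCEDURE refresh_daily_sales(run_date DATE)
AS $$
BEGIN
    -- idempotent reload of one day's aggregate
    DELETE FROM mart.daily_sales WHERE sale_date = run_date;
    INSERT INTO mart.daily_sales
    SELECT sale_date, SUM(amount)
    FROM staging.sales
    WHERE sale_date = run_date
    GROUP BY sale_date;
END;
$$ LANGUAGE plpgsql;
"""

def execute_statement_request(cluster_id: str, database: str, sql: str) -> dict:
    """Build the parameter dict for the Data API's ExecuteStatement call."""
    return {"ClusterIdentifier": cluster_id, "Database": database, "Sql": sql}

request = execute_statement_request(
    "analytics-cluster", "prod", "CALL refresh_daily_sales(CURRENT_DATE)"
)
```

An orchestrator such as Airflow or Step Functions would submit this request on schedule and poll DescribeStatement for completion, which is what makes the Data API convenient: no persistent JDBC connection to manage.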

Posted 1 week ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

FanCode is India’s premier sports destination committed to giving fans a highly personalised experience across content and merchandise for a wide variety of sports. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode has over 100 million users. It has partnered with domestic and international sports leagues and associations across multiple sports. In content, FanCode offers interactive live streaming for sports with industry-first subscription formats like Match, Bundle, and Tour Passes, along with monthly and annual subscriptions at affordable prices. Through FanCode Shop, it also offers fans a wide range of sports merchandise for sporting teams, brands, and leagues across the world.

Technology @ FanCode
We have one mission: create a platform for all sports fans, built by sports fans for sports fans. We’re at the beginning of our story and growing at an incredible pace. Our tech stack is hosted on AWS and is built on CloudFront / AWS API Gateway, NGINX, Node.js / Java, Redis / ElastiCache, and MySQL / Cassandra as our end-to-end stack. Besides these, we heavily use Kafka, Spark, Redshift, and other cutting-edge tech to keep improving FanCode's performance. As a data-driven team, we also use R, Python, and other big data technologies for Machine Learning and Predictive Analytics.

SDET @ FanCode
As a key member of the Test and Release Engineering team, where driving both quality excellence and delivery velocity matters, you will work on building and optimizing testing processes through automation and tooling. You will partner with dev teams to understand the challenges in testing their systems and devise strategies to overcome them, so releases ship faster with fewer bugs. You'll also work closely with platform teams at FanCode to set up tools that ensure releases are fast and stable and that the required performance criteria are met.
Responsibilities
Design and deliver automation and automation frameworks for products with a quality mindset.
Provide oversight through code and design reviews for features delivered by other developers within their scrum teams.
Participate in the Test Design Review Board to integrate test plans within and across teams, with a focus on functional, business, and complex scenarios, high-impact projects, usability, and accessibility.
Organise and drive cross-product testing.
Act as an evangelist for quality and for testing innovation and efficiency.
Monitor product- and/or feature-level quality health metrics (testability, test health, test coverage, etc.).
Work closely with senior developers, PMs, and UX designers to ensure their features are delivered to meet business and quality requirements.
Troubleshoot production issues.
Provide coaching to leadership and others in your teams, and mentor team members. Take a larger part in the hiring process: breadth of competencies, bringing others up to speed through shadowing.

Must Haves:
7+ years of strong background in web application / REST API automation testing, with the ability to design and conduct code reviews for work done by the design and architecture team.
Ability to write performant code, design automation frameworks, and define automation strategy (functional and API) for a product.
Ability to plan projects (scoping, risk mitigation, dependency management, prioritisation, estimation, success criteria, quality metrics), keeping the emphasis on automation with clear objectives.
Ability to design and develop tools to aid testing.
Coding.

Good to Have
Ability to define the testing/automation strategy within the CI/CD architecture.
Experience with cloud-based infrastructure like AWS or GCP.
A gaming background and good knowledge of a sport such as cricket, kabaddi, hockey, or football.
Experience working in a growing start-up environment.

Dream Sports is India’s leading sports technology company with 250 million users, housing brands such as Dream11, the world’s largest fantasy sports platform; FanCode, India’s digital sports destination; and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 ‘Sportans’. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports’ vision is to ‘Make Sports Better’ for fans through the confluence of sports and technology. For more information: https://dreamsports.group/
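An automation framework for REST APIs, as the must-haves above describe, typically wraps HTTP calls behind a small client so tests can inject fakes and assert behaviour without a live backend. A minimal standard-library sketch; the endpoint, payload shape, and class names are hypothetical:

```python
from unittest import mock

class MatchService:
    """Thin client over an injected HTTP getter; real code would wrap urllib/requests."""

    def __init__(self, http_get):
        self._get = http_get  # callable: path -> decoded JSON dict

    def live_score(self, match_id: int) -> dict:
        payload = self._get(f"/matches/{match_id}/score")
        if "runs" not in payload:
            raise ValueError("malformed score payload")
        return {"match_id": match_id, "runs": payload["runs"]}

# Test-style usage: a mock stands in for the network layer, so the assertion
# covers both the returned value and the exact path the client requested.
fake_get = mock.Mock(return_value={"runs": 187})
svc = MatchService(fake_get)
score = svc.live_score(42)
fake_get.assert_called_once_with("/matches/42/score")
```

The same seam is what lets a framework swap in a real HTTP layer for integration runs and a recorded or mocked one for fast CI runs.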

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hire AWS Data Pipeline Professionals in the following areas : Experience 3-5 Years Job Description Design, develop, and implement cloud solutions on AWS, utilizing a wide range of AWS services, including Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Function, Event Bridge, Lambda, API Gateway, ECS, and ECR. Demonstrate expertise in implementing AWS core services, such as EC2, RDS, VPC, ELB, EBS, Route 53, ELB, S3, Dynamo DB, and CloudWatch. Leverage strong Python and PySpark data engineering capabilities to analyze business requirements, translate them into technical solutions, and successful execution. expertise in AWS Data and Analytics Stack, including Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Function, Event Bridge, Lambda, API Gateway, ECS, and ECR for containerization. In-depth knowledge of AWS core services, such as EC2, RDS, VPC, ELB, EBS, Route 53, ELB, S3, Dynamo DB, and CloudWatch. Develop HLDs, LLDs, test plans, and execution plans for cloud solution implementations, including Work Breakdown Structures (WBS). Interact with customers to understand cloud service requirements, transform requirements into workable solutions, and build and test those solutions. Manage multiple cloud solution projects, demonstrating technical ownership and accountability. Capture and share best-practice knowledge within the AWS solutions architect community. 
Serve as a technical liaison between customers, service engineering teams, and support. Possess a strong understanding of cloud and infrastructure components, including servers, storage, networks, data, and applications, to deliver end-to-end cloud infrastructure architectures and designs. Effectively collaborate with team members from around the globe. Excellent analytical and problem-solving skills. Strong communication and presentation skills. Ability to work independently and as part of a team. Experience working with onshore-offshore teams. Required Behavioral Competencies Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team. Collaboration: Participates in team activities and reaches out to others in the team to achieve common goals. Agility: Demonstrates a willingness to accept and embrace differing ideas or perceptions which are beneficial to the organization. Customer Focus: Displays awareness of customers' stated needs and gives priority to meeting and exceeding customer expectations at or above expected quality within stipulated time. Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision. Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets. Certifications Good To Have At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. 
Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture. Show more Show less
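The Glue ETL and PySpark pipeline work this role describes can be sketched in miniature. The example below is an illustrative extract-transform-load pass, not YASH's or AWS's code: sqlite3 stands in for Redshift so the snippet runs anywhere, and the table, column names, and data are invented for the example.

```python
import csv
import io
import sqlite3

# Illustrative raw input; in a Glue job this would come from S3.
RAW_CSV = """order_id,amount,currency
1,100.50,INR
2,250.00,INR
3,,INR
"""

def run_pipeline(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV, drop records with missing amounts, load the rest."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: filter out rows with no amount and cast types.
    clean = [(int(r["order_id"]), float(r["amount"]))
             for r in rows if r["amount"]]
    # Load: sqlite3 is a stand-in for a Redshift COPY/INSERT step.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
loaded = run_pipeline(RAW_CSV, conn)
print(loaded)  # 2 -- one of the three rows fails validation
```

A production version would add the pieces the listing names: a Glue Data Catalog lookup for the schema, Step Functions or EventBridge for orchestration, and CloudWatch metrics on the reject count.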

Posted 1 week ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

FanCode is India’s premier sports destination committed to giving fans a highly personalised experience across content and merchandise for a wide variety of sports. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode has over 100 million users. It has partnered with domestic and international sports leagues and associations across multiple sports. In content, FanCode offers interactive live streaming for sports with industry-first subscription formats like Match, Bundle and Tour Passes, along with monthly and annual subscriptions at affordable prices. Through FanCode Shop, it also offers fans a wide range of sports merchandise for sporting teams, brands and leagues across the world. Technology @ FanCode We have one mission: Create a platform for all sports fans. Built by sports fans for sports fans, we cover Sports Live Video Streaming, Live Scores & Commentary, Video On Demand, Player Analytics, Fantasy Research, News and very recently e-Commerce. We’re at the beginning of our story and growing at an incredible pace. Our tech stack is hosted on AWS and is built on Amazon EC2, CloudFront, Lambda, and API Gateway. We have a microservices-based architecture built on Java, Node.js, Python, PHP, Redis, MySQL, Cassandra and Elasticsearch as our end-to-end stack serving product features. As a data-driven team, we also use Python and other big data technologies for Machine Learning and Predictive Analytics. Besides these, we heavily use Kafka, Spark, Redshift and other cutting-edge tech to keep improving FanCode's performance. Quality Assurance at FanCode The test engineering team’s responsibility is to help engineering teams move fast without breaking often. Reducing testing bottlenecks through test automation, setting up feedback loops to learn about defects in production and discovering new ways of reducing them are some of the things that the test engineering team does. 
As part of the Test Engineering team you will be responsible for maintaining software excellence and streamlining testing processes. You will be playing a key role in balancing speed with stability, ensuring our software meets high-quality standards while maintaining rapid delivery cycles. Your expertise will help shape quality assurance practices and contribute to delivering exceptional user experiences. If you have good programming experience and are passionate about software testing, continuous delivery and solving problems with a very customer-focussed team, we’d love to hear from you. Your Role: Design, develop, and execute comprehensive test plans, test cases, and test scenarios Perform thorough manual testing including functional, regression, integration, and user acceptance testing Create and maintain automated test scripts using industry-standard testing frameworks Identify, document, and track software defects using bug tracking systems Work closely with developers to resolve issues and validate fixes Participate in code reviews and provide feedback on testability Analyze test results to verify existing functionality and evaluate quality metrics Support continuous integration/continuous deployment (CI/CD) processes Document testing procedures and maintain test documentation Mentor junior QA team members and share best practices Must Have: Bachelor's degree in Computer Science, Software Engineering, or related field 3+ years of experience in software quality assurance Strong knowledge of software QA methodologies, tools, and processes Experience with test automation frameworks and scripting languages (e.g., Selenium, TestNG, JUnit) Proficiency in at least one programming language (e.g., Java, Python, JavaScript) Experience with bug tracking and test management tools (e.g., JIRA, TestRail) Strong understanding of Agile development methodologies Excellent analytical and problem-solving skills Strong written and verbal communication abilities Good to 
have: ISTQB certification or similar QA certifications Experience with performance testing tools (e.g., Gatling, Locust) Knowledge of API testing and tools like Postman or SoapUI Experience with mobile application testing Familiarity with version control systems (e.g., Git) Experience with continuous integration tools (e.g., Jenkins, GitHub Actions) Dream Sports is India’s leading sports technology company with brands such as Dream11, the world’s largest fantasy sports platform, FanCode, a premier digital sports platform that personalizes content and commerce for all sports fans, DreamSetGo, a sports experiences platform, and DreamPay, a payment solutions provider. It has founded the Dream Sports Foundation to help and champion sportspeople and is an active member of the Federation of Indian Fantasy Sports, the nodal body for the Fantasy Sports industry in India. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports is always working on its mission to ‘Make Sports Better’ and is located in Mumbai. Dream Sports has been featured as a ‘Great Place to Work’ by the Great Place to Work Institute for four consecutive years. It is also the only sports tech company among India’s best companies to work for in 2021. For more information: https://dreamsports.group/ About FanCode: FanCode is India’s premier digital sports destination committed to giving all fans a highly personalized experience across Content, Community, and Commerce. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode offers interactive live streaming, sports fan merchandise (FanCode Shop), fast interactive live match scores, in-depth live commentary, fantasy sports data and statistics (Fantasy Research Hub), expert fantasy tips, sports news and much more. 
FanCode has partnered with both domestic and international sports leagues and associations across multiple sports such as three of the top American Leagues - MLB, NFL, and NBA, FIVB, West Indies Cricket Board, Bangladesh Premier League, Caribbean Premier League, Bundesliga, and I-League. Dream Sports, India’s leading sports technology company, is the parent company of FanCode, with brands such as Dream11 also in its portfolio. FanCode has already amassed over 3 crore app installs and won the “Best Sports Startup” award at FICCI India Sports Awards 2019. Get the FanCode App: iOS | Android Website: www.fancode.com FanCode Shop: www.shop.fancode.com Dream Sports is India’s leading sports technology company with 250 million users, housing brands such as Dream11, the world’s largest fantasy sports platform, FanCode, India’s digital sports destination, and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 ‘Sportans’. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports’ vision is to ‘Make Sports Better’ for fans through the confluence of sports and technology. For more information: https://dreamsports.group/ Show more Show less
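The automated-testing skills this role lists (test frameworks, regression checks, CI integration) can be sketched with Python's built-in unittest. The function under test, its name, and its validation rules are hypothetical placeholders invented for illustration; this is not FanCode's actual API.

```python
import unittest

# Hypothetical function under test -- a cricket score formatter of the kind
# a live-scores service might expose. Name and rules are assumptions.
def format_score(runs: int, wickets: int) -> str:
    if runs < 0 or not 0 <= wickets <= 10:
        raise ValueError("invalid score")
    return f"{runs}/{wickets}"

class TestFormatScore(unittest.TestCase):
    def test_typical_score(self):
        self.assertEqual(format_score(187, 4), "187/4")

    def test_all_out(self):
        self.assertEqual(format_score(99, 10), "99/10")

    def test_rejects_invalid_wickets(self):
        # Regression-style check: invalid input must keep failing loudly.
        with self.assertRaises(ValueError):
            format_score(50, 11)

# Run the suite programmatically so the result is inspectable.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestFormatScore)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

In a CI pipeline (Jenkins, GitHub Actions) the same suite would run on every commit, which is the fast feedback loop the listing describes.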

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

About Traya Health: Traya is an Indian direct-to-consumer hair care brand platform providing a holistic treatment for consumers dealing with hair loss. The company provides personalised consultations that help determine the root cause of hair fall among individuals, along with a range of hair care products that are curated from a combination of Ayurveda, Allopathy, and Nutrition. Traya's secret lies in the power of diagnosis. Our unique platform diagnoses the patient’s hair and health history to identify the root cause behind hair fall and delivers customized hair kits right to their doorstep. We have a strong adherence system in place via medically trained hair coaches and proprietary tech, where we guide the customer across their hair growth journey and help them stay on track. Traya was founded by Saloni Anand, a techie-turned-marketeer, and Altaf Saiyed, a Stanford Business School alumnus. Our Vision: Traya was created with a global vision to create awareness around hair loss and de-stigmatise it, while empathising with customers over its emotional and psychological impact. Most importantly, it combines three different sciences (Ayurveda, Allopathy, and Nutrition) to create a holistic solution for hair loss patients. Role Overview: As a Senior Data Engineer, you will architect, build, and maintain our data infrastructure that powers critical business decisions. You will work closely with data scientists, analysts, and product teams to design and implement scalable solutions for data processing, storage, and retrieval. Your work will directly impact our ability to leverage data for business intelligence, machine learning initiatives, and customer insights. 
Key Responsibilities: ● Design, build, and maintain our end-to-end data infrastructure on AWS and GCP cloud platforms ● Develop and optimize ETL/ELT pipelines to process large volumes of data from multiple sources ● Build and support data pipelines for reporting, analytics, and machine learning applications ● Implement and manage streaming data solutions using Kafka and other technologies ● Design and optimize database schemas and data models in ClickHouse and other databases ● Develop and maintain data workflows using Apache Airflow and similar orchestration tools ● Write efficient, maintainable, and scalable code using PySpark and other data processing frameworks ● Collaborate with data scientists to implement ML infrastructure for model training and deployment ● Ensure data quality, reliability, and security across all data platforms ● Monitor data pipelines and implement proactive alerting systems ● Troubleshoot and resolve data infrastructure issues ● Document data flows, architectures, and processes ● Mentor junior data engineers and contribute to establishing best practices ● Stay current with industry trends and emerging technologies in data engineering Qualifications Required ● Bachelor's degree in Computer Science, Engineering, or related technical field (Master's preferred) ● 5+ years of experience in data engineering roles ● Strong expertise in AWS and/or GCP cloud platforms and services ● Proficiency in building data pipelines using modern ETL/ELT tools and frameworks ● Experience with stream processing technologies such as Kafka ● Hands-on experience with ClickHouse or similar analytical databases ● Strong programming skills in Python and experience with PySpark ● Experience with workflow orchestration tools like Apache Airflow ● Solid understanding of data modeling, data warehousing concepts, and dimensional modeling ● Knowledge of SQL and NoSQL databases ● Strong problem-solving skills and attention to detail ● Excellent communication skills and 
ability to work in cross-functional teams Preferred ● Experience in D2C, e-commerce, or retail industries ● Knowledge of data visualization tools (Tableau, Looker, Power BI) ● Experience with real-time analytics solutions ● Familiarity with CI/CD practices for data pipelines ● Experience with containerization technologies (Docker, Kubernetes) ● Understanding of data governance and compliance requirements ● Experience with MLOps or ML engineering Technologies ● Cloud Platforms: AWS (S3, Redshift, EMR, Lambda), GCP (BigQuery, Dataflow, Dataproc) ● Data Processing: Apache Spark, PySpark, Python, SQL ● Streaming: Apache Kafka, Kinesis ● Data Storage: ClickHouse, S3, BigQuery, PostgreSQL, MongoDB ● Orchestration: Apache Airflow ● Version Control: Git ● Containerization: Docker, Kubernetes (optional) What We Offer ● Competitive salary and comprehensive benefits package ● Opportunity to work with cutting-edge data technologies ● Professional development and learning opportunities ● Modern office in Mumbai with great amenities ● Collaborative and innovation-driven culture ● Opportunity to make a significant impact on company growth Show more Show less
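One concrete slice of the "ensure data quality" and "proactive alerting" responsibilities above is a batch-level validation gate. The sketch below is a simplified, framework-free stand-in: the column names and the 5% null threshold are assumptions, and in practice this check would typically run as an Airflow task that pages on-call rather than just returning a flag.

```python
# Minimal data-quality gate. Thresholds and required columns are illustrative.
def check_batch(rows, required=("user_id", "event_ts"), max_null_rate=0.05):
    """Return (ok, null_rate); ok is False when too many rows miss required fields."""
    if not rows:
        # An empty batch is itself an anomaly worth alerting on.
        return False, 1.0
    bad = sum(1 for r in rows if any(r.get(col) is None for col in required))
    null_rate = bad / len(rows)
    return null_rate <= max_null_rate, null_rate

batch = [{"user_id": 1, "event_ts": "2024-01-01"},
         {"user_id": 2, "event_ts": None},
         {"user_id": 3, "event_ts": "2024-01-01"}]
ok, rate = check_batch(batch)
print(ok, round(rate, 2))  # False 0.33
```

Wired into an orchestrator, a False result would short-circuit downstream loads into ClickHouse or the warehouse, which is how such gates keep bad data out of reports.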

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Cohort Analysis,Google Sheets,Time Series Analysis,Data Modeling,Predictive Analysis,Business Intelligence Tools,Power BI,Advanced Analytics,Excel,Scenario Analysis Show more Show less
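The cohort analysis and warehouse-SQL skills this profile asks for can be illustrated with a small retention query. sqlite3 stands in here for a warehouse such as Redshift or BigQuery, and the schema and data are invented for the example.

```python
import sqlite3

# Toy events table: which signup cohort each user belongs to, and the months
# in which they were active. Schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INT, signup_month TEXT, active_month TEXT);
INSERT INTO events VALUES
 (1, '2024-01', '2024-01'), (1, '2024-01', '2024-02'),
 (2, '2024-01', '2024-01'),
 (3, '2024-02', '2024-02'), (3, '2024-02', '2024-03');
""")

# Classic cohort grid: distinct active users per (cohort, activity month).
rows = conn.execute("""
SELECT signup_month AS cohort,
       active_month,
       COUNT(DISTINCT user_id) AS active_users
FROM events
GROUP BY cohort, active_month
ORDER BY cohort, active_month
""").fetchall()

for row in rows:
    print(row)  # first row: ('2024-01', '2024-01', 2)
```

On a real warehouse the same shape of query would read from a fact table and feed a retention heatmap in Tableau, Domo, or Looker.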

Posted 1 week ago

Apply

5.0 years

0 Lacs

Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Excel,Google Sheets,Power BI Show more Show less

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mangaluru, Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Excel,Google Sheets,Power BI Show more Show less

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gulbarga, Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Excel,Google Sheets,Power BI Show more Show less

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Excel,Google Sheets,Power BI Show more Show less

Posted 1 week ago

Apply

5.0 years

0 Lacs

Davangere Taluka, Karnataka, India

On-site

Linkedin logo

About In the Analytics & AI industry, work with a strategic technology partner delivering reliable AI solutions. Drive client satisfaction and high-quality outcomes through experience and clarity. Key Responsibilities Perform advanced analytics like cohort, scenario, time series, and predictive analysis. Articulate assumptions, analyses, and interpretations of data in various modes. Design data models connecting data elements from different sources. Collaborate with BI engineers to develop scalable reporting and analytics solutions. Query data from warehouses using SQL. Validate and QA data to ensure consistent accuracy and quality. Troubleshoot data issues and conduct root cause analysis. Ideal Profile 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering. Expert level skills in writing complex SQL queries for warehouses. Advanced skills in BI tools like Tableau, Domo, and Looker. Intermediate skills in Excel, Google Sheets, or Power BI. Bachelor’s/Advanced degree in Data Analytics, Computer Science, or related fields. Willingness to work with team members in different time zones. Nice to Have Experience with globally recognized organizations. Hands-on experience with live projects in the SDLC. Skills: Data Analytics,BI Analytics,BI Engineering,SQL,Snowflake,Redshift,SQL Server,Oracle,BigQuery,Tableau,Domo,Looker,Excel,Google Sheets,Power BI Show more Show less

Posted 1 week ago

Apply


Exploring Redshift Jobs in India

The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Amazon Redshift, the managed data warehouse service from Amazon Web Services, is in high demand for its scalability, performance, and cost-effectiveness, and job seekers with Redshift expertise can find opportunities across a wide range of industries throughout the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Chennai

Average Salary Range

The average salary for Redshift professionals in India varies with experience and location. Entry-level positions typically pay in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.

Career Path

In the field of Redshift, a typical career path may include roles such as:

  • Junior Developer
  • Data Engineer
  • Senior Data Engineer
  • Tech Lead
  • Data Architect

Related Skills

Apart from expertise in Redshift, proficiency in the following skills can be beneficial:

  • SQL
  • ETL Tools
  • Data Modeling
  • Cloud Computing (AWS)
  • Python/R Programming
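Several of these skills come together in everyday warehouse QA work, such as reconciling row counts between a source system and Redshift after a load. A minimal Python sketch of that check (the table names and counts below are made up for illustration; in practice each count would come from SQL such as `SELECT COUNT(*) FROM sales.orders` run against each system):

```python
def reconcile_counts(source_counts, target_counts):
    """Return tables whose row counts differ between source and target.

    Each argument maps a table name to its row count. A table missing
    from the target shows up with a count of None.
    """
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches[table] = (src, tgt)
    return mismatches


# Hypothetical counts: 'orders' loaded cleanly, 'customers' lost two rows.
source = {"orders": 1000, "customers": 250}
target = {"orders": 1000, "customers": 248}
print(reconcile_counts(source, target))  # {'customers': (250, 248)}
```

The same pattern extends to checksums or column-level aggregates when a plain row count is too coarse.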

Interview Questions

  • What is Amazon Redshift and how does it differ from traditional databases? (basic)
  • How does data distribution work in Amazon Redshift? (medium)
  • Explain the difference between SORTKEY and DISTKEY in Redshift. (medium)
  • How do you optimize query performance in Amazon Redshift? (advanced)
  • What is the COPY command in Redshift used for? (basic)
  • How do you handle large data sets in Redshift? (medium)
  • Explain the concept of Redshift Spectrum. (advanced)
  • What is the difference between Redshift and Redshift Spectrum? (medium)
  • How do you monitor and manage Redshift clusters? (advanced)
  • Can you describe the architecture of Amazon Redshift? (medium)
  • What are the best practices for data loading in Redshift? (medium)
  • How do you handle concurrency in Redshift? (advanced)
  • Explain the concept of vacuuming in Redshift. (basic)
  • What are Redshift's limitations and how do you work around them? (advanced)
  • How do you scale Redshift clusters for performance? (medium)
  • What are the different node types available in Amazon Redshift? (basic)
  • How do you secure data in Amazon Redshift? (medium)
  • Explain the concept of Redshift Workload Management (WLM). (advanced)
  • What are the benefits of using Redshift over traditional data warehouses? (basic)
  • How do you optimize storage in Amazon Redshift? (medium)
  • How do you troubleshoot performance issues in Amazon Redshift? (advanced)
  • Can you explain the concept of columnar storage in Redshift? (basic)
  • How do you automate tasks in Redshift? (medium)
  • What are the different types of Redshift nodes and their use cases? (basic)
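Several of the questions above (distribution keys, sort keys, and the COPY command) can be illustrated with a short example. The statements below are held in Python strings so they can be composed or printed programmatically; the table, S3 bucket, and IAM role names are hypothetical, chosen only to show the syntax:

```python
# Redshift assigns rows to node slices by DISTKEY and orders blocks on
# disk by SORTKEY, which is what the DISTKEY/SORTKEY questions probe.
create_sales = """
CREATE TABLE sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    sale_date   DATE,
    amount      DECIMAL(12,2)
)
DISTKEY (customer_id)   -- co-locate rows that join on customer_id
SORTKEY (sale_date);    -- lets date-range filters skip blocks
"""

# COPY bulk-loads from S3 in parallel across slices -- the idiomatic
# load path, as opposed to row-by-row INSERT statements.
copy_sales = """
COPY sales
FROM 's3://example-bucket/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-role'
FORMAT AS CSV;
"""

print(create_sales)
print(copy_sales)
```

Being able to explain why the DISTKEY here is the join column and the SORTKEY the filter column covers much of what interviewers look for in the medium-level questions.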

Conclusion

As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!
