
1759 Redshift Jobs - Page 12

JobPe aggregates results for easy browsing, but you apply directly on the original job portal.

15.0 years

0 Lacs

Delhi, India

On-site


About The Role
We are seeking a highly experienced Principal Presales Architect with deep expertise in AWS cloud services to lead strategic engagements with enterprise customers. This role sits at the intersection of technology leadership and customer engagement, requiring a deep understanding of IaaS, PaaS, SaaS, and data platform services, with a focus on delivering business value through cloud adoption and digital transformation. You will be a key contributor to the sales and solutioning lifecycle, working alongside business development, account executives, product, and engineering teams. The role also involves driving cloud-native architectures, conducting deep technical workshops, and influencing executive stakeholders.

Key Responsibilities

Presales & Customer Engagement
- Act as the technical lead in strategic sales opportunities, supporting cloud transformation deals across verticals.
- Design and present end-to-end cloud solutions tailored to client needs, with a focus on AWS architectures (compute, networking, storage, databases, analytics, security, and DevOps).
- Deliver technical presentations, POCs, and solution workshops to executive and technical stakeholders.
- Collaborate with sales teams to develop proposals, RFP responses, solution roadmaps, and TCO/ROI analyses.
- Drive early-stage discovery sessions to identify business objectives, technical requirements, and success metrics.
- Own the solution blueprint and ensure alignment across technical, business, and operational teams.

Architecture & Technology Leadership
- Architect scalable, secure, and cost-effective solutions using AWS services including EC2, Lambda, S3, RDS, Redshift, and EKS.
- Lead the design of data platforms and AI/ML pipelines, leveraging AWS services such as Redshift, SageMaker, Glue, Athena, and EMR, and integrating with third-party tools when needed.
- Evaluate and recommend multi-cloud integration strategies (Azure/GCP experience is a strong plus).
- Guide customers on cloud migration, modernization, DevOps, and CI/CD pipelines.
- Collaborate with product and delivery teams to align proposed solutions with delivery capabilities and innovations.
- Stay current with industry trends, emerging technologies, and AWS service releases, integrating new capabilities into customer solutions.

Required Skills & Qualifications

Technical Expertise
- 15+ years in enterprise IT or architecture roles, with 10+ years in cloud solutioning/presales, primarily focused on AWS.
- In-depth knowledge of AWS IaaS/PaaS/SaaS, including services across compute, storage, networking, databases, security, AI/ML, and observability.
- Hands-on experience architecting and deploying data lake/data warehouse solutions using Redshift, Glue, Lake Formation, and other data ecosystem components.
- Proficiency in designing AI/ML solutions using SageMaker, Bedrock, TensorFlow, PyTorch, or equivalent frameworks.
- Understanding of multi-cloud and hybrid cloud architectures; hands-on experience with Azure or GCP is an advantage.
- Strong command of solution architecture best practices, cost optimization, cloud security, and compliance frameworks.

Presales & Consulting Skills
- Proven success in technical sales roles involving complex cloud solutions and data platforms.
- Strong ability to influence C-level executives and technical stakeholders.
- Excellent communication, presentation, and storytelling skills to articulate complex technical solutions in business terms.
- Experience with proposal development, RFx responses, and pricing strategy.
- Strong analytical and problem-solving capabilities with a customer-first mindset.
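The TCO/ROI analysis this role calls for can be illustrated with a minimal sketch. All figures, function names, and the no-discounting simplification are hypothetical, not from the posting:

```python
def tco(upfront: float, annual_opex: float, years: int) -> float:
    """Total cost of ownership over a horizon (no discounting, for simplicity)."""
    return upfront + annual_opex * years

def roi(savings: float, investment: float) -> float:
    """Simple ROI: net gain relative to the migration investment."""
    return (savings - investment) / investment

# Hypothetical figures: an on-prem hardware refresh vs. AWS over 3 years.
onprem = tco(upfront=500_000, annual_opex=200_000, years=3)  # 1,100,000
cloud = tco(upfront=0, annual_opex=300_000, years=3)         #   900,000
migration_cost = 100_000
print(roi(savings=onprem - cloud, investment=migration_cost))  # 1.0, i.e. 100% ROI
```

A real presales model would add discounting, staffing, and license costs; this only shows the shape of the calculation.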

Posted 5 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Summary
We are seeking a highly experienced and customer-focused Presales Architect to join our Solution Engineering team. The ideal candidate will have a strong background in AWS IaaS, PaaS, and SaaS services, deep expertise in cloud architecture, and solid exposure to data platforms, including Amazon Redshift, AI/ML workloads, and modern data architectures. Familiarity with Azure and Google Cloud Platform (GCP) is a strong advantage. The role is a strategic blend of technical solutioning, customer engagement, and sales support, playing a critical part in the pre-sales cycle by understanding customer requirements, designing innovative solutions, and aligning them with the company's service offerings.

Key Responsibilities

Pre-Sales and Solutioning
- Engage with enterprise customers to understand their technical requirements and business objectives.
- Architect end-to-end cloud solutions on AWS, covering compute, storage, networking, DevOps, and security.
- Develop compelling solution proposals, high-level designs, and reference architectures that address customer needs.
- Support RFI/RFP responses, create technical documentation, and deliver presentations and demos to technical and non-technical audiences.
- Collaborate with Sales, Delivery, and Product teams to ensure alignment of proposed solutions with client expectations.
- Conduct technical workshops, proofs of concept (PoCs), and technical validations.

Technical Expertise
- Deep hands-on knowledge and architecture experience with AWS services:
  - IaaS: EC2, VPC, S3, EBS, ELB, Auto Scaling, etc.
  - PaaS: RDS, Lambda, API Gateway, Fargate, DynamoDB, Aurora, Step Functions.
  - SaaS & Security: AWS Organizations, IAM, AWS WAF, CloudTrail, GuardDuty.
- Understanding of multi-cloud strategies; exposure to Azure and GCP services, including hybrid architectures, is a plus.
- Strong knowledge of DevOps practices and tools such as Terraform, CloudFormation, Jenkins, and GitOps.
- Proficiency in architecting solutions that meet scalability, availability, and security requirements.

Data Platform & AI/ML
- Experience designing data lakes, data pipelines, and analytics platforms on AWS.
- Hands-on expertise in Amazon Redshift, Athena, Glue, EMR, Kinesis, and S3-based architectures.
- Familiarity with AI/ML solutions using SageMaker, Amazon Comprehend, or other ML frameworks.
- Understanding of data governance, data cataloging, and security best practices for analytics workloads.

Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in IT, with 5+ years in cloud architecture and pre-sales roles.
- AWS Certified Solutions Architect – Professional (or equivalent certification) is preferred.
- Strong presentation skills and experience interacting with CXOs, architects, and DevOps teams.
- Ability to translate technical concepts into business value propositions.
- Excellent communication, proposal writing, and stakeholder management skills.

Nice To Have
- Experience with Azure (e.g., Synapse, AKS, Azure ML) or GCP (e.g., BigQuery, Vertex AI).
- Familiarity with industry-specific solutions (e.g., fintech, healthcare, retail cloud transformations).
- Exposure to MLOps pipelines and orchestration tools such as Kubeflow, MLflow, or Airflow.
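The S3-based data lake architectures this posting references typically rely on Hive-style partition prefixes, which Athena and Glue crawlers recognize as table partitions. A minimal sketch (bucket and table names are illustrative, not from the posting):

```python
from datetime import date

def partition_prefix(table: str, d: date, bucket: str = "example-data-lake") -> str:
    """Build a Hive-style S3 prefix (year=/month=/day=) so that Athena and
    Glue can prune partitions instead of scanning the whole table."""
    return (f"s3://{bucket}/{table}/"
            f"year={d.year}/month={d.month:02d}/day={d.day:02d}/")

print(partition_prefix("events", date(2024, 1, 5)))
# s3://example-data-lake/events/year=2024/month=01/day=05/
```

Queries filtered on `year`/`month`/`day` then read only the matching prefixes, which is the main cost lever in Athena-based designs.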

Posted 5 days ago

Apply

2.0 years

0 Lacs

India

Remote


We at Arthan are hiring for a partner organization dedicated to empowering people living in poverty to improve their livelihoods, mitigate risks, and promote environmental sustainability. This organization operates at the intersection of social impact and innovation, using actionable and accessible knowledge to accelerate meaningful improvements in human welfare.

Overview of the role
As a Senior Software Engineer, you will play a key role in designing, building, and maintaining the technical backbone of our programs. Your primary focus will be developing the services and business logic that power our internal and external platforms, while also supporting broader systems and technologies as the organization evolves.

Key Responsibilities
- Augment, enhance, and refactor legacy code and infrastructure.
- Architect, design, code, and test new features and functionality.
- Work with stakeholders to drive requirements and own projects end to end.
- Analyze and improve the efficiency, scalability, and stability of servers and processes.
- Think long term and ensure continued, recursive improvement of production systems.
- Develop and iterate through proofs of concept quickly and efficiently.
- Identify unnecessary complexity and remove it.
- Deploy features and applications through DevOps pipelines.
- Maintain a positive system security posture and advise on improvements.
- Mentor and manage other team members, and communicate with the program and research teams.

What do you need to be successful in the role? (Must have)
- 2+ years of experience developing and maintaining communications applications.
- 7+ years of experience developing software (Python, AWS).
- 3+ years of experience working with Debian-based Linux systems.
- 3+ years of experience using Git, GitLab, or similar CI/CD systems.
- Ability to work with a distributed team across time zones.
- Excellent communication skills.
- Self-starter with the ability to work independently and show initiative and judgment in the absence of specific directions.
- The ability to work effectively in a remote environment, with primary communications over email, chat, and video conferencing.
- Prior experience managing and mentoring (junior) staff.
- A learning and growth mindset.

What would make you an outstanding candidate?
- Certifications and experience in information security are highly desirable.
- Experience with SQL and PostgreSQL or Amazon Redshift is a plus.
- Experience with Python, Flask, and JavaScript is necessary.
- Experience with Ansible and Docker is a plus.
- Knowledge of FreeSWITCH is a plus.
- Experience with AWS EC2, Lambda, and S3; AWS certification is strongly desirable.
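The Python/AWS service work described above often centers on Lambda-style request handlers. A minimal sketch of the handler pattern, using only the standard library; the event fields and response shape follow the common API Gateway proxy convention, and the field names are illustrative:

```python
import json

def handler(event: dict, context=None) -> dict:
    """Minimal AWS Lambda-style handler: validate the JSON body and return
    an API-Gateway-shaped response dict."""
    body = event.get("body") or "{}"
    payload = json.loads(body)
    name = payload.get("name")
    if not name:
        # Reject requests missing the required field.
        return {"statusCode": 400, "body": json.dumps({"error": "name is required"})}
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

print(handler({"body": json.dumps({"name": "Arthan"})})["statusCode"])  # 200
```

Keeping the handler a pure function of the event, as here, makes it easy to unit test without any AWS infrastructure.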

Posted 5 days ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


Position: L3 AWS Cloud Engineer
Experience: 5+ years
Location: Mumbai
Employment Type: Full-Time

Job Summary
We are seeking a highly skilled L3 AWS Cloud Engineer with 5+ years of experience to lead the design, implementation, and optimization of complex AWS cloud architectures. The candidate will have deep expertise in hybrid (on-premises to cloud) networking, AWS connectivity, and advanced AWS services such as WAF, Shield, Shield Advanced, EKS, data services, and CloudFront CDN, ensuring enterprise-grade solutions.

Key Responsibilities
- Architect and implement hybrid cloud solutions integrating on-premises and AWS environments.
- Design and manage advanced AWS networking (Direct Connect, Transit Gateway, VPN).
- Lead deployment and management of Kubernetes clusters using AWS EKS.
- Implement and optimize security solutions using AWS WAF, Shield, and Shield Advanced.
- Architect data solutions using AWS data services (Redshift, Glue, Athena).
- Optimize content delivery using AWS CloudFront and advanced CDN configurations.
- Drive automation of cloud infrastructure using IaC (CloudFormation, Terraform, CDK).
- Provide leadership in incident response, root cause analysis, and performance optimization.
- Mentor junior engineers and collaborate with cross-functional teams on cloud strategies.

Required Skills and Qualifications
- 5+ years of experience in cloud engineering, with at least 4 years focused on AWS.
- Deep expertise in hybrid networking and connectivity (Direct Connect, Transit Gateway, Site-to-Site VPN).
- Advanced knowledge of AWS EKS for container orchestration and management.
- Proficiency in AWS security services (WAF, Shield, Shield Advanced, GuardDuty).
- Hands-on experience with AWS data services (Redshift, Glue, Athena).
- Expertise in optimizing AWS CloudFront for global content delivery.
- Strong scripting skills (Python, Bash) and IaC expertise (CloudFormation, Terraform, CDK).
- Experience with advanced monitoring and analytics (CloudWatch, ELK).
- Experience with multi-region and multi-account AWS architectures.
- AWS Certified Solutions Architect – Professional.

Preferred Skills
- Knowledge of serverless frameworks and event-driven architectures.
- Familiarity with machine learning workflows on AWS (SageMaker, ML services).
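As an illustration of the IaC work this role involves, here is a sketch that generates a minimal CloudFormation template programmatically. The resource shape follows the standard AWS::S3::Bucket schema; the logical ID and bucket name are illustrative only:

```python
import json

def s3_bucket_template(bucket_name: str) -> str:
    """Render a minimal CloudFormation template (JSON) for a versioned
    S3 bucket, the kind of building block an IaC pipeline would deploy."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    # Versioning protects against accidental deletes/overwrites.
                    "VersioningConfiguration": {"Status": "Enabled"},
                },
            }
        },
    }
    return json.dumps(template, indent=2)

print(s3_bucket_template("example-logs-bucket"))
```

In practice you would feed the rendered JSON to `aws cloudformation deploy` or use CDK, which generates templates like this under the hood.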

Posted 5 days ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site


Company Description
Domax Consulting is a digital services company based in Paris that specializes in supporting businesses in their digital transformation. They offer comprehensive, personalized support from ideation to realization; cutting-edge expertise in development, cybersecurity, and digital marketing; and innovative solutions for achieving growth objectives.

Key Responsibilities
- Conceptualization & Storyboarding: Collaborate with stakeholders (e.g., directors, clients, marketing team) to understand project requirements and creative vision. Develop and present initial concepts, storyboards, and animatics for 3D video projects.
- 3D Asset Creation: Design and create high-quality 3D models (characters, environments, props) based on provided concepts or self-initiated designs. Develop realistic or stylized textures and materials for 3D assets. Rig 3D models for animation, ensuring smooth and believable movement (if applicable).
- Animation: Bring 3D models to life through fluid and expressive animation, using techniques such as keyframe animation and, potentially, motion capture data. Apply the principles of animation (timing, spacing, squash and stretch, anticipation, etc.) to enhance realism and impact. Develop camera movements and staging for compelling visual storytelling.
- Lighting & Rendering: Set up and optimize lighting to create mood, depth, and visual appeal within 3D scenes. Configure and execute high-quality renders of 3D animations.
- Post-Production & Integration: Composite 3D elements with other visual assets (live-action footage, 2D graphics) using compositing software. Integrate sound effects and synchronize dialogue as needed. Edit and fine-tune animations to ensure smooth flow and alignment with project goals. Manage and organize 3D asset files and project documentation.
- Collaboration & Communication: Work closely with other designers, artists, and developers to ensure cohesive project execution. Receive and implement feedback effectively, iterating on visuals until they meet the highest standards. Stay updated on industry trends, techniques, and emerging tools to continuously enhance skills.

Required Skills & Qualifications
- Education: Bachelor's degree in Animation, Digital Media, Fine Arts, Computer Graphics, or a related field (or equivalent practical experience).
- Experience: [X] years of professional experience as a 3D Animator, Motion Graphics Designer, or similar role with a strong focus on 3D video creation.
- Portfolio/Demo Reel: A strong portfolio or demo reel demonstrating exceptional 3D animation skills, storytelling ability, and a keen eye for detail.
- Software Proficiency (list the specific software you prefer/require):
  - 3D Animation Software: Autodesk Maya, Blender, 3ds Max, Cinema 4D, Houdini (or similar industry-standard tools).
  - Rendering Engines: V-Ray, Octane, Redshift, Arnold (if specific to your pipeline).
  - Compositing/Post-Production Software: Adobe After Effects, Nuke, DaVinci Resolve.
  - Modeling/Sculpting Software: ZBrush, Substance Painter (if modeling/texturing is a primary responsibility).
  - Video Editing Software: Adobe Premiere Pro, Final Cut Pro (for basic editing tasks).
- Artistic Skills: Strong understanding of animation principles, visual storytelling, composition, color theory, and perspective.
- Technical Skills: Knowledge of 3D modeling techniques (polygonal, digital sculpting), rigging principles, UV mapping, texturing, and lighting.
- Problem-Solving: Ability to troubleshoot technical issues related to animation software, rendering, or asset management.
- Soft Skills: Exceptional creativity and artistic vision; strong attention to detail and a commitment to quality; excellent communication and interpersonal skills for collaboration; ability to work independently and as part of a team; strong organizational and time management skills to meet deadlines; adaptability and willingness to learn new software and techniques.

Preferred (but not required) Skills
- Experience with motion capture (MOCAP) data.
- Knowledge of scripting languages (e.g., Python for Maya/Blender).
- Experience with game engines (e.g., Unity, Unreal Engine) if applicable to your projects.
- Experience with architectural visualization (ArchViz) if relevant.

Posted 5 days ago

Apply

10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Summary
We are seeking a highly experienced and customer-focused Presales Architect to join our Solution Engineering team. The ideal candidate will have a strong background in AWS IaaS, PaaS, and SaaS services, deep expertise in cloud architecture, and solid exposure to data platforms, including Amazon Redshift, AI/ML workloads, and modern data architectures. Familiarity with Azure and Google Cloud Platform (GCP) is a strong advantage. The role is a strategic blend of technical solutioning, customer engagement, and sales support, playing a critical part in the pre-sales cycle by understanding customer requirements, designing innovative solutions, and aligning them with the company's service offerings.

Key Responsibilities

Pre-Sales and Solutioning
- Engage with enterprise customers to understand their technical requirements and business objectives.
- Architect end-to-end cloud solutions on AWS, covering compute, storage, networking, DevOps, and security.
- Develop compelling solution proposals, high-level designs, and reference architectures that address customer needs.
- Support RFI/RFP responses, create technical documentation, and deliver presentations and demos to technical and non-technical audiences.
- Collaborate with Sales, Delivery, and Product teams to ensure alignment of proposed solutions with client expectations.
- Conduct technical workshops, proofs of concept (PoCs), and technical validations.

Technical Expertise
- Deep hands-on knowledge and architecture experience with AWS services:
  - IaaS: EC2, VPC, S3, EBS, ELB, Auto Scaling, etc.
  - PaaS: RDS, Lambda, API Gateway, Fargate, DynamoDB, Aurora, Step Functions.
  - SaaS & Security: AWS Organizations, IAM, AWS WAF, CloudTrail, GuardDuty.
- Understanding of multi-cloud strategies; exposure to Azure and GCP services, including hybrid architectures, is a plus.
- Strong knowledge of DevOps practices and tools such as Terraform, CloudFormation, Jenkins, and GitOps.
- Proficiency in architecting solutions that meet scalability, availability, and security requirements.

Data Platform & AI/ML
- Experience designing data lakes, data pipelines, and analytics platforms on AWS.
- Hands-on expertise in Amazon Redshift, Athena, Glue, EMR, Kinesis, and S3-based architectures.
- Familiarity with AI/ML solutions using SageMaker, Amazon Comprehend, or other ML frameworks.
- Understanding of data governance, data cataloging, and security best practices for analytics workloads.

Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in IT, with 5+ years in cloud architecture and pre-sales roles.
- AWS Certified Solutions Architect – Professional (or equivalent certification) is preferred.
- Strong presentation skills and experience interacting with CXOs, architects, and DevOps teams.
- Ability to translate technical concepts into business value propositions.
- Excellent communication, proposal writing, and stakeholder management skills.

Nice To Have
- Experience with Azure (e.g., Synapse, AKS, Azure ML) or GCP (e.g., BigQuery, Vertex AI).
- Familiarity with industry-specific solutions (e.g., fintech, healthcare, retail cloud transformations).
- Exposure to MLOps pipelines and orchestration tools such as Kubeflow, MLflow, or Airflow.

Posted 5 days ago

Apply

20.0 years

0 Lacs

Thane, Maharashtra, India

On-site


Who is Forcepoint?
Forcepoint simplifies security for global businesses and governments. Forcepoint's all-in-one, truly cloud-native platform makes it easy to adopt Zero Trust and prevent the theft or loss of sensitive data and intellectual property no matter where people are working. 20+ years in business. 2.7k employees. 150 countries. 11k+ customers. 300+ patents. If our mission excites you, you're in the right place; we want you to bring your own energy to help us create a safer world. All we're missing is you!

Senior Software Engineer – Dashboarding, Reporting & Data Analytics
Location: Mumbai (preferred)
Experience: 8-10 years

Job Summary
We are looking for a Senior Software Engineer with expertise in dashboarding, reporting applications, and data analytics. The ideal candidate should have strong programming skills in Golang and Java, experience with AWS services such as Kinesis, Redshift, and Elasticsearch, and the ability to build scalable, high-performance data pipelines and visualization tools. This role is critical to delivering the data-driven insights that help businesses make informed decisions.

Key Responsibilities
- Design, develop, and maintain dashboards and reporting applications for real-time and batch data visualization.
- Build data pipelines and analytics solutions leveraging services such as Kafka, Redshift, Elasticsearch, Glue, and S3.
- Work with data engineering teams to integrate structured and unstructured data for meaningful insights.
- Optimize data processing workflows for efficiency and scalability.
- Develop APIs and backend services using Golang and Java to support reporting and analytics applications.
- Collaborate with business stakeholders to gather requirements and deliver customized reports and visualizations.
- Implement data security, governance, and compliance best practices.
- Conduct code reviews, troubleshooting, and performance tuning.
- Stay updated with emerging data analytics and cloud technologies to drive innovation.
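The real-time dashboard tiles described above typically display windowed aggregations over a metric stream. The posting calls for Golang/Java; the idea is sketched here in Python for brevity, with a hypothetical metric:

```python
from collections import deque

class RollingAverage:
    """Fixed-size sliding window over a metric stream, the kind of
    aggregation a real-time dashboard tile might display."""

    def __init__(self, size: int):
        # deque with maxlen drops the oldest sample automatically.
        self.window = deque(maxlen=size)

    def add(self, value: float) -> float:
        """Ingest one sample and return the current windowed average."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

ra = RollingAverage(size=3)
for v in [10, 20, 30, 40]:
    latest = ra.add(v)
print(latest)  # average of the last 3 samples: (20 + 30 + 40) / 3 = 30.0
```

A production pipeline would compute this incrementally (subtracting the evicted sample) and per key, e.g. per tenant or per event type consumed from Kinesis/Kafka.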
Required Qualifications
- 8-10 years of experience in software development, data analytics, and dashboarding/reporting applications.
- Proficiency in Golang and Java for backend development.
- Strong expertise in AWS data services (Kinesis, Redshift, Elasticsearch, S3, Glue).
- Experience with data visualization tools (Grafana, Tableau, Looker, or equivalent).
- Proficiency in SQL and NoSQL databases, with a solid understanding of data modeling and performance optimization.
- Experience building and managing scalable data pipelines.
- Experience with big data processing frameworks (Spark, Flink).
- Strong problem-solving skills with a focus on efficiency and performance.
- Excellent communication and collaboration skills.

Preferred Qualifications
- Experience with real-time data streaming and event-driven architectures.
- Experience with CI/CD pipelines and DevOps practices.

Don't meet every single qualification? Studies show people are hesitant to apply if they don't meet all the requirements listed in a job posting. Forcepoint is focused on building an inclusive and diverse workplace, so if there is something slightly different about your previous experience but it otherwise aligns and you're excited about this role, we encourage you to apply. You could be a great candidate for this or other roles on our team.

The policy of Forcepoint is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to affirmatively seek to advance the principles of equal employment opportunity. Forcepoint is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.

If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by sending an email to recruiting@forcepoint.com.

Applicants must have the right to work in the location to which they have applied.

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


P1-C3-STS
We are seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources.

Responsibilities
- Optimize data models for performance and efficiency.
- Write SQL queries to support data analysis and reporting.
- Design, implement, and maintain the data architecture for all AWS data services.
- Work with stakeholders to identify business needs and requirements for data-related projects.
- Design and implement ETL processes to load data into the data warehouse.
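The ETL-into-a-warehouse work this posting describes can be sketched end to end in a few lines. SQLite stands in for RDS/Redshift here so the example is self-contained; the table, columns, and sample rows are all illustrative:

```python
import sqlite3

# Extract: raw records as they might arrive from a source system,
# with amounts still encoded as strings.
raw = [("2024-01-01", "ORD-1", "150.00"),
       ("2024-01-01", "ORD-2", "20.00")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_date TEXT, order_id TEXT, amount REAL)")

# Transform: cast string amounts to floats before loading.
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(d, oid, float(amt)) for d, oid, amt in raw])

# A reporting query of the kind the role calls for.
total, = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
print(total)  # 170.0
```

On AWS the same shape appears as Glue jobs (transform) loading into Redshift, with Athena or Lambda running the reporting SQL.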

Posted 5 days ago

Apply

1.0 years

0 Lacs

India

On-site


Our Mission
6sense is on a mission to revolutionize how B2B organizations create revenue by predicting the customers most likely to buy and recommending the best course of action to engage anonymous buying teams. 6sense Revenue AI is the only sales and marketing platform that unlocks the ability to create, manage, and convert high-quality pipeline to revenue.

Our People
People are the heart and soul of 6sense. We serve with passion and purpose. We live by our Being 6sense values of Accountability, Growth Mindset, Integrity, Fun, and One Team. Every 6sensor plays a part in defining the future of our industry-leading technology. 6sense is a place where difference-makers roll up their sleeves, take risks, act with integrity, and measure success by the value we create for our customers. We want 6sense to be the best chapter of your career.

About the Role
The Data Consultant role, within the Professional Services department, is a consultant-and-analyst position responsible for making strategic data-related decisions by analyzing, manipulating, tracking, managing, and reporting data. The position requires working with client partners, the customer success team, and the engineering team on data usage and data initiatives. It suits someone with a solid understanding of data and technology who has experience troubleshooting data issues and implementing solutions to mitigate them. Additionally, the data consultant is responsible for discovering new potential in existing data and making recommendations based on data analysis. The candidate will primarily support the adoption and growth of the 6sense platform that our customers use every day.

Responsibilities
- Provide consultative guidance on data management and governance within the company as well as for clients.
- Maintain a detailed understanding of the 6sense product architecture, technical components, and application functionality.
- Understand clients' data design and architecture, and implement integration strategies with clients.
- Perform analysis to assess the quality and meaning of data.
- Use SQL queries to analyze, extract, and transform data.
- Prepare reports for clients and internal teams describing trends, patterns, and predictions based on relevant data.
- Collaborate with stakeholders across departments to address gaps and pain points in the data journey.
- Forge strong relationships with client leadership, analysts, and other stakeholders to grasp their data-related requirements.
- Conduct in-depth analysis of current trends and products in data visualization and BI tooling.
- Design and maintain data systems and databases, including fixing coding errors and other data-related problems.
- Consolidate analyses and create reports that make business impacts.

Qualifications
- A Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, Engineering, or a related field, along with approximately 1-2 years of experience as an analyst or data engineer.
- 2+ years in data management, data analytics, or related roles.
- Working experience with at least one RDBMS data store such as Oracle, MySQL, or Redshift.
- Working knowledge of Python or shell scripting is preferable.
- Experience working with Presto and the Hadoop ecosystem, including Hive.
- Strong data mining skills: analyzing big data, finding patterns and trends, and providing insights to customers.
- Experience with B2B CRM and marketing applications such as SFDC, Microsoft Dynamics, Eloqua, and Marketo.
- Experience with data visualization tools such as Tableau or Power BI.
- Excellent communication skills.
- Proven ability to develop clear and concise written technical and non-technical documentation.
- Strong verbal presentation skills.
- Good to have: knowledge of ML and analytical/predictive models.

Our Benefits
Full-time employees can take advantage of health coverage, paid parental leave, generous paid time off and holidays, quarterly self-care days off, and stock options. We'll make sure you have the equipment and support you need to work and connect with your teams, at home or in one of our offices. We have a growth-mindset culture that is represented in all that we do, from onboarding through to numerous learning and development initiatives, including access to our LinkedIn Learning platform. Employee well-being is also top of mind for us. We host quarterly wellness education sessions to encourage self-care and personal growth. From wellness days to ERG-hosted events, we celebrate and energize all 6sense employees and their backgrounds.

Equal Opportunity Employer
6sense is an Equal Employment Opportunity and Affirmative Action employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status.

If you require reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please direct your inquiries to jobs@6sense.com.

We are aware of recruiting impersonation attempts that are not affiliated with 6sense in any way. All email communications from 6sense will originate from the @6sense.com domain. We will not initially contact you via text message and will never request payments. If you are uncertain whether you have been contacted by an official 6sense employee, reach out to jobs@6sense.com.
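The SQL-based trend analysis this role centers on can be sketched concretely. SQLite stands in for the Oracle/MySQL/Redshift stores named in the posting; the schema and values are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE engagement (account TEXT, month TEXT, visits INTEGER)")
conn.executemany("INSERT INTO engagement VALUES (?, ?, ?)", [
    ("acme", "2024-01", 12), ("acme", "2024-02", 30),
    ("globex", "2024-01", 7), ("globex", "2024-02", 5),
])

# Aggregate per account to surface which accounts are most engaged --
# the kind of pattern-finding query a Data Consultant would run.
rows = conn.execute("""
    SELECT account, SUM(visits) AS total
    FROM engagement
    GROUP BY account
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 42), ('globex', 12)]
```

The same GROUP BY shape, extended with month-over-month deltas or window functions, is what feeds the trend reports described above.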

Posted 5 days ago

Apply

130.0 years

6 - 9 Lacs

Hyderābād

On-site


Job Description Our company is an innovative, global healthcare leader that is committed to improving health and well-being around the world with a diversified portfolio of prescription medicines, vaccines and animal health products. We continue to focus our research on conditions that affect millions of people around the world - diseases like Alzheimer's, diabetes and cancer - while expanding our strengths in areas like vaccines and biologics. Our ability to excel depends on the integrity, knowledge, imagination, skill, diversity and teamwork of an individual like you. To this end, we strive to create an environment of mutual respect, encouragement and teamwork. As part of our global team, you’ll have the opportunity to collaborate with talented and dedicated colleagues while developing and expanding your career. As a Digital Supply Chain Data Modeler/Engineer, you will work as a member of the Digital Manufacturing Division team supporting Enterprise Orchestration Platform. You will be responsible for identifying, assessing, and solving complex business problems related to manufacturing and supply chain. You will receive training to achieve this, and you’ll be amazed at the diversity of opportunities to develop your potential and grow professionally. You will collaborate with business stakeholders and determine analytical capabilities that will enable the creation of Insights-focused solutions that align to business needs and ensure that delivery of these solutions meet quality requirements. The Opportunity Based in Hyderabad, joining a global healthcare biopharma company and be part of a 130- year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. 
Be a part of a team with passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Job Description

As Data Modeler Lead, you will be responsible for the following:

- Deliver divisional analytics initiatives with a primary focus on data modeling for all analytics, advanced analytics, and AI/ML use cases, e.g. self-service, business intelligence & analytics, data exploration, data wrangling, etc.
- Host and lead requirement/process workshops to understand data modeling requirements.
- Analyze business requirements and work with the architecture team to deliver and contribute to feasibility analyses, implementation plans, and high-level estimates.
- Based on business processes and analysis of data sources, deliver detailed ETL designs with data model mappings covering all areas of data warehousing for all analytics use cases.
- Create data models and transformation mappings in a modeling tool and deploy them in databases, including creation of scheduled orchestration jobs.
- Deploy data modeling configurations to target systems (SIT, UAT & Prod).
- Understand product ownership and management; lead the data model as a product for focus areas of the digital supply chain domain.
- Create required SDLC documentation as per project requirements.
- Optimize/industrialize existing database and data transformation solutions.
- Prepare and update data modeling and data warehousing best practices along with foundational platforms.
- Work very closely with foundational product teams, business, vendors, and technology support teams to deliver business initiatives.

Position Qualifications:

Education Minimum Requirement: B.S. or M.S. in IT, Engineering, Computer Science, or related field.

Required Experience and Skills:
- 5+ years of relevant work experience, with demonstrated expertise in data modeling in DWH, Data Mesh, or other analytics-related implementations; experience in implementing end-to-end DWH solutions, from designing the DWH to deploying the solution.
- 3+ years of experience in creating logical and physical data models in a modeling tool (SAP PowerDesigner, WhereScape, etc.).
- Experience in creating data modeling standards, best practices, and implementation processes.
- High proficiency in information management, data analysis, and reporting requirement elicitation.
- Experience extracting business rules to develop transformations, data lineage, and dimensional data models.
- Experience validating legacy and newly developed data model outputs.
- Development experience using WhereScape and similar ETL/data modeling tools.
- Exposure to Qlik or similar BI dashboarding applications.
- Advanced knowledge of SQL and data transformation practices.
- Deep understanding of data modeling and preparation of optimal data structures.
- Able to communicate with business, data transformation, and reporting teams.
- Knowledge of ETL methods and a willingness to learn ETL technologies.
- Fluent communication in English.
- Experience in Redshift or similar databases, including DDL, DML, query optimization, schema management, security, etc.
- Experience with Airflow or similar orchestration tools.
- Exposure to CI/CD tools.
- Exposure to AWS modules such as S3, AWS Console, Glue, Spectrum, etc.
- Able to independently support business discussions and analyze, develop, and deliver code.

Preferred Experience and Skills:
- Experience working on projects where Agile methodology is leveraged.
- Understanding of data management best practices and data analytics.
- Ability to lead requirements sessions with clients and project teams.
- Strong leadership, verbal, and written communication skills with the ability to articulate results and issues to internal and client teams.
- Demonstrated experience in the Life Science space.
- Exposure to SAP and Rapid Response domain data is a plus.

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully

Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities.
All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Agile Data Warehousing, Agile Methodology, Animal Vaccination, Business, Business Communications, Business Initiatives, Business Intelligence (BI), Computer Science, Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Data Warehousing (DW), Design Applications, Digital Supply Chain, Digital Supply Chain Management, Digital Transformation, Information Management, Information Technology Operations, Software Development, Software Development Life Cycle (SDLC), Supply Chain Optimization, Supply Management, System Designs Preferred Skills: Job Posting End Date: 07/31/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R352794

Posted 5 days ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Role Purpose

The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do

- Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing the current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
- Strategize & implement disaster recovery plans and create and implement backup and recovery plans
- Manage the day-to-day operations of the tower by troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues
- Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges
- Develop a shift roster for the team to ensure no disruption in the tower
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t. progress, updates, status, and next steps
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management

Resourcing
- Forecast talent requirements as per the current and future business needs
- Hire adequate and right resources for the team
- Train direct reportees to make right recruitment and selection decisions

Talent Management
- Ensure 100% compliance to Wipro’s standards of adequate onboarding and training for team members to enhance capability & effectiveness
- Build an internal talent pool of HiPos and ensure their career progression within the organization
- Promote diversity in leadership positions

Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports
- Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to their and their levels below

Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
- Exercise employee recognition and appreciation

Deliver

No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2 | New projects | Timely delivery; avoid unauthorised changes; no formal escalations

Mandatory Skills: Amazon Redshift
Experience: 8-10 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 5 days ago

Apply

5.0 - 7.0 years

3 - 6 Lacs

Hyderābād

On-site


We’re Hiring: Consultant – Insights & Analytics at Chryselys Location: Hyderabad Department: Insights & Analytics Job Type: Full-time Reports To: Manager About Us: Chryselys is a Pharma Analytics & Business consulting company that delivers data-driven insights leveraging AI-powered, cloud-native platforms to achieve high-impact transformations. We specialize in digital technologies and advanced data science techniques that provide strategic and operational insights. Who we are: People - Our team of industry veterans, advisors and senior strategists have diverse backgrounds and have worked at top tier companies. Quality - Our goal is to deliver the value of a big five consulting company without the big five cost. Technology - Our solutions are Business centric built on cloud native technologies. Role Overview: As a Field Force Operations Consultant at Chryselys, you will leverage your expertise in commercial model design, sales force sizing, territory alignment, and deployment to optimize field force operations and processes. You will work closely with cross-functional teams, including client stakeholders and analytics experts, to define execution KPIs, maximize sales impact, and deliver actionable insights through advanced reporting and dashboards. Your role will also involve segmentation and targeting, incentive compensation processes, and planning for call activities and non-personal promotions. With hands-on experience in tools like Qlik, Power BI, and Tableau, along with technologies such as SQL, you will ensure impactful storytelling and effective stakeholder management while supporting clients across the U.S. and Europe. 
Key Responsibilities:

- Capabilities and experience in field force operations and processes related to commercial model design and structure, sales force sizing and optimization, territory alignment and deployment
- Good understanding of commercial operations and analytics as a domain
- Expertise with SF/FF datasets for creating dashboards and reports for multiple user personas
- Ability to define FF execution and measurement KPIs to maximize sales impact
- Understanding and expertise in call activity planning and non-personal promotions
- Good knowledge of segmentation & targeting and incentive compensation processes
- Hands-on experience with tools like Qlik/Power BI/Tableau and technologies like Python/SQL
- Stakeholder management abilities and storytelling skills
- Experience in working with pharma clients across the US and Europe

What You Bring:

Education: Bachelor's or Master's degree in data science, statistics, computer science, engineering, or a related field with a strong academic record.

Experience: 5-7 years of experience in field force operations, particularly in the pharmaceutical or healthcare industry, working with key datasets.

Skills:
- Strong experience with SQL and cloud-based data processing environments such as AWS (Redshift, Athena, S3)
- Demonstrated ability to build data visualizations and communicate insights through tools like Power BI, Tableau, Qlik, QuickSight, or similar
- Strong analytical skills, with experience in analogue analysis
- Ability to manage multiple projects, prioritize tasks, and meet deadlines in a fast-paced environment
- Excellent communication and presentation skills, with the ability to explain complex data science concepts to non-technical stakeholders
- A strong problem-solving mindset, with the ability to adapt and innovate in a dynamic consulting environment

How to Apply: Ready to make an impact?
Apply now by clicking [here] or visit our careers page at https://chryselys.com/chryselys-career Please include your resume and a cover letter detailing why you’re the perfect fit for this role. Equal Employment Opportunity: Chryselys is proud to be an Equal Employment Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Connect with Us: Follow us for updates and more opportunities: https://linkedin.com/company/chryselys/mycompany Discover more about our team and culture: http://chryselys.com
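To illustrate the kind of SQL-driven field force KPI work the listing describes, here is a minimal sketch. The table, column names, and figures are hypothetical, and SQLite stands in for the Redshift/Athena environments named above, so the SQL shape is what matters, not the engine:

```python
import sqlite3

# Hypothetical schema: one row per sales-rep call on a healthcare provider (HCP).
# In practice this data would live in Redshift/Athena; sqlite3 stands in here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (territory TEXT, rep TEXT, hcp_id TEXT, calls INTEGER)")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?, ?, ?)",
    [
        ("North", "rep1", "hcp01", 4),
        ("North", "rep2", "hcp02", 2),
        ("South", "rep3", "hcp03", 5),
    ],
)

# A typical field-force execution KPI: total call volume and HCP reach per territory.
rows = conn.execute(
    """
    SELECT territory,
           SUM(calls)             AS total_calls,
           COUNT(DISTINCT hcp_id) AS hcps_reached
    FROM calls
    GROUP BY territory
    ORDER BY territory
    """
).fetchall()
print(rows)  # [('North', 6, 2), ('South', 5, 1)]
```

Aggregates like these typically feed the Qlik/Power BI/Tableau dashboards mentioned in the responsibilities.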

Posted 5 days ago

Apply

6.0 years

0 Lacs

Thiruvananthapuram

On-site


6 - 8 Years | 1 Opening | Trivandrum

Role description

Job Title: Senior Business Analyst Experience: 6+ Years Location: Trivandrum / Kochi / Bangalore Work Mode: Hybrid (3 days from office) Work Timings: 5 PM to 2 AM IST (to align with US business teams)

Job Description: We are looking for a highly skilled Senior Business Analyst with a strong background in requirement gathering, stakeholder management, and technical collaboration. The candidate must be comfortable working closely with US-based clients and managing end-to-end business analysis and solution design.

Roles & Responsibilities:
- Elicit, analyze, and document clear, concise, and detailed business requirements through interaction with business stakeholders.
- Convert project scope into detailed user stories, wireframes, prototypes, and process flows to support requirement clarification and approvals.
- Collaborate with IT development teams to ensure proper understanding and implementation of requirements.
- Coordinate with Quality Assurance teams to define and execute project testing strategies and plans.
- Ensure business requirements traceability to technical requirements and validate system designs.
- Lead User Acceptance Testing (UAT) efforts and ensure solutions meet business expectations.
- Engage with clients and vendors to manage development, communication, and execution activities effectively.
- Analyze data using SQL and extract insights to define or support business rules and technical solutions.
- Document all business rules and technical mappings using tools like JIRA and Confluence.

Mandatory Skills:
- Minimum 6 years of IT experience with 4+ years in business analysis and requirement gathering.
- Minimum 5 years of experience working directly with development teams.
- Strong experience in writing SQL queries and data analysis.
- Expertise in using JIRA, Confluence, and Agile methodologies.
- Strong communication, consulting, and interpersonal skills with senior stakeholder engagement.
- Proven experience in managing client/vendor relationships and leading cross-functional teams.
- Experience in leading UAT planning and execution.
- Comfortable working in a US time zone (5 PM – 2 AM IST).

Good to Have Skills:
- Exposure to cloud database technologies like Snowflake, Redshift, Oracle.
- Working knowledge of AWS or similar cloud platforms.
- Experience applying Design Thinking principles.
- Ability to develop and validate hypotheses to support business insights.
- Experience in delivering under tight deadlines in hybrid working models.

Skills: Business Analysis, Requirement Gathering, SQL

About UST

UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
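The UAT leadership and SQL analysis responsibilities above often meet in data validation checks. As a hedged illustration (made-up table and column names, SQLite standing in for whatever database a real project uses), a typical UAT query verifies that every source record landed in the migrated target table:

```python
import sqlite3

# Hypothetical UAT reconciliation check: find source rows missing from the
# target after a migration. Schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
""")

# Anti-join: rows present in the source but absent from the target.
missing = conn.execute("""
    SELECT s.order_id
    FROM src_orders s
    LEFT JOIN tgt_orders t ON t.order_id = s.order_id
    WHERE t.order_id IS NULL
""").fetchall()
print(missing)  # [(3,)] -> order 3 never reached the target
```

Results like these would typically be logged against the requirement in JIRA and tracked through the traceability matrix.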

Posted 5 days ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai

On-site


Job Description: An FX Artist creates stunning and realistic visual effects for films, television, and other media projects. They are responsible for simulating natural phenomena (e.g., fire, water, smoke, explosions) and creating dynamic, visually compelling elements using software like Houdini, Maya, and After Effects. Working closely with animators, designers, and technical teams, they ensure the effects integrate seamlessly into the project’s creative vision while meeting technical requirements. Strong problem-solving skills, artistic creativity, and a deep understanding of physics and fluid dynamics are essential. Collaborate effectively with creative Head, VFX/DFX supervisors and lead(s) to incorporate feedback and revisions into grooming work. Key Responsibilities: Effect Creation: Design and create natural or stylized effects such as fire, smoke, water, explosions, and environmental phenomena. Simulation Work: Develop simulations for destruction, particle systems, and other procedural effects using tools like Houdini, Maya, or Blender. Integration: Work with Lighting and Compositing teams to integrate effects into live-action footage or CG renders. Cloth and Fur Simulations: Create realistic or stylized cloth, fur, and hair dynamics using tools like nCloth, XGen, Houdini, or Marvelous Designer. Deformation Corrections: Address secondary character motions, such as jiggle, muscle simulation, and skin sliding. Collaboration with Rigging and Animation Teams: Work closely with animators and riggers to ensure character effects enhance performance while maintaining technical accuracy. Required Skills: Proficiency in industry-standard software such as Houdini, Maya, Blender, Nuke, and other simulation tools. Strong understanding of physics-based simulations (fluids, particles, soft/rigid body dynamics). Knowledge of scripting languages (Python, MEL, VEX) for automation and customization. Experience with rendering engines Arnold, Redshift, Mantra, or similar. 
Strong sense of motion, scale, and detail to create believable effects. Understanding of color, light, and composition to ensure seamless integration. Experience Minimum of 4-6 years of experience in a similar role within a VFX studio or production environment. A portfolio demonstrating expertise in creating high-quality grooms for feature film, episodics, commercials.
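The listing asks for scripting (Python/MEL/VEX) and an understanding of physics-based simulation. As a toy illustration only, the core idea behind most particle solvers is numerical integration of motion; this sketch advances a single particle under gravity with explicit Euler steps (production FX work would of course rely on Houdini or Maya solvers, not hand-rolled code):

```python
# Toy physics sketch: one particle falling under gravity, advanced with
# explicit Euler integration. Constants and step size are illustrative.
GRAVITY = -9.81  # m/s^2, acting along the y axis
DT = 0.1         # seconds per simulation step

def simulate(pos_y, vel_y, steps):
    """Advance the particle's height and vertical velocity `steps` times."""
    for _ in range(steps):
        vel_y += GRAVITY * DT   # accumulate acceleration into velocity
        pos_y += vel_y * DT     # move by the updated velocity
    return pos_y, vel_y

pos, vel = simulate(pos_y=10.0, vel_y=0.0, steps=10)
print(pos, vel)
```

The same integrate-and-update loop generalizes to the particle systems, cloth, and rigid-body dynamics mentioned above, just with more forces and constraints per step.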

Posted 5 days ago

Apply

5.0 years

7 - 15 Lacs

Ahmedabad

On-site


We are accepting applications for an experienced Data Engineer with a strong background in data scraping, cleaning, transformation, and automation. The ideal candidate will be responsible for building robust data pipelines, maintaining data integrity, and generating actionable dashboards and reports to support business decision-making.

Key Responsibilities:
- Develop and maintain scripts for scraping data from various sources including APIs, websites, and databases.
- Perform data cleaning, transformation, and normalization to ensure consistency and usability across all data sets.
- Design and implement relational and non-relational data tables and frames for scalable data storage and analysis.
- Build automated data pipelines to ensure timely and accurate data availability.
- Create and manage interactive dashboards and reports using tools such as Power BI, Tableau, or similar platforms.
- Write and maintain data automation scripts to streamline ETL (Extract, Transform, Load) processes.
- Ensure data quality, governance, and compliance with internal and external regulations.
- Monitor and optimize the performance of data workflows and pipelines.

Qualifications & Skills:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Minimum of 5 years of experience in a data engineering or similar role.
- Proficient in Python (especially for data scraping and automation), with strong hands-on experience with Pandas, NumPy, and other data manipulation libraries.
- Experience with web scraping tools and techniques (e.g., BeautifulSoup, Scrapy, Selenium).
- Strong SQL skills and experience working with relational databases (e.g., PostgreSQL, MySQL) and data warehouses (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with data visualization tools like Power BI, Tableau, or Looker.
- Knowledge of ETL tools and orchestration frameworks such as Apache Airflow, Luigi, or Prefect.
Experience with version control systems like Git and collaborative platforms like Jira or Confluence . Strong understanding of data security, privacy , and governance best practices. Excellent problem-solving skills and attention to detail. Preferred Qualifications: Experience with cloud platforms such as AWS, GCP, or Azure. Familiarity with NoSQL databases like MongoDB, Cassandra, or Elasticsearch. Understanding of CI/CD pipelines and DevOps practices related to data engineering. Job Type: Full-Time (In-Office) Work Days: Monday to Saturday Job Types: Full-time, Permanent Pay: ₹700,000.00 - ₹1,500,000.00 per year Schedule: Day shift Work Location: In person
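The scrape-then-normalize workflow described above can be sketched with just the standard library. This is a minimal, hypothetical example: the HTML snippet and field names are made up, and real pipelines would more likely use BeautifulSoup or Scrapy as listed in the qualifications:

```python
from html.parser import HTMLParser

# Stdlib stand-in for the parse step of a scraping pipeline: pull the text
# out of <td> cells. Production code would typically use BeautifulSoup.
class CellParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.values = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.values.append(data.strip())

html = "<table><tr><td>widget</td><td>19.99</td></tr></table>"
parser = CellParser()
parser.feed(html)

# Normalization step: pair up name/price cells and coerce types so the
# records are consistent before loading them anywhere.
records = [
    {"name": n, "price": float(p)}
    for n, p in zip(parser.values[::2], parser.values[1::2])
]
print(records)  # [{'name': 'widget', 'price': 19.99}]
```

In a full pipeline the `records` list would then flow into the cleaning/transformation stage (e.g. a Pandas DataFrame) before loading to a warehouse.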

Posted 5 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


JD for Data Science:

We are seeking an experienced Data Scientist to join our growing analytics and AI team. This role will involve working closely with cross-functional teams to deliver actionable insights, build predictive models, and drive data-driven decision-making across the organization. The ideal candidate is someone who combines strong analytical skills with hands-on experience in statistical modeling, machine learning, and data engineering best practices.

Key Responsibilities:
- Understand business problems and translate them into data science solutions.
- Build, validate, and deploy machine learning models for prediction, classification, clustering, etc.
- Perform deep-dive exploratory data analysis and uncover hidden insights.
- Work with large, complex datasets from multiple sources; perform data cleaning and preprocessing.
- Design and run A/B tests and experiments to validate hypotheses.
- Collaborate with data engineers, business analysts, and product managers to drive initiatives from ideation to production.
- Present results and insights to non-technical stakeholders in a clear, concise manner.
- Contribute to the development of reusable code libraries, templates, and documentation.

Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field.
- 3–7 years of hands-on experience in data science, machine learning, or applied statistics.
- Proficiency in Python or R, and hands-on experience with libraries such as scikit-learn, pandas, numpy, XGBoost, TensorFlow/PyTorch.
- Solid understanding of machine learning algorithms, statistical inference, and data mining techniques.
- Strong SQL skills; experience working with large-scale databases (e.g., Snowflake, BigQuery, Redshift).
- Experience with data visualization tools like Power BI, Tableau, or Plotly.
- Working knowledge of cloud platforms like AWS, Azure, or GCP is preferred.
- Familiarity with MLOps tools and model deployment best practices is a plus.

Preferred Qualifications:
- Exposure to time series analysis, NLP, or deep learning techniques.
- Experience working in domains like healthcare, fintech, retail, or supply chain.
- Understanding of version control (Git) and Agile development methodologies.

Why Join Us:
- Opportunity to work on impactful, real-world problems.
- Be part of a high-performing and collaborative team.
- Exposure to cutting-edge technologies in data and AI.
- Career growth and continuous learning environment.
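One concrete piece of the A/B testing responsibility mentioned above is the significance test behind an experiment readout. A hedged sketch with made-up conversion counts, hand-rolling the two-proportion z-test in pure Python (real projects would typically reach for scipy.stats or statsmodels):

```python
from math import sqrt, erf

# Two-proportion z-test on hypothetical A/B conversion data.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converted 15% vs 12% for A on 1000 users each (made-up numbers).
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(round(z, 3), round(p, 4))
```

Here the lift is borderline significant at the conventional 5% level, which is exactly the kind of result a data scientist would need to present carefully to non-technical stakeholders.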

Posted 5 days ago

Apply

3.0 years

0 Lacs

Jaipur

On-site


Data Engineer Role Category: Programming & Design Job Location: Jaipur, Rajasthan on-site Experience Required: 3–6 Years About the Role We are looking for a highly skilled and motivated Data Engineer to join our team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that supports analytics, machine learning, and business intelligence initiatives across the company. Key Responsibilities Design, develop, and maintain robust ETL/ELT pipelines to ingest and process data from multiple sources. Build and maintain scalable and reliable data warehouses, data lakes, and data marts. Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver solutions. Ensure data quality, integrity, and security across all data systems. Optimize data pipeline performance and troubleshoot issues in a timely manner. Implement data governance and best practices in data management. Automate data validation, monitoring, and reporting processes. Required Skills and Qualifications Bachelor's or Master’s degree in Computer Science, Engineering, Information Systems, or related field. Proven experience (X+ years) as a Data Engineer or similar role. Strong programming skills in Python, Java, or Scala. Proficiency with SQL and working knowledge of relational databases (e.g., PostgreSQL, MySQL). Hands-on experience with big data technologies (e.g., Spark, Hadoop). Familiarity with cloud platforms such as AWS, GCP, or Azure (e.g., S3, Redshift, BigQuery, Data Factory). Experience with orchestration tools like Airflow or Prefect. Knowledge of data modeling, warehousing, and architecture design principles. Strong problem-solving skills and attention to detail.
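The ETL/ELT responsibility above can be sketched end to end in a few lines. This is a minimal, hypothetical example: the CSV payload, table name, and column names are invented, and sqlite3 stands in for the warehouses named in the listing (Redshift/BigQuery); orchestration tools like Airflow would schedule steps like these as tasks:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: one malformed row to exercise the cleaning step.
SOURCE_CSV = (
    "ts,user_id,amount\n"
    "2024-01-01,u1,10.5\n"
    "2024-01-02,u2,notanumber\n"
    "2024-01-03,u1,4.5\n"
)

def extract(raw):
    """Extract: read the raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: coerce types and drop malformed records."""
    clean = []
    for row in rows:
        try:
            clean.append((row["ts"], row["user_id"], float(row["amount"])))
        except ValueError:
            continue  # in a real pipeline, quarantine for data-quality review
    return clean

def load(conn, rows):
    """Load: append clean rows into the warehouse fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_payments (ts TEXT, user_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO fact_payments VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(SOURCE_CSV)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_payments").fetchone()
print(total)  # (2, 15.0) -> the bad row was dropped
```

Each function maps naturally onto a task in an Airflow or Prefect DAG, which is how the "orchestration tools" requirement connects to the pipeline code itself.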

Posted 5 days ago

Apply

5.0 years

0 Lacs

Andhra Pradesh

Remote


Primary Skills: Data Engineer. JD: Responsible for leading the team while also being able to get his/her hands dirty by writing code and contributing to any part of the development lifecycle. AWS, Redshift, EMR, cloud ETL tools, S3. We are looking for a Senior Consultant with at least 5 years of experience to join our team. The ideal candidate should have strong leadership skills and be able to lead a team effectively. At the same time, the candidate should also be willing to get their hands dirty by writing code and contributing to the development lifecycle. The primary skills required for this role include expertise in AWS Redshift and AWS Native Services, as well as experience with EMR, cloud ETL tools, and S3. The candidate should have a strong understanding of these tools and be able to utilize them effectively to meet project goals. As a Senior Consultant, the candidate will be responsible for providing guidance and support to the team, as well as ensuring the successful completion of projects. This is a hybrid work mode position, requiring the candidate to work both remotely and in the office as needed. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 5 days ago

Apply

0.0 - 2.0 years

0 Lacs

India

On-site


We’re Hiring: Data Engineer About The Job Duration: 12 Months Location: PAN INDIA Timings: Full Time (As per company timings) Notice Period: within 15 days or immediate joiner Experience: 0-2 years Responsibilities Job Description Design, develop and maintain reliable automated data solutions based on the identification, collection and evaluation of business requirements, including but not limited to data models, database objects, stored procedures and views. Develop new and enhance existing data processing (Data Ingest, Data Transformation, Data Store, Data Management, Data Quality) components. Support and troubleshoot the data environment (including periodically on call). Document technical artifacts for developed solutions. Good interpersonal skills; comfort and competence in dealing with different teams within the organization. Requires an ability to interface with multiple constituent groups and build sustainable relationships. Versatile, creative temperament, ability to think out-of-the-box while defining sound and practical solutions.
Ability to master new skills. Proactive approach to problem solving with effective influencing skills. Familiar with Agile practices and methodologies.

Education And Experience Requirements
- Four-year degree in Information Systems, Finance / Mathematics, Computer Science or similar
- 0-2 years of experience in Data Engineering

Required Knowledge, Skills, or Abilities
- Advanced SQL queries, scripts, stored procedures, materialized views, and views
- Focus on ELT to load data into the database and perform transformations in the database
- Ability to use analytical SQL functions
- Snowflake experience a plus
- Cloud Data Warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modeling, analysis, programming
- Experience with DevOps models utilizing a CI/CD tool
- Work in a hands-on cloud environment on the Azure Cloud Platform (ADLS, Blob)
- Talend, Apache Airflow, Azure Data Factory, and BI tools like Tableau preferred
- Analyze data models

We are looking for a Senior Data Engineer for the Enterprise Data Organization to build and manage data pipelines (data ingest, data transformation, data distribution, quality rules, data storage, etc.) for an Azure cloud-based data platform. The candidate must possess strong technical, analytical, programming and critical thinking skills.
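The "analytical SQL functions" requirement refers to window functions, which do calculations across related rows without collapsing them the way GROUP BY does. A small sketch with an invented orders table, using SQLite for portability (the same SQL shape runs on Snowflake, Azure DW, or Redshift):

```python
import sqlite3

# Hypothetical orders table to demonstrate window (analytical) functions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 50),
        ('alice', '2024-02-01', 75),
        ('bob',   '2024-01-15', 20);
""")

# ROW_NUMBER gives each customer's order sequence; SUM ... OVER with an
# ORDER BY inside the window produces a per-customer running total.
rows = conn.execute("""
    SELECT customer, order_date,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY order_date) AS order_seq,
           SUM(amount)  OVER (PARTITION BY customer ORDER BY order_date) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()
for row in rows:
    print(row)
```

Note that every input row survives in the output, each annotated with its sequence number and running total; that row-preserving behavior is what distinguishes window functions from plain aggregation.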

Posted 5 days ago


4.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on AWS
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers like Kafka

Preferred Technical And Professional Experience
- Certification in AWS, and Databricks- or Cloudera Spark-certified developers
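The pipeline skills this posting asks for (ingest records, apply a data-quality rule, aggregate) can be illustrated without a Spark cluster. Below is a sketch in plain Python so it stays self-contained; in PySpark the same step would be a `filter` followed by `groupBy(...).sum(...)` on a DataFrame. The record schema is invented for illustration.

```python
# Sketch of an ingest -> quality-filter -> aggregate step, in plain
# Python. The equivalent PySpark would be roughly:
#   df.filter(col("amount").isNotNull() & col("country").isNotNull())
#     .groupBy("country").sum("amount")
from collections import defaultdict

def transform(records):
    """Drop malformed rows, then total amounts per country."""
    totals = defaultdict(float)
    for rec in records:
        # Data-quality rule: skip records missing a key field.
        if rec.get("amount") is None or rec.get("country") is None:
            continue
        totals[rec["country"]] += rec["amount"]
    return dict(totals)

raw = [
    {"country": "IN", "amount": 10.0},
    {"country": "IN", "amount": 5.0},
    {"country": "US", "amount": 7.5},
    {"country": None, "amount": 3.0},  # rejected by the quality rule
]
print(transform(raw))  # {'IN': 15.0, 'US': 7.5}
```

The shape of the logic, not the engine, is what transfers: the same filter-then-aggregate step scales from a list of dicts to a distributed DataFrame.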

Posted 5 days ago


6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


Job Summary:
We are looking for a skilled and motivated Software Engineer with strong experience in data engineering and ETL processes. The ideal candidate should be comfortable working with any object-oriented programming language, possess strong SQL skills, and have hands-on experience with AWS services like S3 and Redshift. Experience in Ruby and working knowledge of Linux are a plus.

Key Responsibilities:
- Design, build, and maintain robust ETL pipelines to handle large volumes of data.
- Work closely with cross-functional teams to gather data requirements and deliver scalable solutions.
- Write clean, maintainable, and efficient code using object-oriented programming and SOLID principles.
- Optimize SQL queries and data models for performance and reliability.
- Use AWS services (S3, Redshift, etc.) to develop and deploy data solutions.
- Troubleshoot issues in data pipelines and perform root cause analysis.
- Collaborate with DevOps/infra teams for deployment, monitoring, and scaling data jobs.

Required Skills:
- 6+ years of experience in Data Engineering.
- Programming: proficiency in any object-oriented language (e.g., Java, Python).
- Bonus: experience in Ruby is a big plus.
- SQL: moderate to advanced skills in writing complex queries and handling data transformations.
- AWS: must have hands-on experience with services like S3 and Redshift.
- Linux: familiarity with Linux-based systems is good to have.

Preferred Qualifications:
- Experience working in a data/ETL-focused role.
- Familiarity with version control systems like Git.
- Understanding of data warehouse concepts and performance tuning.

Posted 5 days ago


3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary
We are looking for a talented and motivated Data Engineer with 3 to 6 years of experience to join our data engineering team. The ideal candidate will have strong SQL skills, hands-on experience with Snowflake, ETL tools like Talend, DBT for transformation workflows, and a solid foundation in AWS cloud services.

Key Responsibilities
- Design, build, and maintain efficient and reliable data pipelines using SQL, Talend, and DBT
- Develop and optimize complex SQL queries for data extraction and transformation
- Manage and administer Snowflake data warehouse environments
- Collaborate with analytics, product, and engineering teams to understand data requirements
- Implement scalable data solutions on AWS (e.g., S3, Lambda, Glue, Redshift, EC2)
- Monitor and troubleshoot data workflows and ensure data quality and accuracy
- Support deployment and version control of data models and transformations
- Write clear documentation and contribute to best practices

Required Skills And Qualifications
- 3 to 6 years of experience in data engineering or related fields
- Strong expertise in SQL and performance tuning of queries
- Hands-on experience with Snowflake (data modeling, security, performance tuning)
- Proficiency with Talend for ETL development
- Experience with DBT (Data Build Tool) for transformation workflows
- Good knowledge of AWS services, especially data-centric services (S3, Lambda, Glue, etc.)
- Familiarity with Git-based version control and CI/CD practices
- Strong analytical and problem-solving skills

Posted 5 days ago


1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Exciting Opportunity at Eloelo: Join the Future of Live Streaming and Social Gaming!

Are you ready to be a part of the dynamic world of live streaming and social gaming? Look no further! Eloelo, an innovative Indian platform founded in February 2020 by ex-Flipkart executives Akshay Dubey and Saurabh Pandey, is on the lookout for passionate individuals to join our growing team in Bangalore.

About Us: Eloelo stands at the forefront of multi-host video and audio rooms, offering a unique blend of interactive experiences, including chat rooms, PK challenges, audio rooms, and captivating live games like Lucky 7, Tambola, Tol Mol Ke Bol, and Chidiya Udd. Our platform has successfully attracted audiences from all corners of India, providing a space for social connections and immersive gaming.

Recent Milestone: In pursuit of excellence, Eloelo raised $22Mn in October 2023 from a diverse group of investors, including Lumikai, Waterbridge Capital, Courtside Ventures, Griffin Gaming Partners, and other esteemed new and existing contributors.

Why Eloelo?
- Be a part of a team that thrives on creativity and innovation in the live streaming and social gaming space.
- Rub shoulders with the stars! Eloelo regularly hosts celebrities such as Akash Chopra, Kartik Aryan, Rahul Dua, Urfi Javed, and Kiku Sharda of the Kapil Sharma Show.
- Work with a world-class, high-performance team that constantly pushes boundaries and redefines what is possible.
- Enjoy fun and work in the same place, with an amazing work culture, flexible timings, and a vibrant atmosphere.

We are looking to hire a business analyst to join our growth analytics team. This role sits at the intersection of business strategy, marketing performance, creative experimentation, and customer lifecycle management, with a growing focus on AI-led insights. You'll drive actionable insights to guide our performance marketing, creative strategy, and lifecycle interventions, while also building scalable analytics foundations for a fast-moving growth team.

About the Role: We are looking for a highly skilled and creative Data Scientist to join our growing team and help drive data-informed decisions across our entertainment platforms. You will leverage advanced analytics, machine learning, and predictive modeling to unlock insights about our audience, content performance, and product engagement, ultimately shaping the way millions of people experience entertainment.

Key Responsibilities:
- Develop and deploy machine learning models to solve key business problems (e.g., personalization, recommendation systems, churn prediction).
- Analyze large, complex datasets to uncover trends in content consumption, viewer preferences, and engagement behaviors.
- Partner with product, marketing, engineering, and content teams to translate data insights into actionable strategies.
- Design and execute A/B and multivariate experiments to evaluate the impact of new features and campaigns.
- Build dashboards and visualizations to monitor key metrics and provide stakeholders with self-service analytics tools.
- Collaborate on the development of audience segmentation, lifetime value modeling, and predictive analytics.
- Stay current with emerging technologies and industry trends in data science and entertainment.

Qualifications:
- Master's or PhD in Computer Science, Statistics, Mathematics, Data Science, or a related field.
- 1+ years of experience as a Data Scientist, ideally within media, streaming, gaming, or entertainment tech.
- Proficiency in programming languages such as Python or R.
- Strong SQL skills and experience working with large-scale datasets and data warehousing tools (e.g., Snowflake, BigQuery, Redshift).
- Experience with machine learning libraries/frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Solid understanding of experimental design and statistical analysis techniques.
- Ability to clearly communicate complex technical findings to non-technical stakeholders.

Preferred Qualifications:
- Experience building recommendation engines, content-ranking algorithms, or personalization models in an entertainment context.
- Familiarity with user analytics tools such as Mixpanel, Amplitude, or Google Analytics.
- Prior experience with data pipeline and workflow tools (e.g., Airflow, dbt).
- Background in natural language processing (NLP), computer vision, or audio analysis is a plus.

Why Join Us:
- Shape the future of how audiences engage with entertainment through data-driven storytelling.
- Work with cutting-edge technology on high-impact, high-visibility projects.
- Join a collaborative team in a dynamic and fast-paced environment where creativity meets data science.
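The A/B-testing responsibility in this role usually reduces to a significance test on conversion rates between a control and a variant. A hedged sketch using a two-proportion z-test; the traffic and conversion numbers below are made up for illustration.

```python
# Two-proportion z-test for an A/B experiment on conversion rates.
# Uses only the standard library; counts are illustrative, not real data.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control converts 120/1000, variant 150/1000.
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z={z:.2f}, p={p:.4f}")
```

With these toy numbers the lift is borderline significant at the usual 5% level, which is exactly the kind of call this role would be making before a feature ships.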

Posted 5 days ago


7.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Candidate Profile
Previous experience in building data science / algorithm-based products is a big advantage. Experience in handling healthcare data is desired.

Educational Qualification
Bachelors / Masters in Computer Science, Data Science, or related subjects from a reputable institution.

Typical Experience
7-9 years of industry experience in developing data science models and solutions.
- Able to quickly pick up new programming languages, technologies, and frameworks
- Strong understanding of data structures and algorithms
- Proven track record of implementing end-to-end data science modelling projects, providing guidance and thought leadership to the team
- Strong experience in a consulting environment with a do-it-yourself attitude

Primary Responsibility
As a Data Science Lead, you will be responsible for leading a team of analysts and data scientists / engineers and delivering end-to-end solutions for pharmaceutical clients. You are expected to participate in client proposal discussions with senior stakeholders and provide thought leadership for the technical solution.
- Should be expert in all phases of model development (EDA, hypothesis, feature creation, dimension reduction, dataset clean-up, training models, model selection, validation, and deployment)
- Should have a deep understanding of statistical and machine learning methods: classification (logistic regression, SVM, decision tree, random forest, neural network), regression (linear regression, decision tree, random forest, neural network), and classical optimisation (gradient descent, etc.)
- Must have thorough mathematical knowledge of correlation/causation, classification, recommenders, probability, stochastic processes, and NLP, and how to apply them to a business problem
- Should be able to help implement ML models in an optimized, sustainable framework
- Expected to gain business understanding in the healthcare domain in order to come up with relevant analytics use cases (e.g., HEOR / RWE / claims data analysis)
- Expected to keep the team up to date on the latest and greatest in ML and AI

Technical Skill and Expertise
- Expert-level proficiency in Python/SQL
- Working knowledge of relational SQL and NoSQL databases, including Postgres and Redshift
- Extensive knowledge of predictive and machine learning models, in order to lead the team in implementing such techniques in real-world scenarios
- Working knowledge of NLP techniques and of using BERT transformer models to solve complicated, text-heavy data structures
- Working knowledge of deep learning and unsupervised learning
- Well versed in data structures, pre-processing, feature engineering, and sampling techniques
- Good statistical knowledge to be able to analyse data
- Exposure to open-source tools and cloud platforms like AWS and Azure, and the ability to use their services (e.g., Athena, SageMaker) and machine learning libraries, is a must
- Exposure to LLM-based AI tools (e.g., Llama, ChatGPT, Bard) and prompt engineering is an added advantage
- Exposure to visualization tools like Tableau and Power BI is an added advantage

Don't meet every job requirement? That's okay! Our company is dedicated to building a diverse, inclusive, and authentic workplace. If you're excited about this role, but your experience doesn't perfectly fit every qualification, we encourage you to apply anyway. You may be just the right person for this role or others.
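One of the methods this posting names, logistic regression fit by gradient descent, can be sketched in a few lines of pure Python. The toy dataset and hyperparameters below are illustrative only, not a production recipe.

```python
# Logistic regression trained with batch gradient descent on toy 1-D
# data. Pure Python for brevity; real work would use scikit-learn or
# similar.
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def fit(xs, ys, lr=0.1, epochs=500):
    """Minimize log-loss by batch gradient descent on weight and bias."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradient of the mean log-loss: (prediction - label) times input.
        grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(sigmoid(w * x + b) - y for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy linearly separable data: label 1 exactly when x > 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit(xs, ys)
print(all((sigmoid(w * x + b) > 0.5) == bool(y)
          for x, y in zip(xs, ys)))  # prints True
```

The same gradient-descent loop generalizes to many features by replacing the scalar weight with a vector, which is the step an interviewer for this role would expect a candidate to narrate.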

Posted 5 days ago


10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Greetings from TATA Consultancy Services!!

TCS is hiring for Data Modeler - Architect
Experience Range: 10+ Years
Job Location: Hyderabad (Adibatla), Chennai

Job Summary: We are seeking a detail-oriented and analytical Data Modeler to design, implement, and maintain logical and physical data models that support business intelligence, data warehousing, and enterprise data integration needs. The ideal candidate will work closely with business analysts, data architects, and software engineers to ensure data is organized effectively and supports scalable, high-performance applications.

Required Skills:
• Strong understanding of relational, dimensional, and NoSQL data modeling techniques.
• Proficient in data modeling tools (e.g., Erwin, Enterprise Architect, PowerDesigner, SQL Developer Data Modeler).
• Experience with advanced SQL and major database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL).
• Familiarity with cloud data platforms (e.g., AWS Redshift, Google BigQuery, Azure SQL, Snowflake).
• Excellent communication and documentation skills.
• Knowledge of data governance and data quality principles.
• Experience with data warehousing concepts and tools (e.g., ETL pipelines, OLAP cubes).
• Familiarity with industry standards such as CDM (Common Data Model), FHIR, or other domain-specific models.

Key Responsibilities:
• Design and develop conceptual, logical, and physical data models.
• Translate business requirements into data structures that support analytics, reporting, and operational needs.
• Work with stakeholders to understand and document data needs and flows.
• Optimize and maintain existing data models for performance and scalability.
• Ensure data models are consistent with architectural guidelines and standards.
• Develop and maintain metadata repositories and data dictionaries.
• Collaborate with data architects and engineers to implement models within databases and data platforms.
• Assist in data quality analysis and improvement initiatives.
• Document data models and data mapping specifications.

Regards,
Bodhisatwa Ray

Posted 5 days ago


Exploring Redshift Jobs in India

The job market for redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, a powerful data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with expertise in redshift can find a plethora of opportunities in various industries across the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.

Career Path

In the field of redshift, a typical career path may include roles such as:

  1. Junior Developer
  2. Data Engineer
  3. Senior Data Engineer
  4. Tech Lead
  5. Data Architect

Related Skills

Apart from expertise in redshift, proficiency in the following skills can be beneficial:

  • SQL
  • ETL Tools
  • Data Modeling
  • Cloud Computing (AWS)
  • Python/R Programming

Interview Questions

  • What is Amazon Redshift and how does it differ from traditional databases? (basic)
  • How does data distribution work in Amazon Redshift? (medium)
  • Explain the difference between SORTKEY and DISTKEY in Redshift. (medium)
  • How do you optimize query performance in Amazon Redshift? (advanced)
  • What is the COPY command in Redshift used for? (basic)
  • How do you handle large data sets in Redshift? (medium)
  • Explain the concept of Redshift Spectrum. (advanced)
  • What is the difference between Redshift and Redshift Spectrum? (medium)
  • How do you monitor and manage Redshift clusters? (advanced)
  • Can you describe the architecture of Amazon Redshift? (medium)
  • What are the best practices for data loading in Redshift? (medium)
  • How do you handle concurrency in Redshift? (advanced)
  • Explain the concept of vacuuming in Redshift. (basic)
  • What are Redshift's limitations and how do you work around them? (advanced)
  • How do you scale Redshift clusters for performance? (medium)
  • What are the different node types available in Amazon Redshift? (basic)
  • How do you secure data in Amazon Redshift? (medium)
  • Explain the concept of Redshift Workload Management (WLM). (advanced)
  • What are the benefits of using Redshift over traditional data warehouses? (basic)
  • How do you optimize storage in Amazon Redshift? (medium)
  • How do you troubleshoot performance issues in Amazon Redshift? (advanced)
  • Can you explain the concept of columnar storage in Redshift? (basic)
  • How do you automate tasks in Redshift? (medium)
  • What are the different types of Redshift nodes and their use cases? (basic)
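Several of the questions above (DISTKEY vs SORTKEY, the COPY command) hinge on a few lines of DDL. The sketch below shows the shape of an answer a candidate might give; the table, bucket, and IAM role names are placeholders, not a real cluster.

```python
# Illustrative Redshift DDL and load command covering two interview
# topics above: DISTKEY/SORTKEY choice and bulk loading with COPY.
# All object names are hypothetical.
ddl = """
CREATE TABLE sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    sold_at     TIMESTAMP,
    amount      DECIMAL(12,2)
)
DISTKEY (customer_id)   -- co-locate rows that join on customer_id
SORTKEY (sold_at);      -- lets range scans prune blocks by sale date
"""

copy_cmd = """
COPY sales
FROM 's3://example-bucket/sales/2024/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
FORMAT AS PARQUET;
"""

if __name__ == "__main__":
    print(ddl)
    print(copy_cmd)
```

The distinction the DISTKEY/SORTKEY question is probing: the distribution key decides which node slice a row lives on (join and aggregation locality), while the sort key decides the order of rows within blocks (scan pruning); they solve different performance problems and are chosen independently.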

Conclusion

As the demand for redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
