
3678 Redshift Jobs - Page 19

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role: BI Analytics – Sigma Computing
Location: Noida
Experience: 5+ years

Job Description:
• Design, develop, and maintain dashboards and reports using Sigma Computing.
• Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
• Write and optimize SQL queries that run directly on cloud data warehouses.
• Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
• Apply row-level security and user-level filters to ensure proper data access controls.
• Partner with data engineering teams to validate data accuracy and ensure model alignment.
• Troubleshoot performance or data issues in reports and dashboards.
• Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
• 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
• Hands-on experience with Sigma Computing is highly preferred.
• Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
• Experience with data modeling concepts and modern data stacks.
• Ability to translate business requirements into technical solutions.
• Familiarity with data governance, security, and role-based access controls.
• Excellent communication and stakeholder management skills.
• Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
• Familiarity with dbt, Fivetran, or other ELT/ETL tools.
• Exposure to Agile or Scrum methodologies.
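For readers unfamiliar with the row-level filtering this role describes, here is a minimal sketch of one way a user-level filter can be applied in a warehouse query from Python. The table, column, and connection values are hypothetical placeholders, not details from the posting.

```python
# Hedged sketch: applying a user-level filter when querying a cloud warehouse.
# Table, column, and connection details are hypothetical, not from the posting.
import psycopg2

def fetch_region_sales(conn_params: dict, user_region: str):
    """Return sales rows limited to the caller's region (a simple row-level filter)."""
    query = """
        SELECT order_date, product_id, SUM(net_amount) AS revenue
        FROM sales.orders
        WHERE region = %s                -- user-level filter applied server-side
        GROUP BY order_date, product_id
        ORDER BY order_date;
    """
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(query, (user_region,))
            return cur.fetchall()

if __name__ == "__main__":
    params = {"host": "example-cluster.redshift.amazonaws.com", "port": 5439,
              "dbname": "analytics", "user": "bi_reader", "password": "..."}
    rows = fetch_region_sales(params, "APAC")
    print(len(rows), "rows returned")
```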

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Experience: 3-6 years of hands-on experience in designing and developing conceptual, logical, and physical data models for relational, dimensional, and NoSQL data platforms. Knowledge of Data Vault, NoSQL, Dimensional Modeling, Graph data model, and proficiency in at least one of these. Proven experience with data warehousing, data lakes, and enterprise big data platforms. Knowledge of databases such as columnar databases, vector databases, graph databases, etc. Strong knowledge of metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio). Experience with ETL tools and data ingestion protocols. Familiarity with cloud-based data warehousing solutions (e.g., Google BigQuery , AWS Redshift, Snowflake) and big data technologies (e.g., Hadoop, Spark). Experience in creating comprehensive documentation of data models, data dictionaries, and metadata. Preferred: Experience with cloud modernization projects and modern database technologies. Certification in data modeling or database design. Strong communication and presentation skills. Experience in creating data models that comply with data governance policies and regulatory requirements. Experience leading initiatives to modernize data platforms using cloud-based solutions such as Google BigQuery , AWS Redshift, Snowflake, etc. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. 
Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302295
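As a rough illustration of the dimensional modeling this role calls for, the sketch below lays out a minimal star schema in Redshift-style DDL. All table and column names are invented for illustration, and the distribution/sort keys are just one plausible choice, not a recommendation from the posting.

```python
# Hedged sketch of a minimal star schema (dimensional model) as Redshift-style DDL.
# Table/column names are illustrative; DISTKEY/SORTKEY choices are one reasonable option.
DDL_STATEMENTS = [
    """
    CREATE TABLE dim_customer (
        customer_key BIGINT IDENTITY(1,1) PRIMARY KEY,
        customer_id  VARCHAR(32) NOT NULL,   -- natural key from the source system
        segment      VARCHAR(64),
        country      VARCHAR(64)
    );
    """,
    """
    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,       -- e.g. 20250131
        calendar_date DATE NOT NULL,
        fiscal_year   SMALLINT,
        month_name    VARCHAR(16)
    );
    """,
    """
    CREATE TABLE fact_sales (
        date_key     INT REFERENCES dim_date(date_key),
        customer_key BIGINT REFERENCES dim_customer(customer_key),
        quantity     INT,
        net_amount   DECIMAL(18,2)
    )
    DISTKEY (customer_key)
    SORTKEY (date_key);
    """,
]

if __name__ == "__main__":
    for ddl in DDL_STATEMENTS:
        print(ddl.strip(), end="\n\n")
```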

Posted 1 week ago

Apply

11.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: Data Governance Tooling & Lifecycle Mgmt. Lead: (Sr Manager, Data Operations & Management) As the Data Governance Tooling & Lifecycle Management Lead, you will be responsible for the end-to-end strategy, implementation, and operations of data governance tooling and processes across the enterprise. This role leads efforts to enable scalable metadata management, data lineage, data lifecycle governance, and access policy enforcement—using modern platforms such as Collibra and supporting a cloud-native data stack spanning GCP, AWS, BigQuery, and Redshift. You will collaborate across data engineering, architecture, compliance, and analytics teams to ensure data is governed, discoverable, and trusted throughout its lifecycle. Who we are looking for: Primary Responsibilities: Governance Tooling Ownership: Own the architecture, implementation, and administration of enterprise data governance platforms (e.g., Collibra). Define and evolve governance workflows, including stewardship assignments, metadata curation, approval processes, and policy enforcement. Integrate governance tooling with cloud platforms, data warehouses, and cataloging solutions to enable real-time governance at scale. Lifecycle Management Strategy: Develop and implement strategies for data lifecycle governance from ingestion and active use through archival and deletion. Ensure that data retention, archival, and purging practices align with compliance regulations and business needs. Partner with cloud and infrastructure teams to operationalize lifecycle rules across GCP, AWS, and warehouse platforms (e.g., BigQuery, Redshift). Metadata & Lineage Enablement: Drive adoption and quality of technical and business metadata, ensuring traceability and data understanding across systems. Lead initiatives to automate and visualize end-to-end data lineage across source systems, pipelines, warehouses, and BI tools. Policy Management & Compliance: Collaborate with legal, compliance, and security teams to define and enforce data access, classification, and privacy policies. Ensure tooling supports regulatory compliance frameworks (e.g., GDPR, CCPA, HIPAA) and internal audit requirements. Collaboration & Enablement: Work with data stewards, engineers, and product teams to ensure governance tooling meets user needs and drives adoption. Support enablement efforts through training, documentation, and tooling best practices. Report on governance adoption, data quality KPIs, and policy coverage to senior leadership and data councils. Skill: 11+ years of experience in data governance, metadata management, or data operations, with 3+ years owning enterprise tooling or lifecycle processes. 
Deep expertise in: Data governance platforms (e.g., Collibra, Alation, Informatica) Metadata and lineage management Cloud platforms: GCP (BigQuery, Cloud Storage), AWS (Redshift, S3) SQL and enterprise-scale ETL/ELT pipelines Integration with enterprise data platforms, pipelines, and BI tools Strong understanding of compliance and regulatory data handling practices. Excellent project management and stakeholder communication skills across technical and business domains. Bachelor’s or Master’s degree in Data Management, Information Systems, Computer Science, or related field. Preferred Experience: Experience in Retail or QSR environments managing governance across global data operations. Exposure to data product ownership, data mesh, or federated governance models. Familiarity with APIs and automation scripts to extend and integrate governance workflows with data pipelines and CI/CD processes. Current GCP Associates (or Professional) Certification. Work location : Hyderabad, India Work pattern: Full time role. Work mode: Hybrid. Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
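To make the lifecycle-management responsibility more concrete, here is a hedged sketch of how a retention and archival rule might be applied to an S3 bucket with boto3. The bucket name, prefix, and retention windows are assumptions for illustration only, not values from the posting.

```python
# Hedged sketch: operationalizing a retention/archival rule on an S3 bucket with boto3.
# Bucket name, prefix, and retention windows are assumed values for illustration.
import boto3

def apply_retention_policy(bucket: str) -> None:
    """Transition raw data to Glacier after 90 days and expire it after ~7 years."""
    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "raw-zone-retention",
                    "Filter": {"Prefix": "raw/"},
                    "Status": "Enabled",
                    "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                    "Expiration": {"Days": 2555},  # roughly 7 years
                }
            ]
        },
    )

if __name__ == "__main__":
    apply_retention_policy("example-data-lake-bucket")
```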

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Experience working on BigID or Collibra. Knowledge of data classification and data products. Understanding of data loss and personal information security. Exposure to Snowflake, S3, Redshift, SharePoint, and Box. Understanding of connecting to various source systems. Deep understanding and practical knowledge of IDEs such as Eclipse, PyCharm, or any Workflow Designer. Experience with one or more of the following languages: Java, JavaScript, Groovy, Python. Deep understanding and hands-on experience of CI/CD processes and tooling such as GitHub. Experience working in DevOps teams based on Kubernetes tools. Hands-on experience in database concepts and a fair idea about data classification, lineage, and storage is a plus. Fantastic written and spoken English, interpersonal skills, and a collaborative approach to delivery. Desirable Skills And Experience Overall IT experience in the range of 8 to 12 years Technical Degree to validate the experience Deep technical expertise Display a solid understanding of the technology requested and problem-solving skills Must be analytical, focused and should be able to independently handle work with minimum supervision Good collaborator management and team player Exposure to platforms like Talend Data Catalog, BigID, or Snowflake is an advantage Basic AWS knowledge is a plus Knowledge and experience of integration technologies like Mulesoft and SnapLogic Excellent Jira skills including the ability to rapidly generate JQL on-the-fly and save JQL queries, filters, views, etc., for publishing to fellow engineers and senior stakeholders Creation of documentation in Confluence Experience of Agile practices, preferably having been part of an Agile team for several years

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: Data Accessibility Engineering Support: Manager, Data Operations & Management As the Manager of Data Accessibility Engineering Support, you will play a critical role in ensuring that enterprise data is secure, discoverable, and accessible for advanced analytics, AI/ML, and operational use. You will oversee the implementation and support of data governance tooling, metadata management, and access controls across cloud-native platforms. This role is hands-on and strategic—ensuring compliance with organizational policies while enabling scalable data accessibility across GCP, AWS, Big Query, Redshift, and other modern data environments. Who we are looking for: Primary Responsibilities: Data Accessibility & Governance Enablement: Lead the implementation and support of data accessibility solutions, ensuring efficient access to governed and trusted data assets. Oversee data governance tools and platforms (e.g., Collibra ) for metadata management, lineage, and policy enforcement. Manage and maintain technical metadata and data cataloging frameworks that support enterprise discoverability. Cloud Platform Integration: Design and implement data accessibility frameworks for GCP and AWS environments, with a strong focus on Big Query, Redshift, and cloud-native storage layers (GCS/S3). Collaborate with cloud engineering and security teams to enforce fine-grained access controls and data classification. AI / ML Support & Lifecycle Management: Partner with AI / ML teams to support model lifecycle management through reliable access to training and scoring datasets. Ensure data quality and accessibility standards are embedded in MLOps workflows and pipelines. Data Quality, Policy & Compliance: Implement and monitor enterprise data quality frameworks to support regulatory compliance and business confidence. Develop strategies for reconciliation, validation, and data forensics to resolve data inconsistencies. Ensure alignment with organizational data usage policies, privacy standards, and auditability requirements. Cross-Functional Collaboration & Support: Work closely with data stewards, data engineers, data scientists, and compliance teams to continuously improve data operations. Provide Tier 2 / 3 support for data accessibility and metadata-related issues. Lead efforts to educate teams on data usage best practices, standards, and governance workflows. Skill: 6 to 10 years of experience in data operations, data governance, or data quality engineering roles. Hands-on experience with: Data governance platforms, especially Collibra Cloud platforms: Google Cloud Platform (GCP), Amazon Web Services (AWS) Data warehouses: Big Query, Redshift (and / or Snowflake) SQL and enterprise-scale ETL / ELT pipelines Metadata management, cataloging, and data lineage tracking AI/ML data workflows and supporting structured / unstructured data access for model training and inferencing Strong analytical and problem-solving skills in large-scale, distributed data environments. 
Familiarity with data security, privacy regulations, and compliance standards (e.g., GDPR, CCPA). Excellent collaboration and communication skills across technical and non-technical teams. Bachelor’s or master’s degree in data science, Information Systems, Computer Science, or a related field. Preferred Experience: Experience in Retail or QSR environments with complex multi-region data access needs. Exposure to enterprise data catalogs, automated data quality tooling, and access request workflows. Current GCP Associates (or Professional) Certification. Work location: Hyderabad, India Work pattern: Full time role. Work mode: Hybrid. Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
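As an illustration of the fine-grained access controls this role supports, the sketch below shows a plausible set of Redshift GRANT statements, including a column-level grant. Schema, table, and group names are hypothetical; the statements can be run through any SQL client or driver.

```python
# Hedged sketch: enforcing fine-grained access in Redshift with GRANT statements.
# Schema, table, and group names are hypothetical placeholders.
ACCESS_GRANTS = [
    "CREATE GROUP analytics_readers;",
    "GRANT USAGE ON SCHEMA curated TO GROUP analytics_readers;",
    # Column-level grant: expose only non-sensitive columns of the customer table.
    "GRANT SELECT (customer_key, segment, country) ON curated.dim_customer "
    "TO GROUP analytics_readers;",
    "GRANT SELECT ON curated.fact_sales TO GROUP analytics_readers;",
]

if __name__ == "__main__":
    for stmt in ACCESS_GRANTS:
        print(stmt)
```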

Posted 1 week ago

Apply

5.0 years

4 - 7 Lacs

Hyderābād

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of. Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Your New Role : The WBD Integration team is seeking a Senior Integration Developer who will be responsible for providing technical expertise for supporting and enhancing the Integration suite (Informatica Power Center, IICS). We are specifically looking for a candidate with solid technical skills and experience in integrating ERP applications, SAAS, PAAS platforms such as SAP, Salesforce, Workday, etc., and data warehouses such as Teradata, Snowflake, and RedShift. Experience with the Informatica cloud platform would be ideal for this position. The candidate's primary job functions include but are not limited to the day-to-day configuration/development of the Informatica platform. The candidate must possess strong communication and analytical skills to effectively work with peers within the Enterprise Technology Group, various external partners/vendors, and business users to determine requirements and translate them into technical solutions. The candidate must have the ability to independently complete individual tasks in a dynamic environment to achieve departmental and company goals. Qualifications & Experiences: Leads the daily activities of a work group / team. Typically leads complex professional undertakings or teams. May be assigned to new work groups or teams. Interacts with peers and other internal and external stakeholders regularly. Demonstrates advanced proficiency in full range of skills required to perform the role. Acts as a Mentor Work on POC with new connectors and closely work with Network team. Performing peer review of objects developed by other developers in team as needed. Work with Business and QA Team during various phases of deployment i.e., requirements, QAST, SIT phases Report any Functional gaps in existing Application and suggest business process improvements and support for bug fixes and issues reported Coordinating activities between the different LOB’s/teams Translate conceptual system requirements into technical data and integration requirements Proficient in using Informatica Cloud application and data integrations is a must Proficient in developing custom API’s to handle bulk volumes, pagination etc. Design, develop, and implement integration solutions using Informatica Intelligent Cloud Services. Configure data mappings, transformations, and workflows to ensure data consistency and accuracy. Develop and maintain APIs and connectors to integrate with various data sources and applications. Prepare data flow diagramming and/or process modeling Strong knowledge of integration protocols and technologies (e.g., REST, SOAP, JSON, XML). 
Perform Unit Testing and debugging of applications to ensure the quality of the delivered requirements and overall health of the system Develop standards and processes to support and facilitate integration projects and initiatives. Educate other team members and govern tool usage Participate in research and make recommendations on the integration products and services Monitor integration processes and proactively identify and resolve performance or data quality issues. Provide ongoing maintenance and support for integration solutions. Perform regular updates and upgrades to keep integrations current. Proficiency in Informatica Intelligent Cloud Services. Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Excellent problem-solving and troubleshooting skills. Strong communication and teamwork skills. Qualifications & Experiences: 5+ years of developer experience in Informatica IICS Application Integration and Data Integration. Experience in PowerCenter along with IICS and complete knowledge of the SDLC process Experience in API development, including best practices, testing methods and deployment strategies. Experience in designing, creating, refining, deploying, and managing the organization's data architecture, including the end-to-end vision for how data will flow from system to system, for multiple applications and across different territories. Expertise with tools like SOA, ETL, ERP, XML, etc. Understanding of Python, AWS Redshift, Snowflake and relational databases Knowledge of UNIX shell scripts and the ability to write/debug shell scripts. Ability to work well within an agile team environment and apply related working methods. Able to analyze and understand complex customer scenarios and thrive on difficult challenges Team player, multitasker, excellent communication skills (convey highly technical information into business terms, clear email communications), ability to mentor team members. Preferred Qualifications: Informatica certification in Informatica Intelligent Cloud Services. Experience with other integration tools and middleware. Knowledge of data governance and data quality best practices. Not required but preferred: Public speaking and presentation skills. How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
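Since the role calls for custom APIs that handle bulk volumes and pagination, here is a minimal, generic sketch of a paginated REST pull in Python. The endpoint, query parameters, and token are placeholders rather than any specific vendor's API.

```python
# Hedged sketch: pulling a bulk dataset from a paginated REST API before loading it downstream.
# The endpoint, page parameters, and token are placeholders, not a specific vendor API.
import requests

def fetch_all_records(base_url: str, token: str, page_size: int = 500) -> list[dict]:
    """Iterate over pages until the API returns an empty batch."""
    records, page = [], 1
    headers = {"Authorization": f"Bearer {token}"}
    while True:
        resp = requests.get(
            base_url,
            headers=headers,
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

if __name__ == "__main__":
    data = fetch_all_records("https://api.example.com/v1/orders", token="...")
    print(f"Fetched {len(data)} records")
```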

Posted 1 week ago

Apply

2.0 years

4 - 8 Lacs

Hyderābād

On-site

ABOUT FLUTTER ENTERTAINMENT Flutter Entertainment is a global leader in sports betting, gaming, and entertainment, with annual revenues of $11.7 Bn and a customer base of over 12 million players (in 2023) driven by a portfolio of iconic brands, including Paddy Power, Betfair, FanDuel, PokerStars, Junglee Games and Sportsbet. Listed on both the New York Stock Exchange (NYSE) and the London Stock Exchange (LSE), Flutter was recently included in TIME's 100 Most Influential Companies of 2024 in the 'Pioneers' section. Our ambition is to transform global gaming and betting to deliver long-term growth and a positive, sustainable future for our sector. Working at Flutter is a chance to work with a growing portfolio of brands across a range of opportunities. We will support you every step of the way to help you grow. Just like our brands, we ensure our people have everything they need to succeed. FLUTTER ENTERTAINMENT INDIA Our Hyderabad office, located in one of India’s premier technology parks is the Global Capability Center for Flutter Entertainment. A center of expertise and innovation, this hub is now home to over 1000+ employees working across Customer Service Operations, Data and Technology, Finance Operations, HR Operations, Procurement Operations, and other key enabling functions. We are committed to crafting impactful solutions for all our brands and divisions to power Flutter's incredible growth and global impact. With the scale of a leader and the mindset of a challenger, we’re dedicated to creating a brighter future for our customers, colleagues, and communities. OVERVIEW OF THE ROLE We are looking for a Data Analyst to join our Data & Analytics (ODA) department in Hyderabad, India . Delivering deep insights and analytics within a fast-paced culture in the world’s largest online gaming company, you will join a team of exceptional data analysts who shape the future of online gaming through detailed analysis for safer gambling, fraud, customer experience, regulatory, overall operations and other departments. Your work will have a tangible impact on our players’ experience and our business’ direction. You shall dive into databases, querying large volumes of behavioural data to create actionable insights and deliver recommendations to department heads and directors. As well as leading in-depth analysis of customer behaviour, you will develop dashboards and executive summaries for various audiences, establish and track key performance indicators and provide ad-hoc analytical support. Your work will bring our extensive data to life, adding insight to key decision-making processes and optimising our systems to keep our site safe, sustainable and where the best play. 
KEY RESPONSIBILITIES Extract data from our databases in various environments (DB2, MS SQL Server, and Azure), then process and interpret them using statistical techniques in Python and Excel Identify patterns and emerging trends with detailed analysis to offer constructive suggestions and estimate their potential impact on customers and the business Create presentations that synthesise findings from multiple analyses to inform strategic decision-making of senior leadership Develop interactive dashboards that highlight key metrics and trends in customer behaviour and payment fraud Design infographics that visually communicate complex data and analysis to a non-technical audience, such as regulators or customer support teams Engage with global business stakeholders on key projects, understand how each area of the business works to provide data-driven insights and guidance Help define product roadmaps by identifying opportunities for improvement based on data and analysis. TO EXCEL IN THIS ROLE, YOU WILL NEED TO HAVE 2 to 4 years of relevant work experience as a Data Analyst or Data Scientist Bachelor’s degree in a quantitative field such as Science, Mathematics, Economics, Engineering Proficiency in SQL with the ability to create complex queries from scratch Advanced expertise in Microsoft Excel, PowerPoint and Word Experience presenting and reporting analyses to stakeholders Ability to create high-quality data visualizations, local & server-based automation solutions and presentations using PowerPoint and tools such as MicroStrategy, Tableau, or PowerBI Experience with programming (e.g., Python, R, etc.) Applied experience with statistical techniques such as hypothesis testing, causal impact analysis, regression analysis, or time series analysis Excellent organisational and communication skills with the ability to manage day-to-day work independently and consistently deliver quality work within deadlines Desired Qualifications Experience with data warehouse technologies (e.g., MS SQL Server Management Studio, Amazon Redshift) is a plus Certifications (MOOCs) on Data Analysis, Python, SQL, ETL, DSA, ML/DL, Data Science, etc. are desirable BENEFITS WE OFFER Access to Learnerbly, Udemy, and a Self-Development Fund for upskilling. Career growth through Internal Mobility Programs. Comprehensive Health Insurance for you and dependents. Well-Being Fund and 24/7 Assistance Program for holistic wellness. Hybrid Model: 2 office days/week with flexible leave policies, including maternity, paternity, and sabbaticals. Free Meals, Cab Allowance, and a Home Office Setup Allowance. Employer PF Contribution, gratuity, Personal Accident & Life Insurance. Sharesave Plan to purchase discounted company shares. Volunteering Leave and Team Events to build connections. Recognition through the Kudos Platform and Referral Rewards. WHY CHOOSE US Flutter is an equal-opportunity employer and values the unique perspectives and experiences that everyone brings. Our message to colleagues and stakeholders is clear: everyone is welcome, and every voice matters. We have ambitious growth plans and goals for the future. Here's an opportunity for you to play a pivotal role in shaping the future of Flutter Entertainment India.
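To give a flavour of the statistical techniques listed above, the sketch below runs a simple two-sample hypothesis test on synthetic data. The numbers are invented; in practice the samples would come from warehouse queries rather than a random generator.

```python
# Hedged sketch: a two-sample hypothesis test of the kind the posting mentions.
# The synthetic data stands in for behavioural metrics pulled from a warehouse.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=48.0, scale=12.0, size=1_000)   # e.g. session length, control group
variant = rng.normal(loc=50.0, scale=12.0, size=1_000)   # e.g. session length, test group

t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```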

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Hyderābād

On-site

Chryselys Overview Chryselys is a Pharma Analytics & Business consulting company that delivers data-driven insights leveraging AI-powered, cloud-native platforms to achieve high-impact transformations. Chryselys was founded in the heart of Silicon Valley in November 2019 with the vision of delivering high-value business consulting, solutions, and services to clients in the healthcare and life sciences space. We are trusted partners for organizations that seek to achieve high-impact transformations and reach their higher-purpose mission. Chryselys India supports our global clients to achieve high-impact transformations and reach their higher-purpose mission. Our India team focuses on development of Commercial Insights platform and supports client projects. Role Summary As a Consultant, you will work closely with internal and external stakeholders and deliver high quality analytics solutions to real-world Pharma commercial organization’s business problems. You will bring deep Pharma / Healthcare domain expertise and use cloud data tools to help solve complex problems Key Responsibilities: Collaborate with internal teams and client stakeholders to deliver Business Intelligence solutions that support key decision-making for the Commercial function of Pharma organizations. Leverage deep domain knowledge of pharmaceutical sales, claims, and secondary data to structure and optimize BI reporting frameworks. Develop, maintain, and optimize interactive dashboards and visualizations using Tableau (primary), along with other BI tools like Power BI or Qlik, to enable data-driven insights. Translate business requirements into effective data visualizations and actionable reporting solutions tailored to end-user needs. Write complex SQL queries and work with large datasets housed in Data Lakes or Data Warehouses to extract, transform, and present data efficiently. Conduct data validation, QA checks, and troubleshoot stakeholder-reported issues by performing root cause analysis and implementing solutions. Collaborate with data engineering teams to define data models, KPIs, and automate data pipelines feeding BI tools. Manage ad-hoc and recurring reporting needs, ensuring accuracy, timeliness, and consistency of data outputs. Drive process improvements in dashboard development, data governance, and reporting workflows. Document dashboard specifications, data definitions, and maintain data dictionaries. Stay up to date with industry trends in BI tools, visualization of best practices and emerging data sources in the healthcare and pharma space. Prioritize and manage multiple BI project requests in a fast-paced, dynamic environment. Qualifications: 2–4 years of experience in BI development, reporting, or data visualization, preferably in the pharmaceutical or life sciences domain. Strong hands-on experience building dashboards using Tableau (preferred), Power BI, and Qlik. Advanced SQL skills for querying and transforming data across complex data models. Familiarity with pharma data such as Sales, Claims, and secondary market data is a strong plus. Experience in data profiling, cleansing, and standardization techniques. Ability to translate business questions into effective visual analytics. Strong communication skills to interact with stakeholders and present data insights clearly. Self-driven, detail-oriented, and comfortable working with minimal supervision in a team-oriented environment. 
Exposure to data warehousing concepts and cloud data platforms (e.g., Snowflake, Redshift, or BigQuery) is an advantage. Education Bachelor’s or Master’s Degree (computer science, engineering or other technical disciplines)
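As one example of the data validation and QA checks described in the responsibilities, here is a hedged sketch of a reconciliation query comparing a source table with the aggregate behind a dashboard. Schema and table names are illustrative only; the pattern, not the specific objects, is the point.

```python
# Hedged sketch: reconciliation check between a staging table and the reporting mart.
# Schema and table names are illustrative placeholders.
RECONCILIATION_SQL = """
WITH source_totals AS (
    SELECT month, SUM(units) AS src_units
    FROM staging.sales_claims
    GROUP BY month
),
mart_totals AS (
    SELECT month, SUM(units) AS mart_units
    FROM reporting.sales_summary
    GROUP BY month
)
SELECT s.month,
       s.src_units,
       m.mart_units,
       s.src_units - COALESCE(m.mart_units, 0) AS variance
FROM source_totals s
LEFT JOIN mart_totals m USING (month)
WHERE COALESCE(m.mart_units, 0) <> s.src_units
ORDER BY s.month;
"""

if __name__ == "__main__":
    print(RECONCILIATION_SQL.strip())
```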

Posted 1 week ago

Apply

2.0 years

0 Lacs

New Delhi, Delhi, India

On-site

🎨 Hiring Now: FX Artist 📍 Location: Rohini, Delhi (On-site) 🕒 Type: Full-Time 🏢 Company: Black Diamond Media & Production Pvt Ltd ✨ About the Role: We’re looking for a talented and passionate FX Artist to join our creative team and help us bring stunning visual effects to life for high-quality animated content. You’ll be responsible for simulating and crafting realistic as well as stylized effects including smoke, fire, magic, destruction, liquids, and more. 🛠️ Key Responsibilities: 🔹 Create high-quality visual effects using industry-standard tools 🔹 Collaborate with animators, lighting, and compositing teams 🔹 Simulate particles, dynamics, and fluid effects for animation scenes 🔹 Ensure VFX matches the style and vision of the project 🔹 Optimize FX for performance and rendering pipeline 🔹 Troubleshoot and solve technical issues related to FX production 🎓 Requirements: ✅ 2+ years of experience as an FX Artist in animation/VFX/gaming industry ✅ Proficient in software like Houdini, Maya, Blender, After Effects , or EmberGen ✅ Strong understanding of physics-based simulations ✅ Good knowledge of render engines like Arnold, Redshift, or Octane ✅ Creative thinker with attention to detail ✅ Excellent team player and communication skills 🌟 Nice to Have: ✨ Experience with Unreal Engine/Niagara FX system ✨ Knowledge of Python or MEL scripting for tool development ✨ Familiarity with cartoon-style FX or stylized animation 📩 How to Apply: Send your updated portfolio/showreel and CV to hr@blackdiamonds.co.in Subject line: Application for FX Artist – [Your Name] 🔥 Join us in redefining the Indian animation industry with groundbreaking visuals and creative storytelling! 📽️ Be a part of something extraordinary at Black Diamond Media.

Posted 1 week ago

Apply

4.0 years

4 - 9 Lacs

Gurgaon

On-site

About the Team: Join a highly skilled and collaborative team dedicated to ensuring data reliability, performance, and security across our organization’s critical systems. We work closely with developers, architects, and DevOps professionals to deliver seamless and scalable database solutions in a cloud-first environment, leveraging the latest in AWS and open-source technologies. Our team values continuous learning, innovation, and the proactive resolution of database challenges. About the Role: As a Database Administrator specializing in MySQL and Postgres within AWS environments, you will play a key role in architecting, deploying, and supporting the backbone of our data infrastructure. You’ll leverage your expertise to optimize database instances, manage large-scale deployments, and ensure our databases are secure, highly available, and resilient. This is an opportunity to collaborate across teams, stay ahead with emerging technologies, and contribute directly to our business success. Responsibilities: Design, implement, and maintain MySQL and Postgres database instances on AWS, including managing clustering and replication (MongoDB, Postgres solutions). Write, review, and optimize stored procedures, triggers, functions, and scripts for automated database management. Continuously tune, index, and scale database systems to maximize performance and handle rapid growth. Monitor database operations to ensure high availability, robust security, and optimal performance. Develop, execute, and test backup and disaster recovery strategies in line with company policies. Collaborate with development teams to design efficient and effective database schemas aligned with application needs. Troubleshoot and resolve database issues, implementing corrective actions to restore service and prevent recurrence. Enforce and evolve database security best practices, including access controls and compliance measures. Stay updated on new database technologies, AWS advancements, and industry best practices. Plan and perform database migrations across AWS regions or instances. Manage clustering, replication, installation, and sharding for MongoDB, Postgres, and related technologies. Requirements: 4-7 years of experience in database management systems as a Database Engineer. Proven experience as a MySQL/Postgres Database Administrator in high-availability, production environments. Expertise in AWS cloud services, especially EC2, RDS, Aurora, DynamoDB, S3, and Redshift. In-depth knowledge of DR (Disaster Recovery) setups, including active-active and active-passive master configurations. Hands-on experience with MySQL partitioning and AWS Redshift. Strong understanding of database architectures, replication, clustering, and backup strategies (including Postgres replication & backup). Advanced proficiency in optimizing and troubleshooting SQL queries; adept with performance tuning and monitoring tools. Familiarity with scripting languages such as Bash or Python for automation/maintenance. Experience with MongoDB, Postgres clustering, Cassandra, and related NoSQL or distributed database solutions. Ability to provide 24/7 support and participate in on-call rotation schedules. Excellent problem-solving, communication, and collaboration skills. What we offer: A positive, get-things-done workplace A dynamic, constantly evolving space (change is par for the course, so it's important you are comfortable with this) An inclusive environment that ensures we listen to a diverse range of voices when making decisions.
Ability to learn cutting edge concepts and innovation in an agile start-up environment with a global scale Access to 5000+ training courses accessible anytime/anywhere to support your growth and development (Corporate with top learning partners like Harvard, Coursera, Udacity) About us: At PayU, we are a global fintech investor and our vision is to build a world without financial borders where everyone can prosper. We give people in high growth markets the financial services and products they need to thrive. Our expertise in 18+ high-growth markets enables us to extend the reach of financial services. This drives everything we do, from investing in technology entrepreneurs to offering credit to underserved individuals, to helping merchants buy, sell, and operate online. Being part of Prosus, one of the largest technology investors in the world, gives us the presence and expertise to make a real impact. Find out more at www.payu.com Our Commitment to Building A Diverse and Inclusive Workforce As a global and multi-cultural organization with varied ethnicities thriving across locations, we realize that our responsibility towards fulfilling the D&I commitment is huge. Therefore, we continuously strive to create a diverse, inclusive, and safe environment, for all our people, communities, and customers. Our leaders are committed to create an inclusive work culture which enables transparency, flexibility, and unbiased attention to every PayUneer so they can succeed, irrespective of gender, color, or personal faith. An environment where every person feels they belong, that they are listened to, and where they are empowered to speak up. At PayU we have zero tolerance towards any form of prejudice whether a specific race, ethnicity, or of persons with disabilities, or the LGBTQ communities.
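For the replication and disaster-recovery duties described above, a routine first check is streaming-replication lag on the Postgres primary. The sketch below is one minimal way to do that from Python; connection details are placeholders, and alerting thresholds would come from your own monitoring setup.

```python
# Hedged sketch: checking streaming-replication lag on a Postgres primary,
# one routine task for the DBA role described above. Connection details are placeholders.
import psycopg2

LAG_QUERY = """
SELECT client_addr,
       state,
       write_lag,
       flush_lag,
       replay_lag
FROM pg_stat_replication;
"""

def check_replication(conn_params: dict) -> None:
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(LAG_QUERY)
            for addr, state, write_lag, flush_lag, replay_lag in cur.fetchall():
                print(f"{addr} [{state}] write={write_lag} flush={flush_lag} replay={replay_lag}")

if __name__ == "__main__":
    check_replication({"host": "primary.example.internal", "dbname": "postgres",
                       "user": "monitoring", "password": "..."})
```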

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurgaon

On-site

Location Gurugram, Haryana, India This job is associated with 2 categories Job Id GGN00001785 Information Technology Job Type Full-Time Posted Date 07/22/2025 Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world. Job overview and responsibilities United Airlines is seeking talented people to join the Data Engineering team as a Sr AWS Redshift DBA. The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus. As a Redshift DBA, you will engage with the Data Engineering and DevOps teams (including internal customers) on Redshift database administration initiatives. You will be involved in database administration, performance tuning, management, and security of the AWS Redshift deployment and enterprise data warehouse. You will provide technical support for all database environments, including development, pre-production, and production databases, and will be responsible for setting up infrastructure and configuring and maintaining the environment, including security, to support the cloud environment, working alongside DE and Cloud Engineering. You will develop and implement innovative solutions leading to automation, and mentor and train junior engineers. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law. Qualifications We are seeking a dedicated and skilled Redshift Database Administrator with a proven track record of optimizing database performance, ensuring data security, and implementing scalable solutions, who can leverage expertise in Redshift and AWS to drive efficiency and reliability in a dynamic organization. Required BS/BA in computer science or related STEM field Individuals who have a natural curiosity and desire to solve problems are encouraged to apply.
4+ years of IT experience, preferably in Redshift database administration, SQL, and query optimization
Must have experience in performance tuning and monitoring
3+ years of experience in scripting (Python, Bash)
Database security and compliance
4+ years in an AWS Redshift production environment
3+ years of experience with relational database systems like Oracle and Teradata
2+ years' experience with cloud migration of existing on-premise apps to AWS
Excellent and proven knowledge of Postgres/SQL on Amazon RDS
Excellent and proven knowledge of SQL
Must have managed Redshift clusters, including provisioning, monitoring, and performance tuning to ensure optimal query execution
Must be legally authorized to work in India for any employer without sponsorship
Must be fluent in English and Hindi (written and spoken)
Successful completion of interview required to meet job qualification
Reliable, punctual attendance is an essential function of the position
Preferred
Master's in Computer Science or related STEM field
Experience with cloud-based systems like AWS, Azure, or Google Cloud
Certified Developer / Architect on AWS
Strong experience with continuous integration & delivery using Agile methodologies
Data engineering experience with the transportation/airline industry
Strong problem-solving skills
Strong knowledge in Big Data
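As a concrete example of the monitoring and performance-tuning work listed in the qualifications, the sketch below pulls the slowest queries of the last day from Redshift's STL_QUERY system table. It is a hedged starting point for triage, not a complete tuning workflow.

```python
# Hedged sketch: surfacing the slowest recent queries from a Redshift system table,
# a typical first step in the performance tuning this role describes.
SLOW_QUERY_SQL = """
SELECT query,
       TRIM(querytxt)                        AS sql_text,
       DATEDIFF(seconds, starttime, endtime) AS duration_s
FROM stl_query
WHERE starttime > DATEADD(hour, -24, GETDATE())
ORDER BY duration_s DESC
LIMIT 20;
"""

if __name__ == "__main__":
    # Execute with any Redshift-compatible driver (psycopg2, redshift_connector, etc.).
    print(SLOW_QUERY_SQL.strip())
```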

Posted 1 week ago

Apply

5.0 years

6 - 8 Lacs

Gurgaon

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description Description - External United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Our Values : At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world. Job overview and responsibilities This role will be responsible for collaborating with the Business and IT teams to identify the value, scope, features and delivery roadmap for data engineering products and solutions. Responsible for communicating with stakeholders across the board, including customers, business managers, and the development team to make sure the goals are clear and the vision is aligned with business objectives. Perform data analysis using SQL Data Quality Analysis, Data Profiling and Summary reports Trend Analysis and Dashboard Creation based on Visualization technique Execute the assigned projects/ analysis as per the agreed timelines and with accuracy and quality. Complete analysis as required and document results and formally present findings to management Perform ETL workflow analysis, create current/future state data flow diagrams and help the team assess the business impact of any changes or enhancements Understand the existing Python code work books and write pseudo codes Collaborate with key stakeholders to identify the business case/value and create documentation. Should have excellent communication and analytical skills. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law. Qualifications - External Required BE, BTECH or equivalent, in computer science or related STEM field 5+ years of total IT experience as either a Data Analyst/Business Data Analyst or as a Data Engineer 2+ years of experience with Big Data technologies like PySpark, Hadoop, Redshift etc. 
3+ years of experience writing SQL queries on RDBMS or cloud-based databases
Experience with visualization tools such as Spotfire, Power BI, QuickSight, etc.
Experience in data analysis and requirements gathering
Strong problem-solving skills
Creative, driven, detail-oriented focus, with an appetite for tackling tough problems with data and insights
Natural curiosity and desire to solve problems
Must be legally authorized to work in India for any employer without sponsorship
Must be fluent in English and Hindi (written and spoken)
Successful completion of interview required to meet job qualification
Reliable, punctual attendance is an essential function of the position
Qualifications Preferred
AWS Certification preferred
Strong experience with continuous integration & delivery using Agile methodologies
Data engineering experience with the transportation/airline industry
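Since the qualifications mention PySpark among the Big Data technologies, here is a small, hedged sketch of a data-profiling job of that kind. The input path and column names are placeholders invented for illustration.

```python
# Hedged sketch: a small PySpark profiling job of the kind listed under Big Data experience.
# The S3 path and column names are placeholders, not details from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-quality-profile").getOrCreate()

# Read a hypothetical flight dataset with a header row.
df = spark.read.option("header", True).csv("s3://example-bucket/flights/*.csv")

# Basic profile: row count, distinct keys, and null counts for a key metric.
profile = df.select(
    F.count(F.lit(1)).alias("row_count"),
    F.countDistinct("flight_id").alias("distinct_flights"),
    F.sum(F.when(F.col("departure_delay").isNull(), 1).otherwise(0)).alias("null_delays"),
)
profile.show(truncate=False)

spark.stop()
```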

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald's: One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Primary Responsibilities:

Strategic Data Accessibility Leadership:
• Set the strategic direction for enterprise data accessibility, ensuring consistent and secure access across teams and platforms.
• Lead the implementation and adoption of data governance tools (e.g., Collibra) to manage metadata, lineage, and data policies.
• Champion enterprise adoption of semantic and technical metadata practices for improved discoverability and data use.

AI/ML Enablement:
• Oversee the availability, quality, and governance of data used for AI/ML model development and lifecycle management.
• Ensure that model training, validation, and deployment pipelines have reliable and timely access to governed datasets.
• Partner with MLOps, engineering, and product teams to embed data accessibility standards in model workflows.

Cloud Platform Integration:
• Oversee data accessibility initiatives in GCP and AWS, including integration with BigQuery, Redshift, and cloud-native storage.
• Develop strategies for managing access controls, encryption, and auditability of data assets across cloud environments.

Data Governance & Quality Oversight:
• Define and enforce enterprise data quality standards, including data profiling, validation, and exception-handling workflows.
• Ensure compliance with internal data policies and external regulations (e.g., GDPR, HIPAA, CCPA).
• Lead enterprise initiatives around data lifecycle management, from ingestion and processing to archival and retention.

Cross-Functional Collaboration & Leadership:
• Lead and mentor a team of data operations professionals and collaborate with data engineering, governance, AI/ML, and compliance teams.
• Provide executive-level insights and recommendations for improving enterprise data accessibility, quality, and governance practices.
• Drive alignment between business units, technical teams, and compliance functions through effective data stewardship.

Skills:
• 4 to 7 years of experience in data operations, data governance, or data quality management, with at least 3 years in a strategic leadership capacity.
• Strong hands-on and strategic experience with:
• Collibra or similar data governance platforms
• Cloud platforms: Google Cloud Platform (GCP), Amazon Web Services (AWS)
• Enterprise data warehouses such as BigQuery, Redshift, or Snowflake
• AI/ML model lifecycle support and MLOps integration
• Data quality frameworks, metadata management, and data access policy enforcement
• SQL and enterprise-scale ETL/ELT pipelines
• Strong analytical and problem-solving skills; ability to work across highly matrixed, global organizations.
• Exceptional communication, leadership, and stakeholder management skills.
• Bachelor's or Master's degree in Data Science, Information Systems, or a related field.

Preferred Experience:
• Experience in Retail or Quick Service Restaurant (QSR) environments with operational and real-time analytics needs.
• Familiarity with data mesh concepts, data product ownership, and domain-based accessibility strategies.
• Experience navigating privacy, residency, or regulatory compliance in global data environments.
• Current GCP Associate (or Professional) certification.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.

Additional Information: McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald's Capability Center India Private Limited ("McDonald's in India") is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald's in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald's in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Years of Experience: 1 - 3 years
Location: Noida, Indore

Requisition Description:
• Experience in writing and troubleshooting SQL queries
• Proficient in database and data warehousing concepts
• Proven hands-on experience in designing, developing, and supporting database projects for analysis
• Good written and verbal communication skills

Knowledge and/or experience of the following will be an added advantage:
• MDX/DAX
• Database design techniques
• Data modeling
• SSAS
• Spark processing
• Hadoop ecosystem or AWS, Azure, or GCP cluster and processing
• Hive
• Redshift or Snowflake
• Linux systems
• Tableau, MicroStrategy, Power BI, or any BI tools
• Programming in Python, Java, or Shell Script

Roles and Responsibilities:
• Interact with the senior-most technical and business people of large enterprises to understand their analytics strategy and their problem statements in that area
• Understand the customer domain and database schema
• Design OLAP semantic models and dashboards
• Be the go-to person for customers regarding technical issues during the project
• Report task status efficiently to stakeholders and customers
• Be willing to work off-hours to meet timelines
• Be willing to travel or relocate as per project requirements
• Be willing to work on different technologies

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description Amazon’s ROW (Rest of World) Supply Chain Analytics team is looking for talented Business Intelligence Engineers who develop solutions to better manage/optimize speed and operations planning while providing the best experience to our customers at the lowest possible price. Our team members have an opportunity to be at the forefront of supply chain thought leadership by working on some of the most difficult problems with some of the best research scientists, product/program managers, software developers and business leaders in the industry, shaping our roadmap to drive real impact on Amazon's long-term profitability. We are an agile team, building new analysis from ground up, proposing new concepts and technology to meet business needs, and enjoy and excel at diving into data to analyze root causes and implement long-term solutions. As a BIE within the group, you will analyze massive data sets, identify areas to improve, define metrics to measure and monitor programs, build models to predict and optimize and most importantly work with different stakeholders to drive improvements over time. You will also work closely with internal business teams to extract or mine information from our existing systems to create new analysis, build analytical products and cause impact across wider teams in intuitive ways. This position provides opportunities to influence high visibility/high impact areas in the organization. They are right a lot, work very efficiently, and routinely deliver results on time. They have a global view of the analytical and/or science solutions that they build and consistently think in terms of automating, expanding, and scaling the results broadly. This position also requires you to work across a variety of teams, including transportation, operations, finance, delivery experience, people experience and platform (software) teams. Successful candidates must thrive in fast-paced environments which encourage collaborative and creative problem solving, be able to measure and estimate risks, constructively critique peer research, extract and manipulate data across various data marts, and align research focuses on Amazon’s strategic needs. We are looking for people with a flair for recognizing trends and patterns while correlating it to the business problem at hand. If you have an uncanny ability to decipher the exact policy/mechanism/solution to address the challenge and ability to influence folks using hard data (and some tact) then we are looking for you! 
Key job responsibilities
• Analysis of historical data to identify trends and support decision making, including written and verbal presentation of results and recommendations
• Collaborating with product and software development teams to implement analytics systems and data structures to support large-scale data analysis and delivery of analytical and machine learning models
• Mining and manipulating data from database tables, simulation results, and log files
• Identifying data needs and driving data quality improvement projects
• Understanding the broad range of Amazon's data resources, which to use, how, and when
• Thought leadership on data mining and analysis
• Modeling complex/abstract problems, discovering insights, and developing solutions/products using statistics, data mining, science/machine-learning and visualization techniques
• Helping to automate processes by developing deep-dive tools, metrics, and dashboards to communicate insights to the business teams
• Collaborating effectively with internal end-users, cross-functional software development teams, and technical support/sustaining engineering teams to solve problems and implement new solutions

About The Team
ROW (Rest of World) Supply Chain Analytics team is hiring multiple BIE roles in speed, planning, inbound and SNOP functions. The role will be responsible for generating insights, defining metrics to measure and monitor, building analytical products, automation and self-serve, and overall driving business improvements. The role involves a combination of data analysis, visualization, statistics, scripting, a bit of machine learning, and usage of AWS services.

Basic Qualifications
• 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
• Experience with data visualization using Tableau, QuickSight, or similar tools
• Experience with data modeling, warehousing, and building ETL pipelines
• Experience with forecasting and statistical analysis
• Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
• Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
• Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
• Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A3022987
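For candidates gauging the day-to-day work, the posting's core loop is pulling data with SQL and processing it with Python for modeling. Below is a minimal, hedged sketch of that pattern; the connection string, schema, table, and metric names are hypothetical and do not describe Amazon's actual systems.

```python
# Minimal sketch of the SQL-plus-Python workflow described above.
# The DSN, schema, table, and column names are illustrative assumptions.
import pandas as pd
import sqlalchemy

# Redshift speaks the PostgreSQL wire protocol, so a standard Postgres DSN works here.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://user:password@example-cluster:5439/analytics"  # hypothetical
)

query = """
    SELECT shipment_date, region, AVG(delivery_days) AS avg_delivery_days
    FROM supply_chain.shipments                       -- hypothetical table
    WHERE shipment_date >= DATEADD(day, -90, CURRENT_DATE)
    GROUP BY shipment_date, region
"""

df = pd.read_sql(query, engine)

# Simple trend metric: 7-day rolling average of delivery speed per region.
df = df.sort_values("shipment_date")
df["rolling_avg"] = (
    df.groupby("region")["avg_delivery_days"]
      .transform(lambda s: s.rolling(7, min_periods=1).mean())
)
print(df.tail())
```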

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andhra Pradesh

On-site

This position will:
• Develop, integrate, and deploy Gen AI/Agentic AI solutions leveraging AWS for data engineering and Azure OpenAI for LLM-based applications
• Translate business and technical requirements into robust, scalable, and maintainable code
• Collaborate closely with architects, data engineers, and data scientists to deliver high-quality solutions
• Maintain high standards for code quality, documentation, and testing
• Continuously learn and apply new tools and techniques in AI/ML and cloud development
• Support troubleshooting, optimization, and enhancement of existing AI solutions
• Work collaboratively with ServiceNow and Workday teams to ensure AI components integrate smoothly with the virtual assistant and HR workflows
• Understand and respect the boundaries of responsibility, providing technical support for integrations without direct development on ServiceNow or Workday
• Implement Gen AI/Agentic AI use cases for the HR function
• Develop and maintain data pipelines on AWS for ingestion, transformation, and storage of structured and unstructured HR data
• Integrate Azure OpenAI LLMs with AWS-based data sources and HR applications, ensuring secure and efficient data flow
• Build and maintain APIs and microservices to support AI-driven HR processes and system integrations
• Collaborate with data scientists to operationalize models, including fine-tuning, evaluation, and deployment
• Write unit, integration, and end-to-end tests to ensure solution reliability and performance
• Document technical designs, code, and deployment processes for maintainability and knowledge sharing
• Participate in code reviews, provide constructive feedback, and contribute to team best practices
• Support the integration of AI outputs with ServiceNow, Workday, and other HR solutions by developing and maintaining well-documented APIs and data exchange mechanisms
• Engage in joint troubleshooting and validation sessions with ServiceNow and Workday teams to resolve integration issues

Education: Bachelor's degree in Computer Science, Engineering, or a related field

Experience: 3+ years of experience in AI/ML or software development, with exposure to cloud-based environments

Required Qualifications:
• Proficiency in Python and relevant ML libraries (e.g., PyTorch, TensorFlow, Hugging Face)
• Experience with AWS data engineering tools (e.g., S3, Glue, Redshift) and pipeline development
• Hands-on experience integrating and consuming Azure OpenAI or similar LLM APIs
• Familiarity with RESTful API development and microservices architecture
• Understanding of data security, privacy, and compliance considerations
• Experience working in collaborative, multi-team environments, especially on projects with platform integrations

Preferred Qualifications:
• Certifications in AWS or Azure cloud platforms
• Experience with Gen AI frameworks (e.g., LangChain, LlamaIndex)
• Knowledge of HR domain processes and integrations with systems like Workday
• Experience with CI/CD pipelines and DevOps practices
• Strong problem-solving, analytical, and communication skills
• Exposure to agile development methodologies
• Familiarity with ServiceNow and Workday integration concepts is a plus

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
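As a hedged illustration of the "integrate Azure OpenAI LLMs" responsibility above, the sketch below calls an Azure OpenAI chat deployment with the current openai Python client; the endpoint, API version, deployment name, and HR prompt are assumptions, not Evernorth's actual configuration.

```python
# Minimal sketch of calling an Azure OpenAI chat deployment (openai>=1.x client).
# Endpoint, API version, deployment name, and the HR prompt are illustrative assumptions.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def summarize_policy(policy_text: str) -> str:
    """Ask the deployed model to summarize an HR policy passage for a virtual assistant."""
    response = client.chat.completions.create(
        model="gpt-4o-hr-assistant",  # hypothetical deployment name
        messages=[
            {"role": "system", "content": "You summarize HR policy text in plain language."},
            {"role": "user", "content": policy_text},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize_policy("Employees accrue 1.5 days of paid leave per month..."))
```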

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Andhra Pradesh

On-site

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Data Engineering Advisor

Position Summary: We are looking for a Databricks Data Engineer to join our Pharmacy Benefit management Clinical space (PBS) Engineering team as part of Care Delivery Solutions. As a Data Engineer, the candidate will work with a highly agile team of developers to develop, execute, validate, and maintain the Pharmacy Benefit management Clinical space ecosystem. The candidate needs to be creative, responsive, flexible, and willing to participate in an open, collaborative peer environment and guide the team as necessary. The candidate enjoys working in a team of high performers, who hold each other accountable to perform to their very best, and does not shy away from opportunities to provide and take feedback with team members. The candidate works towards delivering a Minimal Viable Product with proper testing, avoids scope creep, and follows Software Engineering best practices as defined by Evernorth. The candidate is expected to actively participate in all ceremonies like daily stand-ups, story grooming, user story reviews, and sprint retrospectives.

About PBS Org: The current PBS Engineering organization focuses on enabling the product capabilities for the PBS business. These include the conceptualization, architecture, design, development and support functions for the Pharmacy Benefit management Clinical space business products. The strategic roadmap for PBS focuses on patient activation and routine care for various LOBs of the Pharmacy Benefit management Clinical space. The PBS Engineering Organization covers the following capabilities:
• Clinical data mart management and development of integrations with POS and Router applications
• Development of non-clinical apps
• Data integrations for the Home-based Care Engineering business
• Data interoperability
• Shared services capabilities

Responsibilities:
• Work with Solution Architects to drive the definition of the data solution design, mapping business and technical requirements to define data assets that meet both business and operational expectations.
• Own and manage data models and data design artefacts, and provide guidance and consultancy on best practice and standards for customer-focused data delivery and data management practices.
• Be an advocate for data-driven design within an agile delivery framework.
• Plan and implement procedures that will maximize engineering and operating efficiency for application integration technologies.
• Identify and drive process improvement opportunities.
• Actively participate in the full project lifecycle, from early shaping of high-level estimates and delivery plans through to active governance of the solution as it is developed and built in later phases.
• Capture and manage risks, issues, and assumptions identified through the lifecycle, articulating the financial and other impacts associated with these concerns.
• Take complete accountability for the technology assets owned by the team.
• Provide leadership to the team, ensuring it meets the following objectives: design, configuration, and implementation of middleware products and application design/development within the supported technologies and products; proactive monitoring and management design of supported assets, assuring performance, availability, security, and capacity.
• Size User Stories based on time/difficulty to complete; provide input on specific challenges facing User Stories; discuss risks, dependencies, and assumptions.
• Select User Stories to be completed in the iteration, based on User Story priority and team capacity and velocity.

Qualifications:
• Experience leading data design and delivering significant assets to an organization, e.g. Data Warehouse, Data Lake, Customer 360 Data Platform.
• Able to demonstrate experience within data capabilities such as data modelling, data migration, data quality management, and data integration, with a preference for ETL/ELT and data streaming experience.
• Experience with ETL tools such as Databricks, Apache Airflow, automation of data pipeline processes, AWS, SQL Server, Tableau, Boomi, and Power BI tool sets.
• Experience in Python, Java, or Scala. Proficiency in SQL is crucial for database management.
• Experience with big data technologies like Hadoop, Spark, and Apache Kafka.
• Experience with data warehousing solutions like Amazon Redshift or Google BigQuery.
• A track record of working successfully in a globally dispersed team would be beneficial.
• Familiarity with agile methodology, including SCRUM team leadership.
• Familiarity with modern delivery practices such as continuous integration, behavior/test driven development, and specification by example.
• Proven experience with architecture, design, and development of large-scale enterprise application solutions.
• Strong written and verbal communication skills with the ability to interact with all levels of the organization.
• Proactive participation in design sessions, Program Increment (PI) planning, and sprint refinement meetings.

Required Experience & Education:
• 3 to 5 years of IT experience, including 2 to 4 years in a Data Architecture or Data Engineering role.
• College degree (Bachelor) in related technical/business areas or equivalent work experience.

Desired Experience:
• Exposure to serverless AWS
• Exposure to EKS

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
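To illustrate the kind of Databricks work this posting describes, here is a minimal PySpark sketch that ingests raw files, applies simple quality rules, and writes a partitioned Delta table; the paths, columns, and table names are invented for illustration and are not the actual PBS data model.

```python
# Minimal Databricks-style PySpark sketch: ingest raw files, apply basic quality
# checks, and write to a Delta table. Paths, columns, and table names are
# illustrative assumptions. On Databricks, `spark` is already provided.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pbs-clinical-ingest").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-bucket/raw/clinical_events/")   # hypothetical landing path
)

cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("member_id").isNotNull())         # simple data-quality rule
       .dropDuplicates(["member_id", "event_ts"])
)

(
    cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("pbs_clinical.events")                # hypothetical target table
)
```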

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Requirements (AWS Data Engineer):
• 3 – 10 years of strong Python or Java data engineering experience
• Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas, etc., on AWS
• Experience in developing ETL, OLAP-based, and analytical applications
• Experience in ingesting batch and streaming data from various data sources
• Strong experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server, etc.)
• Exposure to the AWS platform's data services (AWS Lambda, Glue, Athena, Redshift, Kinesis, etc.)
• Experience with Airflow DAGs, AWS EMR, S3, IAM, and other services
• Experience writing test cases using pytest, unittest, or any other framework
• Snowflake or Redshift data warehouses
• Experience with DevOps and CI/CD tools
• Familiarity with REST APIs
• Experience with CI/CD pipelines, branching strategies, and Git for code management
• Bachelor's degree in computer science, information technology, or a similar field
• You will need to be well spoken and have an easy time establishing productive, long-lasting working relationships with a large variety of stakeholders
• Take the lead on data pipeline design with strong analytical skills and a keen eye for detail to really understand and tackle the challenges businesses are facing
• You will be confronted with a large variety of data engineering tools and other new technologies, as well as a wide variety of IT, compliance, and security related issues
• Design and develop world-class technology solutions to solve business problems across multiple client engagements
• Collaborate with other teams to understand business requirements, client infrastructure, platforms, and overall strategy to ensure seamless transitions
• Work closely with AI and A team to build world-class solutions and to define AI strategy
• You will possess strong logical structuring and problem-solving skills with an expert-level understanding of databases and an inherent desire to turn data into actions
• Strong verbal, written and presentation skills
• Comfortable working in Agile projects
• Clear and precise communication skills
• Ability to quickly learn and develop expertise in existing highly complex applications and architectures
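As a hedged sketch of the Airflow orchestration experience listed above, the example below defines a two-task daily DAG in the Airflow 2.4+ style; the task bodies, schedule, and IDs are placeholders, not a real pipeline.

```python
# Minimal Airflow DAG sketch for a daily batch-orchestration pattern.
# Task logic, schedule, and IDs are illustrative assumptions (Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a day's worth of source data (e.g., from an API or S3).
    print("extracting data for", context["ds"])


def transform(**context):
    # Placeholder: clean/enrich the extracted data with pandas or PySpark.
    print("transforming data for", context["ds"])


with DAG(
    dag_id="daily_ingest_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```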

Posted 1 week ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Summary
We are looking for a skilled AWS Data Engineer with strong experience in building and managing cloud-based ETL pipelines using AWS Glue, Python/PySpark, and Athena, along with data warehousing expertise in Amazon Redshift. The ideal candidate will be responsible for designing, developing, and maintaining scalable data solutions in a cloud-native environment.

• Design and implement ETL workflows using AWS Glue, Python, and PySpark.
• Develop and optimize queries using Amazon Athena and Redshift.
• Build scalable data pipelines to ingest, transform, and load data from various sources.
• Ensure data quality, integrity, and security across AWS services.
• Collaborate with data analysts, data scientists, and business stakeholders to deliver data solutions.
• Monitor and troubleshoot ETL jobs and cloud infrastructure performance.
• Automate data workflows and integrate with CI/CD pipelines.

Required Skills & Qualifications
• Hands-on experience with AWS Glue, Athena, and Redshift.
• Strong programming skills in Python and PySpark.
• Experience with ETL design, implementation, and optimization.
• Familiarity with S3, Lambda, CloudWatch, and other AWS services.
• Understanding of data warehousing concepts and performance tuning in Redshift.
• Experience with schema design, partitioning, and query optimization in Athena.
• Proficiency in version control (Git) and agile development practices.
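For illustration, a minimal AWS Glue (PySpark) job matching the ETL pattern in this posting might look like the sketch below; the Glue Catalog database, table, columns, and S3 paths are assumptions.

```python
# Minimal AWS Glue (PySpark) job sketch for a catalog-to-S3 ETL step.
# Database, table, column, and S3 path names are illustrative assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (hypothetical names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

# Basic transform: keep valid rows and derive a partition column.
curated = (
    orders.filter(F.col("order_id").isNotNull())
          .withColumn("order_date", F.to_date("order_ts"))
)

# Write partitioned Parquet back to S3 for querying with Athena.
(
    curated.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")
)

job.commit()
```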

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
We are looking for a skilled AWS Data Engineer with strong experience in building and managing cloud-based ETL pipelines using AWS Glue, Python/PySpark, and Athena, along with data warehousing expertise in Amazon Redshift. The ideal candidate will be responsible for designing, developing, and maintaining scalable data solutions in a cloud-native environment.

• Design and implement ETL workflows using AWS Glue, Python, and PySpark.
• Develop and optimize queries using Amazon Athena and Redshift.
• Build scalable data pipelines to ingest, transform, and load data from various sources.
• Ensure data quality, integrity, and security across AWS services.
• Collaborate with data analysts, data scientists, and business stakeholders to deliver data solutions.
• Monitor and troubleshoot ETL jobs and cloud infrastructure performance.
• Automate data workflows and integrate with CI/CD pipelines.

Required Skills & Qualifications
• Hands-on experience with AWS Glue, Athena, and Redshift.
• Strong programming skills in Python and PySpark.
• Experience with ETL design, implementation, and optimization.
• Familiarity with S3, Lambda, CloudWatch, and other AWS services.
• Understanding of data warehousing concepts and performance tuning in Redshift.
• Experience with schema design, partitioning, and query optimization in Athena.
• Proficiency in version control (Git) and agile development practices.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Responsibilities / Qualifications
• Working experience in Python, PySpark, and Databricks.
• Candidate must have 3 to 5 years of IT working experience; at least 3 years of experience on the AWS Cloud environment is preferred.
• Ability to understand the existing system architecture and work towards the target architecture.
• Experience with data profiling activities, discovering data quality challenges, and documenting them.
• Experience with development and implementation of large-scale Data Lake and data analytics platforms on the AWS Cloud platform.
• Develop and unit test data pipeline architecture for data ingestion processes using AWS native services.
• Experience with development on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Glue Data Catalog, Lake Formation, Apache Airflow, Lambda, etc.
• Experience with development of a data governance framework, including the management of data, operating model, data policies and standards.
• Experience with orchestration of workflows in an enterprise environment.
• Working experience with Agile methodology.
• Experience working with source code management tools such as AWS CodeCommit or GitHub.
• Experience working with Jenkins or any CI/CD pipelines using AWS services.
• Experience working with an on-shore/off-shore model and collaboratively working on deliverables.
• Good communication skills to interact with the onshore team.
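As a hedged example of the "develop and unit test data pipelines" item above, the sketch below unit-tests a toy PySpark transformation with pytest; the transform and its columns are illustrative only.

```python
# Minimal pytest sketch for unit-testing a PySpark transformation.
# The transform and column names are illustrative assumptions.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_valid_flag(df):
    """Toy transform: flag rows whose amount is non-null and positive."""
    return df.withColumn(
        "is_valid", F.col("amount").isNotNull() & (F.col("amount") > 0)
    )


@pytest.fixture(scope="session")
def spark():
    # Local Spark session for fast, isolated tests.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_add_valid_flag(spark):
    df = spark.createDataFrame(
        [("a", 10.0), ("b", None), ("c", -5.0)], ["id", "amount"]
    )
    result = {r["id"]: r["is_valid"] for r in add_valid_flag(df).collect()}
    assert result == {"a": True, "b": False, "c": False}
```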

Posted 1 week ago

Apply

7.0 years

32 - 35 Lacs

Bhubaneswar, Odisha, India

Remote

Experience : 7.00 + years Salary : INR 3200000-3500000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: Socialtrait) (*Note: This is a requirement for one of Uplers' client - California based AI-driven insights and audience analytics agency) What do you need for this opportunity? Must have skills required: BI Products, BigQuery, Embedded AI into Saas products, Predictive Analytics, PowerBI, Snowflake, Google Cloud Platform, Python, SQL, M - Code California based AI-driven insights and audience analytics agency is Looking for: Senior Power BI & Consumer Insights Specialist Remote Full-time Data & Insights Why this role matters Socialtrait’s AI platform captures millions of real-time consumer signals through virtual AI communities. Socialtrait AI is a fast-growing analytics and intelligence platform helping brands understand their audience, performance, and competitors across digital and social channels. We're driven by data and obsessed with delivering actionable insights that make an impact. We need a builder who can transform those streams into razor-sharp dashboards that brand, product, and marketing teams act on daily. You’ll be the go-to Power BI expert, owning the full build-run-optimise cycle of dashboards that guide C-level decisions for global consumer brands—no line management, pure impact. What You’ll Do Design & ship dashboards end-to-end – wireframe, model, develop, and deploy Power BI workspaces that surface campaign performance, competitive moves, social buzz, and conversion KPIs in minutes, not weeks. Tell insight-rich stories – turn data into narratives that brand managers, CMOs, and product teams can take to the board. Engineer robust data models – build scalable semantic layers across SQL warehouses (BigQuery, Snowflake, Redshift) and behavioural APIs. Push Power BI to its limits – advanced DAX, M-code, incremental refresh, and performance tuning so reports load in under three seconds. Embed with clients & stakeholders – join working sessions with Fortune 500 insights teams; translate hypotheses into metrics and experiments. Prototype the future – pilot AI-assisted insight generation, embedded analytics, and real-time sentiment widgets. The calibre we’re after 7+ years crafting enterprise BI products, 4+ years deep in Power BI. Proven success delivering dashboards for consumer-facing organisations (CPG, retail, media, fintech, or D2C) where insights directly shaped product or campaign strategy. Master-level DAX, Power Query, and SQL; comfortable scripting in Python or R for heavier modelling. Fluency with cloud data platforms. Demonstrated ability to influence executives through data—your dashboards have redirected budgets or product roadmaps. Bonus: predictive analytics, time-series forecasting, or embedding BI into SaaS products. How We’ll Support You Competitive salary + meaningful equity upside. A culture that values truthful insights over buzzwords—your work becomes the daily heartbeat of decision-making. Our hiring process Intro chat (30 min) – mutual fit & mission alignment. Technical deep-dive – walk us through a dashboard you’re proud of (screenshare). Case challenge – you redesign a key view from an anonymised consumer dataset in Power BI and discuss your choices. Exec panel – strategy discussion with CEO, COO, and Head of Product. Offer & roadmap session – align on your first-90-day impact plan. 
Ready to build the dashboards that power the next wave of consumer-insight AI? Let’s talk How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

7.0 years

32 - 35 Lacs

Kolkata, West Bengal, India

Remote

Experience : 7.00 + years Salary : INR 3200000-3500000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: Socialtrait) (*Note: This is a requirement for one of Uplers' client - California based AI-driven insights and audience analytics agency) What do you need for this opportunity? Must have skills required: BI Products, BigQuery, Embedded AI into Saas products, Predictive Analytics, PowerBI, Snowflake, Google Cloud Platform, Python, SQL, M - Code California based AI-driven insights and audience analytics agency is Looking for: Senior Power BI & Consumer Insights Specialist Remote Full-time Data & Insights Why this role matters Socialtrait’s AI platform captures millions of real-time consumer signals through virtual AI communities. Socialtrait AI is a fast-growing analytics and intelligence platform helping brands understand their audience, performance, and competitors across digital and social channels. We're driven by data and obsessed with delivering actionable insights that make an impact. We need a builder who can transform those streams into razor-sharp dashboards that brand, product, and marketing teams act on daily. You’ll be the go-to Power BI expert, owning the full build-run-optimise cycle of dashboards that guide C-level decisions for global consumer brands—no line management, pure impact. What You’ll Do Design & ship dashboards end-to-end – wireframe, model, develop, and deploy Power BI workspaces that surface campaign performance, competitive moves, social buzz, and conversion KPIs in minutes, not weeks. Tell insight-rich stories – turn data into narratives that brand managers, CMOs, and product teams can take to the board. Engineer robust data models – build scalable semantic layers across SQL warehouses (BigQuery, Snowflake, Redshift) and behavioural APIs. Push Power BI to its limits – advanced DAX, M-code, incremental refresh, and performance tuning so reports load in under three seconds. Embed with clients & stakeholders – join working sessions with Fortune 500 insights teams; translate hypotheses into metrics and experiments. Prototype the future – pilot AI-assisted insight generation, embedded analytics, and real-time sentiment widgets. The calibre we’re after 7+ years crafting enterprise BI products, 4+ years deep in Power BI. Proven success delivering dashboards for consumer-facing organisations (CPG, retail, media, fintech, or D2C) where insights directly shaped product or campaign strategy. Master-level DAX, Power Query, and SQL; comfortable scripting in Python or R for heavier modelling. Fluency with cloud data platforms. Demonstrated ability to influence executives through data—your dashboards have redirected budgets or product roadmaps. Bonus: predictive analytics, time-series forecasting, or embedding BI into SaaS products. How We’ll Support You Competitive salary + meaningful equity upside. A culture that values truthful insights over buzzwords—your work becomes the daily heartbeat of decision-making. Our hiring process Intro chat (30 min) – mutual fit & mission alignment. Technical deep-dive – walk us through a dashboard you’re proud of (screenshare). Case challenge – you redesign a key view from an anonymised consumer dataset in Power BI and discuss your choices. Exec panel – strategy discussion with CEO, COO, and Head of Product. Offer & roadmap session – align on your first-90-day impact plan. 
Ready to build the dashboards that power the next wave of consumer-insight AI? Let’s talk How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

7.0 years

32 - 35 Lacs

Raipur, Chhattisgarh, India

Remote

Experience : 7.00 + years Salary : INR 3200000-3500000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: Socialtrait) (*Note: This is a requirement for one of Uplers' client - California based AI-driven insights and audience analytics agency) What do you need for this opportunity? Must have skills required: BI Products, BigQuery, Embedded AI into Saas products, Predictive Analytics, PowerBI, Snowflake, Google Cloud Platform, Python, SQL, M - Code California based AI-driven insights and audience analytics agency is Looking for: Senior Power BI & Consumer Insights Specialist Remote Full-time Data & Insights Why this role matters Socialtrait’s AI platform captures millions of real-time consumer signals through virtual AI communities. Socialtrait AI is a fast-growing analytics and intelligence platform helping brands understand their audience, performance, and competitors across digital and social channels. We're driven by data and obsessed with delivering actionable insights that make an impact. We need a builder who can transform those streams into razor-sharp dashboards that brand, product, and marketing teams act on daily. You’ll be the go-to Power BI expert, owning the full build-run-optimise cycle of dashboards that guide C-level decisions for global consumer brands—no line management, pure impact. What You’ll Do Design & ship dashboards end-to-end – wireframe, model, develop, and deploy Power BI workspaces that surface campaign performance, competitive moves, social buzz, and conversion KPIs in minutes, not weeks. Tell insight-rich stories – turn data into narratives that brand managers, CMOs, and product teams can take to the board. Engineer robust data models – build scalable semantic layers across SQL warehouses (BigQuery, Snowflake, Redshift) and behavioural APIs. Push Power BI to its limits – advanced DAX, M-code, incremental refresh, and performance tuning so reports load in under three seconds. Embed with clients & stakeholders – join working sessions with Fortune 500 insights teams; translate hypotheses into metrics and experiments. Prototype the future – pilot AI-assisted insight generation, embedded analytics, and real-time sentiment widgets. The calibre we’re after 7+ years crafting enterprise BI products, 4+ years deep in Power BI. Proven success delivering dashboards for consumer-facing organisations (CPG, retail, media, fintech, or D2C) where insights directly shaped product or campaign strategy. Master-level DAX, Power Query, and SQL; comfortable scripting in Python or R for heavier modelling. Fluency with cloud data platforms. Demonstrated ability to influence executives through data—your dashboards have redirected budgets or product roadmaps. Bonus: predictive analytics, time-series forecasting, or embedding BI into SaaS products. How We’ll Support You Competitive salary + meaningful equity upside. A culture that values truthful insights over buzzwords—your work becomes the daily heartbeat of decision-making. Our hiring process Intro chat (30 min) – mutual fit & mission alignment. Technical deep-dive – walk us through a dashboard you’re proud of (screenshare). Case challenge – you redesign a key view from an anonymised consumer dataset in Power BI and discuss your choices. Exec panel – strategy discussion with CEO, COO, and Head of Product. Offer & roadmap session – align on your first-90-day impact plan. 
Ready to build the dashboards that power the next wave of consumer-insight AI? Let’s talk How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

7.0 years

32 - 35 Lacs

Ranchi, Jharkhand, India

Remote

Experience : 7.00 + years Salary : INR 3200000-3500000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: Socialtrait) (*Note: This is a requirement for one of Uplers' client - California based AI-driven insights and audience analytics agency) What do you need for this opportunity? Must have skills required: BI Products, BigQuery, Embedded AI into Saas products, Predictive Analytics, PowerBI, Snowflake, Google Cloud Platform, Python, SQL, M - Code California based AI-driven insights and audience analytics agency is Looking for: Senior Power BI & Consumer Insights Specialist Remote Full-time Data & Insights Why this role matters Socialtrait’s AI platform captures millions of real-time consumer signals through virtual AI communities. Socialtrait AI is a fast-growing analytics and intelligence platform helping brands understand their audience, performance, and competitors across digital and social channels. We're driven by data and obsessed with delivering actionable insights that make an impact. We need a builder who can transform those streams into razor-sharp dashboards that brand, product, and marketing teams act on daily. You’ll be the go-to Power BI expert, owning the full build-run-optimise cycle of dashboards that guide C-level decisions for global consumer brands—no line management, pure impact. What You’ll Do Design & ship dashboards end-to-end – wireframe, model, develop, and deploy Power BI workspaces that surface campaign performance, competitive moves, social buzz, and conversion KPIs in minutes, not weeks. Tell insight-rich stories – turn data into narratives that brand managers, CMOs, and product teams can take to the board. Engineer robust data models – build scalable semantic layers across SQL warehouses (BigQuery, Snowflake, Redshift) and behavioural APIs. Push Power BI to its limits – advanced DAX, M-code, incremental refresh, and performance tuning so reports load in under three seconds. Embed with clients & stakeholders – join working sessions with Fortune 500 insights teams; translate hypotheses into metrics and experiments. Prototype the future – pilot AI-assisted insight generation, embedded analytics, and real-time sentiment widgets. The calibre we’re after 7+ years crafting enterprise BI products, 4+ years deep in Power BI. Proven success delivering dashboards for consumer-facing organisations (CPG, retail, media, fintech, or D2C) where insights directly shaped product or campaign strategy. Master-level DAX, Power Query, and SQL; comfortable scripting in Python or R for heavier modelling. Fluency with cloud data platforms. Demonstrated ability to influence executives through data—your dashboards have redirected budgets or product roadmaps. Bonus: predictive analytics, time-series forecasting, or embedding BI into SaaS products. How We’ll Support You Competitive salary + meaningful equity upside. A culture that values truthful insights over buzzwords—your work becomes the daily heartbeat of decision-making. Our hiring process Intro chat (30 min) – mutual fit & mission alignment. Technical deep-dive – walk us through a dashboard you’re proud of (screenshare). Case challenge – you redesign a key view from an anonymised consumer dataset in Power BI and discuss your choices. Exec panel – strategy discussion with CEO, COO, and Head of Product. Offer & roadmap session – align on your first-90-day impact plan. 
Ready to build the dashboards that power the next wave of consumer-insight AI? Let’s talk How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply