
1759 Redshift Jobs - Page 35

JobPe aggregates listings so they are easy to find in one place, but applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Specialist - Data Visualization

Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients by leveraging digital, data and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization into a premier data-driven company.

As a Specialist in Data Visualization, you will design and develop compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities

- Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
- Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience and Skills

- 5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
- Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
- Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python (see the sketch after this listing).
- Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
- Experience in pharmaceutical commercial analytics, including field force effectiveness, customer engagement, and market performance assessment, as well as web, campaign, and digital engagement analytics.
- Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
- Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
- Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Our Human Health Division maintains a "patient first, profits later" ideology. The organization is composed of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.

We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another's thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE. Current Contingent Workers apply HERE.

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design
Job Posting End Date: 04/30/2025. A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R334900
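As a purely illustrative companion to the SQL and Python requirement above, here is a minimal sketch of feeding a dashboard extract from Redshift via pandas. The cluster endpoint, schema, and column names are all invented; Redshift's PostgreSQL wire compatibility is what lets psycopg2 connect.

```python
import pandas as pd
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

# Hypothetical connection details and table/column names, for illustration only.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="dashboard_reader",
    password="...",  # in practice, pull from a secrets manager
)

# Aggregate engagement metrics into a small, dashboard-friendly extract.
query = """
    SELECT region,
           DATE_TRUNC('month', activity_date) AS activity_month,
           COUNT(DISTINCT hcp_id)             AS engaged_hcps
    FROM commercial.engagement_events
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

df = pd.read_sql(query, conn)
conn.close()
print(df.head())
```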

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Overview

Working at Atlassian: Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, as part of being a distributed-first company.

Responsibilities

On your first day, we'll expect you to have:

- A BS in Computer Science or equivalent experience, with 3+ years as a Data Engineer or in a similar role
- Programming skills in Python; Java is good to have
- The ability to design data models for storage and retrieval that meet product and business requirements
- Experience building scalable data pipelines using Spark, Airflow, AWS data services (Redshift, Athena, EMR), and Apache projects (Spark, Flink, Hive, and Kafka) – see the pipeline sketch after this listing
- Familiarity with modern software development practices (Agile, TDD, CI/CD) applied to data engineering
- Experience enhancing data quality through internal tools/frameworks that detect DQ issues
- Working knowledge of relational databases and SQL query authoring

We'd be super excited if you have followed a Kappa architecture in any of your previous deployments and have domain knowledge of finance/financial systems.

Qualifications

Our perks & benefits: Atlassian offers a variety of perks and benefits to support you and your family and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more.

About Atlassian

At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together.

We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines.

To provide you the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them.

To learn more about our culture and hiring process, visit go.atlassian.com/crh .
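The pipeline requirement above is the kind of thing usually expressed as an orchestrated DAG. Below is a minimal, hypothetical Airflow sketch (assuming Airflow 2.4+); the DAG id, task bodies, and schedule are invented placeholders, not Atlassian's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw events from a source (e.g., an S3 prefix) for the run date.
    print(f"extracting for {context['ds']}")


def load(**context):
    # Placeholder: COPY the transformed partition into Redshift.
    print(f"loading for {context['ds']}")


with DAG(
    dag_id="daily_events_pipeline",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```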

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Overview

Working at Atlassian: Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, as part of being a distributed-first company.

Responsibilities

On your first day, we'll expect you to have:

- A BS in Computer Science or equivalent experience, with 3+ years as a Data Engineer or in a similar role
- Programming skills in Python; Java is good to have
- The ability to design data models for storage and retrieval that meet product and business requirements
- Experience building scalable data pipelines using Spark, Airflow, AWS data services (Redshift, Athena, EMR), and Apache projects (Spark, Flink, Hive, and Kafka)
- Familiarity with modern software development practices (Agile, TDD, CI/CD) applied to data engineering
- Experience enhancing data quality through internal tools/frameworks that detect DQ issues
- Working knowledge of relational databases and SQL query authoring

We'd be super excited if you have followed a Kappa architecture in any of your previous deployments and have domain knowledge of finance/financial systems (a streaming sketch of the Kappa pattern follows this listing).

Qualifications

Our perks & benefits: Atlassian offers a variety of perks and benefits to support you and your family and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more.

About Atlassian

At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together.

We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines.

To provide you the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them.

To learn more about our culture and hiring process, visit go.atlassian.com/crh .
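Since this listing calls out Kappa architecture, here is a hedged sketch of the core pattern using Spark Structured Streaming over Kafka. Broker, topic, and bucket names are invented; this is a generic illustration of the pattern, not Atlassian's actual stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kappa-demo").getOrCreate()

# Read an unbounded stream of events from Kafka (topic name is hypothetical).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "payments")
    .load()
)

# Kafka delivers key/value as binary; decode the fields we need.
parsed = events.select(
    F.col("value").cast("string").alias("raw"),
    F.col("timestamp"),
)

# Write the stream out as Parquet. The same job definition serves both
# replay (reprocessing from the topic's retention) and live processing,
# which is the essence of a Kappa architecture.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")  # hypothetical bucket
    .option("checkpointLocation", "s3a://example-bucket/chk/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```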

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description

Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description

We are looking for a FE Sr Software Engineer responsible for designing, developing, building, testing, and deploying web-tier applications, with an AWS technologies background, to work on our new platform and to support maintenance activities. ECS Engineers are hands-on coders as well as high-level designers and thinkers - they are responsible for developing a deep understanding of our existing platform, transforming it onto new technologies for maintainability in a high-activity environment, and designing the integrations between various systems.

- Offers insights and analysis on the existing codebase; performance considerations are second nature.
- Works effectively in a collaborative environment; you will report to a manager.
- Participates in estimates for projects based on interface wireframes and desired functionality.
- Develops reusable patterns and encourages innovation that will improve team velocity.
- Researches emerging topics related to assigned tasks and comes up with good solutions.
- Participates in all pertinent project meetings.
- Prioritizes assigned tasks and keeps the manager up to date on status and roadblocks.
- Experience with technologies like server-side rendering and MFEs (micro-frontends).
- Strong understanding of unit and integration testing with Jest / react-testing-library.
- Experience building secure software.
- Experience implementing Cloud/Hybrid Cloud AWS solutions.
- Experience in configuration management using Git, issue tracking, estimation, and Agile practices.
- A positive, active team member who offers opinions in a constructive manner and networks with senior members in their area of expertise.

Qualifications

- Bachelor's degree or equivalent experience; typically requires 6 to 8 years of related experience.
- 6+ years' experience with front-end development using JavaScript, ES6, TypeScript, HTML, CSS/SCSS, and responsive design.
- 4+ years' experience building responsive, single-page web applications using React/Redux, with React Hooks experience.
- Strong understanding of unit and integration testing with Jest / react-testing-library.
- Familiarity with modern front-end build pipelines and tools.
- Experience building secure software.
- Works effectively in a collaborative environment.
- Experience with a data warehouse like Snowflake, Redshift, or Spark.

Additional Information

Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.

Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social media or our Careers Site to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits

Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer the best family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: Data Engineer
Location: Tardeo, Mumbai
YOE: 2-5 yrs
Notice period: Need IMMEDIATE JOINERS ONLY (in-office drive tomorrow)

Must Have:

- Bachelor's/Master's degree, preferably in Computer Science or a related technical field.
- 2-5 years of relevant experience.
- Deep knowledge and working experience of the Kafka ecosystem.
- Good programming experience, preferably in Python, Java, or Go, and a willingness to learn more.
- Experience working with large data platforms.
- Strong knowledge of microservices, data warehouse, and data lake systems in the cloud, especially AWS Redshift, S3, and Glue.
- Strong hands-on experience in writing complex and efficient ETL jobs (see the Kafka-to-S3 sketch after this listing).
- Experience with version management systems (preferably Git).
- Strong analytical thinking and communication.
- Passion for finding and sharing best practices and driving discipline for superior data quality and integrity.
- Intellectual curiosity to find new and unusual ways to solve data management issues.

Responsibilities:

- Design and build systems to efficiently move data across multiple systems and make it available for various teams such as Data Science, Data Analytics, and Product.
- Design, construct, test, and maintain data management systems.
- Understand the data and business metrics required by the product and architect systems to make that data available in a usable/queryable manner.
- Ensure that all systems meet the business/company requirements as well as industry best practices.
- Keep abreast of new technologies in the domain.
- Recommend ways to constantly improve data reliability and quality.

Note: The budget for this role is on the lower side. Please apply according to your current compensation. Kindly read about Recro before applying.
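To illustrate the Kafka-plus-S3 combination this listing asks for, a minimal consumer that lands micro-batches in S3 might look like the sketch below, using the kafka-python and boto3 libraries. The topic, consumer group, and bucket names are hypothetical.

```python
import json

import boto3
from kafka import KafkaConsumer  # kafka-python package

# Topic, group, and bucket names below are invented for illustration.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers=["broker:9092"],
    group_id="s3-sink",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

s3 = boto3.client("s3")
batch, BATCH_SIZE = [], 500

for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        # Land a micro-batch as one JSON-lines object, keyed by partition/offset.
        key = f"raw/order-events/{message.partition}-{message.offset}.jsonl"
        body = "\n".join(json.dumps(r) for r in batch)
        s3.put_object(Bucket="example-data-lake", Key=key, Body=body.encode())
        batch.clear()
```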

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Lead Data Engineer
Location: All EXL Locations
Experience: 10 to 15 Years

Job Summary

The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyze, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching.

The role provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges. It works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices, and is responsible for repeatable, lean and maintainable enterprise BI design across organizations, partnering effectively with the client team.

We value leadership not only in the conventional sense, but also within a team: we expect people to be leaders. Candidates should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities:

- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, and data testing plans.
- Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and to confirm the ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else that is data related at the project or business unit levels.
- Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
- Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Must have:

- Ability to write code in a programming language, with working experience in Python, PySpark, Databricks, Scala or similar.
- Data pipeline development and management: design, develop, and maintain ETL (Extract, Transform, Load) pipelines using AWS services like AWS Glue, AWS Data Pipeline, Lambda, and Step Functions.
- Implement incremental data processing using tools like Apache Spark (EMR), Kinesis, and Kafka.
- Work with AWS data storage solutions such as Amazon S3, Redshift, RDS, DynamoDB, and Aurora.
- Optimize data partitioning, compression, and indexing for efficient querying and cost optimization (see the sketch after this listing).
- Implement data lake architecture using AWS Lake Formation and the Glue Catalog.
- Implement CI/CD pipelines for data workflows using CodePipeline, CodeBuild, and GitHub Actions.

Good to have:

- Enterprise data modelling and semantic modelling, with working experience in ERwin, ER/Studio, PowerDesigner or similar.
- Logical/physical modelling on big data sets or a modern data warehouse, with working experience in ERwin, ER/Studio, PowerDesigner or similar.
- Agile process (Scrum cadences, roles, deliverables) and a basic understanding of Azure DevOps, JIRA or similar.

Key skills: Python, PySpark, AWS, Databricks, SQL.
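As a small illustration of the partitioning and compression point above, a PySpark job that writes a partitioned, compressed Parquet dataset to S3 could look like this sketch; the paths and column names are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioned-load").getOrCreate()

# Paths and columns below are hypothetical.
orders = spark.read.parquet("s3a://example-raw/orders/")

# Derive a date partition column so downstream engines (Athena, Redshift
# Spectrum) can prune partitions instead of scanning the full dataset.
orders = orders.withColumn("dt", F.to_date("order_ts"))

(
    orders.write.mode("overwrite")
    .partitionBy("dt")                # enables physical partition pruning
    .option("compression", "snappy")  # cheap to decompress, good scan speed
    .parquet("s3a://example-curated/orders/")
)
```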

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Summary

We are seeking a highly experienced and strategic Lead Data Architect with 8+ years of hands-on experience in designing and leading data architecture initiatives. This individual will play a critical role in building scalable, secure, and high-performance data solutions that support enterprise-wide analytics, reporting, and operational systems. The ideal candidate will be both technically proficient and business-savvy, capable of translating complex data needs into innovative architecture designs.

Key Responsibilities

- Design and implement enterprise-wide data architecture to support business intelligence, advanced analytics, and operational data needs.
- Define and enforce standards for data modeling, integration, quality, and governance.
- Lead the adoption and integration of modern data platforms (data lakes, data warehouses, streaming, etc.).
- Develop architecture blueprints, frameworks, and roadmaps aligned with business objectives.
- Ensure data security, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
- Collaborate with business, engineering, and analytics teams to deliver high-impact data solutions.
- Provide mentorship and technical leadership to data engineers and junior architects.
- Evaluate emerging technologies and provide recommendations for future-state architectures.

Required Qualifications

- 8+ years of experience in data architecture, data engineering, or a similar senior technical role.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Expertise in designing and managing large-scale data systems using cloud platforms (AWS, Azure, or GCP).
- Strong proficiency in data modeling (relational, dimensional, NoSQL) and modern database systems (e.g., Snowflake, BigQuery, Redshift).
- Hands-on experience with data integration tools (e.g., Apache NiFi, Talend, Informatica) and orchestration tools (e.g., Airflow).
- In-depth knowledge of data governance, metadata management, and data cataloging solutions.
- Experience with real-time and batch data processing frameworks, including streaming technologies like Kafka.
- Excellent leadership, communication, and cross-functional collaboration skills.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Join us as a Data Engineering Lead

- This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
- You'll be simplifying the bank by developing innovative data-driven solutions, aiming to be commercially successful through insight, and keeping our customers' and the bank's data safe and secure
- Participating actively in the data engineering community, you'll deliver opportunities to support our strategic direction while building your network across the bank
- We're recruiting for multiple roles across a range of levels, up to and including experienced managers

What you'll do

We'll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You'll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers.

We'll also expect you to be:

- Working with Data Scientists and Analytics Labs to translate analytical model code into well-tested, production-ready code (see the sketch after this listing)
- Helping to define common coding standards and model performance monitoring best practices
- Owning and delivering the automation of data engineering pipelines through the removal of manual stages
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
- Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists
- Leading and developing solutions for streaming data ingestion and transformations in line with our streaming strategy

The skills you'll need

To be successful in this role, you'll need to be an expert-level programmer and data engineer with a qualification in Computer Science or Software Engineering. You'll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large-scale data.

We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure.

You'll also demonstrate:

- Knowledge of core computer science concepts such as common data structures and algorithms, profiling and optimisation
- An understanding of machine learning, information retrieval or recommendation systems
- Good working knowledge of CI/CD tools
- Knowledge of programming languages used in data engineering, such as Python or PySpark, SQL, Java, and Scala
- An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow
- Knowledge of messaging, event or streaming technology such as Apache Kafka
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMSs, ETL pipelines, Python, Hadoop and SQL
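The first expectation above, turning analytical model code into well-tested production code, is commonly demonstrated with unit tests around feature logic. A minimal pytest-style sketch follows; the feature function and its columns are entirely hypothetical.

```python
import pandas as pd


def add_txn_features(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical feature step: 30-day rolling spend per customer."""
    out = df.sort_values(["customer_id", "txn_date"]).reset_index(drop=True)
    out["rolling_spend_30d"] = (
        out.groupby("customer_id")
        .rolling("30D", on="txn_date")["amount"]
        .sum()
        .to_numpy()  # group-major order matches the sorted frame
    )
    return out


def test_rolling_spend_sums_within_window():
    df = pd.DataFrame(
        {
            "customer_id": [1, 1, 1],
            "txn_date": pd.to_datetime(["2025-01-01", "2025-01-10", "2025-03-01"]),
            "amount": [100.0, 50.0, 25.0],
        }
    )
    result = add_txn_features(df)
    # The second transaction falls within 30 days of the first: 100 + 50.
    assert result["rolling_spend_30d"].iloc[1] == 150.0
    # The third transaction has no prior spend inside its 30-day window.
    assert result["rolling_spend_30d"].iloc[2] == 25.0
```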

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Overview

CashKaro is India's #1 cashback platform, trusted by over 25 million users! We drive more sales for Amazon, Flipkart, Myntra, and Ajio than any other paid channel, including Google and Meta. Backed by legendary investor Ratan Tata and a recent $16 million boost from Affle, we're on a rocket-ship journey, already surpassing ₹300 crore in revenue and racing towards ₹500 crore.

EarnKaro, our influencer referral platform, is trusted by over 500,000 influencers and sends more traffic to leading online retailers than any other platform. Whether it's micro-influencers or top-tier creators, they choose EarnKaro to monetize their networks. BankKaro, our latest venture, is rapidly becoming India's go-to FinTech aggregator.

Join our dynamic team and help shape the future of online shopping, influencer marketing, and financial technology in India!

Role Overview

As a Product Analyst, you will play a pivotal role in enabling data-driven product decisions. You will be responsible for deep-diving into product usage data, building dashboards and reports, optimizing complex queries, and driving feature-level insights that directly influence user engagement, retention, and experience.

Key Responsibilities

- Feature usage & adoption analysis: analyze event data to understand feature usage, retention trends, and product interaction patterns across web and app.
- User journey & funnel analysis: build funnel views and dashboards to identify drop-offs, friction points, and opportunities for UX or product improvements.
- Product usage & retention analytics: analyze user behavior, cohort trends, and retention using Redshift and BigQuery datasets. Partner with Product Managers to design and track core product KPIs.
- SQL development & optimization: write and optimize complex SQL queries across Redshift and BigQuery. Build and maintain views, stored procedures, and data models for scalable analytics.
- Dashboarding & BI reporting: create and maintain high-quality Power BI dashboards to track DAU/WAU/MAU, feature adoption, engagement %, and drop-off trends.
- Light data engineering: use Python (Pandas/NumPy) for data cleaning, transformation, and quick exploratory analysis.
- Business insight generation: translate business questions into structured analyses and insights that inform product and business strategy.

Must-Have Skills

- Expert-level SQL across Redshift and BigQuery, including performance tuning, window functions, and procedure creation (a sample retention query appears after this listing).
- Strong skills in Power BI (or Tableau), with the ability to build actionable, intuitive dashboards.
- Working knowledge of Python (Pandas) for quick data manipulation and ad-hoc analytics.
- Deep understanding of product metrics: DAU, retention, feature usage, funnel performance.
- Strong business acumen: the ability to connect data with user behavior and product outcomes.
- Clear communication and storytelling skills to present data insights to cross-functional teams.

Good to Have

- Experience with mobile product analytics (Android & iOS).
- Understanding of funnel, cohort, engagement, and retention metrics.
- Familiarity with A/B testing tools and frameworks.
- Experience working with Redshift, BigQuery, or cloud-based data pipelines.
- Certifications in Google Analytics, Firebase, or other analytics platforms.

Why Join Us?

- High ownership: drive key metrics for products used by millions.
- Collaborative culture: work closely with founders, product, and tech teams.
- Competitive package: best-in-class compensation, ESOPs, and perks.
- Great environment: hybrid work, medical insurance, lunches, and learning budgets.

Ensuring a diverse and inclusive workplace where we learn from each other is core to CK's values. CashKaro.com and EarnKaro.com are Equal Employment Opportunity and Affirmative Action Employers. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status.

CashKaro.com and EarnKaro.com will not pay any third-party agency or company that does not have a signed agreement with CashKaro.com and EarnKaro.com. Pouring Pounds India Pvt. Ltd. will not pay any third-party agency or company that does not have a signed agreement with CashKaro.com and EarnKaro.com.

Visit our Career Page at https://cashkaro.com/page/careers
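As an illustration of the retention analytics and window-style SQL described above, here is a hedged sketch of a cohort-retention query against Redshift, pivoted into the classic cohort triangle with pandas. The cluster endpoint and all table/column names are invented.

```python
import pandas as pd
import redshift_connector  # Amazon's Python driver for Redshift

# A common cohort-retention shape: assign each user to the month of their
# first event, then count how many are still active N months later.
RETENTION_SQL = """
WITH firsts AS (
    SELECT user_id,
           MIN(DATE_TRUNC('month', event_ts)) AS cohort_month
    FROM app.events
    GROUP BY user_id
)
SELECT f.cohort_month,
       DATEDIFF(month, f.cohort_month, DATE_TRUNC('month', e.event_ts)) AS month_n,
       COUNT(DISTINCT e.user_id) AS active_users
FROM app.events e
JOIN firsts f USING (user_id)
GROUP BY 1, 2
ORDER BY 1, 2;
"""

conn = redshift_connector.connect(
    host="example-cluster.abc.ap-south-1.redshift.amazonaws.com",
    database="product",
    user="analyst",
    password="...",
)
df = pd.read_sql(RETENTION_SQL, conn)
conn.close()

# Pivot into a cohort triangle suitable for a Power BI or Tableau feed.
cohorts = df.pivot(index="cohort_month", columns="month_n", values="active_users")
print(cohorts)
```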

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Job Title: Database Engineer (8 positions)
Location: Hyderabad, India
Salary: Market rate/negotiable

About us

Creditsafe is the most used business data provider in the world, reducing risk and maximizing opportunities for our 110,000 business customers. Our journey began in Oslo, Norway in 1997, where we had a dream of using the then-revolutionary internet to deliver instant-access company credit reports to small and medium-sized businesses. Creditsafe realized this dream and changed the market for the better for businesses of all sizes. From there, we opened 15 more offices throughout Europe, the USA and Asia. We provide data on more than 300 million companies and provide customer notifications for billions of changes annually.

We are a high-growth company offering the freedom and flexibility of a start-up type culture, due to the continuous innovation and new product development performed, coupled with the stability of being a profitable and growing company! With such a large customer base and breadth of data and analytics technology, you will have real opportunities to help companies survive and thrive in challenging times by reducing business risk and choosing trustworthy customers and suppliers.

Summary

This is your opportunity to develop your career with an exciting, fast-paced and rapidly expanding business, one of the leading providers of business intelligence worldwide. As a Database Engineer with excellent database development skills, you will be responsible for developing and maintaining the databases and scripts that power the company's products and websites, handling large data sets and more than 20 million hits per day. You will work with your team to deliver work on time, in line with the business requirements, and to a high level of quality.

Primary Responsibilities:

· 5+ years' solid commercial experience of Oracle development under a 10g or 11g environment.
· Advanced PL/SQL knowledge required.
· ETL skills - Pentaho would be beneficial.
· Any wider DB experience would be desirable, e.g., Redshift, Aurora, DynamoDB, MariaDB, MongoDB, etc.
· Cloud/AWS experience and an interest in learning new technologies.
· Experience in tuning Oracle queries in large databases.
· Good experience in loading and extracting large data sets (see the sketch after this listing).
· Experience of working with an Oracle database under a bespoke web development environment.
· Analytical and critical thinking skills; agile problem-solving abilities.
· Detail oriented, self-motivated, able to work independently with little or no supervision, and committed to the highest standards of quality for the entire release process.
· Excellent written and verbal communication skills and attention to detail.
· Ability to work in a fast-paced, changing environment and to thrive in a deadline-driven project environment.
· 3+ years of software development experience.

Qualifications and Experience

· Degree in Computer Science or similar.
· Experience with loading data through SSIS.
· Experience working on financial and business intelligence projects or in big data environments.
· A desire to learn new skills and branch into development using a wide range of alternative technologies.

Skills, Knowledge and Abilities

· Write code for new development requirements, as well as provide bug fixing, support and maintenance of existing code.
· Test your code to ensure it functions as per the business requirements, considering the impact of your code on other areas of the solution.
· Provide expert advice on performance tuning within Oracle.
· Perform large-scale imports and extracts of data.
· Assist the business in the collection and documentation of user requirements where needed; provide estimates and work plans.
· Create and maintain technical documentation.
· Follow all company procedures, standards and processes.
· Contribute to architectural design and development, making technically sound development recommendations.
· Provide support to other staff in the department and act as a mentor to less experienced staff, including through code reviews.
· Work as a team player in an agile environment.
· Build release scripts and plans to facilitate the deployment of your code to testing and production environments.
· Take ownership of any issues that occur within your area to ensure an appropriate solution is found.
· Assess opportunities for application and process improvement and share them with team members and/or affected parties.

Company Benefits:

Competitive salary, work from home, pension, medical insurance, cab facility for women, and a dedicated gaming area.
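For the bulk loading and extraction skills above, the standard fast path from Python into Oracle is a batched executemany(). A minimal sketch using the python-oracledb driver follows; the connection details and table are hypothetical.

```python
import oracledb  # python-oracledb, the successor to cx_Oracle

# Connection details and table below are placeholders.
conn = oracledb.connect(user="loader", password="...", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

rows = [
    (101, "ACME LTD", "GB"),
    (102, "GLOBEX SA", "FR"),
    # ... typically tens of thousands of rows per batch
]

# executemany() sends the whole batch in one round trip, which is the
# usual way to make large Oracle loads fast from Python; for very large
# volumes, load in chunks and commit per chunk.
cur.executemany(
    "INSERT INTO companies (company_id, name, country_code) VALUES (:1, :2, :3)",
    rows,
)
conn.commit()
cur.close()
conn.close()
```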

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote


Do you want to make a global impact on patient health? Join Pfizer Digital's Artificial Intelligence, Data, and Advanced Analytics organization (AIDA) to leverage cutting-edge technology for critical business decisions and enhance customer experiences for colleagues, patients, and physicians. Our team is at the forefront of Pfizer's transformation into a digitally driven organization, using data science and AI to change patients' lives. The Data Science Industrialization team leads engineering efforts to advance AI and data science applications from POCs and prototypes to full production.

As a Senior Manager, AI and Analytics Data Engineer, you will be part of a global team responsible for designing, developing, and implementing robust data layers that support data scientists and key advanced analytics/AI/ML business solutions. You will partner with cross-functional data scientists and Digital leaders to ensure efficient and reliable data flow across the organization, and you will lead development of data solutions to support our data science community and drive data-centric decision-making. Join our diverse team in making an impact on patient health through the application of cutting-edge technology and collaboration.

Role Responsibilities

- Lead development of data engineering processes to support data scientists and analytics/AI solutions, ensuring data quality, reliability, and efficiency (a small validation sketch follows this listing)
- As a data engineering tech lead, enforce best practices, standards, and documentation to ensure consistency and scalability, and facilitate related trainings
- Provide strategic and technical input on the AI ecosystem, including platform evolution, vendor scans, and new capability development
- Act as a subject matter expert for data engineering on cross-functional teams in bespoke organizational initiatives, providing thought leadership and execution support for data engineering needs
- Train and guide junior developers on concepts such as data modeling, database architecture, data pipeline management, DataOps and automation, tools, and best practices
- Stay updated on the latest advancements in data engineering technologies and tools and evaluate their applicability for improving our data engineering capabilities
- Direct data engineering research to advance design and development capabilities
- Collaborate with stakeholders to understand data requirements and address them with data solutions
- Partner with the AIDA Data and Platforms teams to enforce best practices for data engineering and data solutions
- Demonstrate a proactive approach to identifying and resolving potential system issues
- Communicate the value of reusable data components to end-user functions (e.g., Commercial, Research and Development, and Global Supply) and promote innovative, scalable data engineering approaches to accelerate data science and AI work

Basic Qualifications

- Bachelor's degree in computer science, information technology, software engineering, or a related field (Data Science, Computer Engineering, Information Systems, Engineering, or a related discipline)
- 7+ years of hands-on experience working with SQL, Python, and object-oriented scripting languages (e.g., Java, C++) in building data pipelines and processes
- Proficiency in SQL programming, including the ability to create and debug stored procedures, functions, and views
- Recognized by peers as an expert in data engineering, with deep expertise in data modeling, data governance, and data pipeline management principles
- In-depth knowledge of modern data engineering frameworks and tools such as Snowflake, Redshift, Spark, Airflow, Hadoop, Kafka, and related technologies
- Experience working in a cloud-based analytics ecosystem (AWS, Snowflake, etc.)
- Familiarity with machine learning and AI technologies and their integration with data engineering pipelines
- Demonstrated experience interfacing with internal and external teams to develop innovative data solutions
- Strong understanding of the Software Development Life Cycle (SDLC) and the data science development lifecycle (CRISP-DM)
- Highly self-motivated to deliver both independently and with strong team collaboration
- Ability to creatively take on new challenges and work outside your comfort zone
- Strong English communication skills (written and verbal)

Preferred Qualifications

- Advanced degree in Data Science, Computer Engineering, Computer Science, Information Systems, or a related discipline (preferred, but not required)
- Experience in software/product engineering
- Experience with data-science-enabling technology, such as Dataiku Data Science Studio, AWS SageMaker or other data science platforms
- Familiarity with containerization technologies like Docker and orchestration platforms like Kubernetes
- Experience working effectively in a distributed remote team environment
- Hands-on experience working in Agile teams, processes, and practices
- Expertise in cloud platforms such as AWS, Azure or GCP
- Proficiency in using version control systems like Git
- Pharma & life science commercial functional knowledge and commercial data literacy
- Ability to work non-traditional hours, interacting with global teams spanning different regions (e.g., North America, Europe, Asia)

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech
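Since the role stresses data quality and reliability, here is a hedged sketch of the kind of fail-fast validation step a pipeline might run before handing data to data scientists. The specific checks and column names are hypothetical examples, not Pfizer's actual guardrails.

```python
import pandas as pd


def validate_extract(df: pd.DataFrame) -> None:
    """Fail fast before handing data to downstream AI/ML consumers.

    The checks and column names here are hypothetical examples of the
    guardrails a pipeline step might enforce.
    """
    errors = []

    if df.empty:
        errors.append("extract is empty")
    if df["patient_id"].isna().any():
        errors.append("null patient_id values found")
    if df.duplicated(subset=["patient_id", "event_date"]).any():
        errors.append("duplicate (patient_id, event_date) rows found")
    if (df["event_date"] > pd.Timestamp.now()).any():
        errors.append("event_date contains future dates")

    if errors:
        raise ValueError("data quality checks failed: " + "; ".join(errors))


# Example usage inside a pipeline task:
# df = read_source_extract(...)   # hypothetical reader
# validate_extract(df)            # raises (and fails the task) on bad data
```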

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Calfus

Calfus is a Silicon Valley-headquartered software engineering and platforms company. The name Calfus finds its roots and ethos in the Olympic motto "Citius, Altius, Fortius - Communiter". Calfus seeks to inspire our team to rise faster, higher, stronger, and work together to build software at speed and scale. Our core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes. We stand for #Equity and #Diversity in our ecosystem and society at large. Connect with us at #Calfus and be a part of our extraordinary journey!

Position Overview

As a Data Engineer - BI Analytics & DWH, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower our organization to make data-driven decisions. You will leverage your expertise in Power BI, Tableau, and ETL processes to create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:

- BI architecture & DWH solution design: develop and design scalable BI analytical and DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data integration: oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data modelling: create and maintain data models that support analytical reporting and data visualization initiatives.
- Database management: use SQL to write complex queries and stored procedures and to manage data transformations using joins and cursors.
- Visualization development: lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance optimization: analyse and optimize BI solutions for performance, scalability, and reliability.
- Data governance: implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team leadership: mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Qualifications:

- Bachelor's degree in computer science, information systems, data science, or a related field.
- 6-15+ years of experience in BI architecture and development, with a strong focus on Power BI and Tableau.
- Proven experience with ETL processes and tools, especially SSIS.
- Strong proficiency in SQL Server, including advanced query writing and database management.
- Exploratory data analysis with Python.
- Familiarity with the CRISP-DM model.
- Ability to work with different data models.
- Familiarity with databases like Snowflake, Postgres, Redshift and MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, and Plotly and/or Dash.
- Strong programming foundation with Python, with the versatility to handle: data manipulation and analysis using Pandas, NumPy and PySpark; data serialization and formats like JSON, CSV, Parquet and Pickle; database interaction to query cloud-based data warehouses; data pipeline and ETL tools like Airflow for orchestrating workflows and managing ETL pipelines; scripting and automation; and cloud services and tools such as S3 and AWS Lambda to manage cloud infrastructure (Azure SDK is a plus). See the sketch after this listing.
- Code quality and management using version control, and collaboration in data engineering projects.
- Ability to interact with REST APIs and perform web scraping tasks is a plus.

Calfus Inc. is an Equal Opportunity Employer. That means we do not discriminate against any applicant for employment, or any employee, because of age, colour, sex, disability, national origin, race, religion, or veteran status. All employment is decided based on qualifications, merit, and business need.
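As a small illustration of the pandas-based manipulation and serialization skills above, here is a sketch of a CSV-to-Parquet cleaning step that lands in S3. The paths, bucket, and columns are invented, and writing straight to an s3:// path assumes the s3fs package is installed and AWS credentials are configured.

```python
import pandas as pd

# File paths, bucket, and column names are illustrative only.
raw = pd.read_csv("daily_sales.csv", parse_dates=["order_date"])

# Light cleaning: normalise column names and drop obviously bad rows.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
clean = raw.dropna(subset=["order_id", "order_date"])
clean["amount"] = clean["amount"].astype("float64")

# Parquet keeps types and compresses well, making downstream warehouse
# loads and BI extracts cheaper than CSV.
clean.to_parquet(
    "s3://example-bucket/curated/daily_sales.parquet",
    index=False,
    compression="snappy",
)
```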

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Summary:

Analyzes the data needs of the enterprise to build, optimize and maintain conceptual ML/analytics models. The data scientist provides expertise in modeling and statistical approaches ranging from regression methods, decision trees, deep learning, NLP techniques and uplift modeling to statistical modeling such as multivariate techniques.

Roles & Responsibilities:

- Design the ML and MLOps stack, considering the various trade-offs.
- Statistical analysis and fundamentals.
- MLOps framework design and implementation.
- Model evaluation best practices; train and retrain systems when necessary (see the sketch after this listing).
- Extend existing ML libraries and frameworks; keep abreast of developments in the field.
- Act as an SME and tech lead for any data engineering question, manage data scientists, and influence DS development across the company.
- Promote services, contribute to the identification of innovative initiatives within the Group, and share information on new technologies in dedicated internal communities.
- Ensure compliance with policies related to data management and data protection.

Preferred Experience:

- Strong experience (3+ years) building statistical models and applying machine learning techniques.
- Experience (3+ years) with big data technologies such as Hadoop, Spark, and Airflow/Databricks.
- Proven experience (3+ years) solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks.
- Proven experience (3+ years) taking innovations from exploration to production; these may include containerization (e.g., Docker/Kubernetes), big data (Hadoop, Spark) and MLOps platforms.
- Deep understanding of end-to-end software development in a team, and a track record of shipping software on time.
- Ability to ensure high-quality data and to understand how data generated out of experimental design can produce actionable, trustworthy conclusions.
- Proficiency with SQL and NoSQL databases, data warehousing concepts, and cloud-based analytics database administration (e.g., Snowflake, Databricks or Redshift).
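To make the model-evaluation point above concrete, here is a minimal scikit-learn sketch using cross-validation on synthetic data; it is a generic stand-in for the kind of evaluation discipline the role describes, not a real modeling task.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data; a real project would pull features from the warehouse.
X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)

# Keeping scaling inside the pipeline avoids leaking test-fold statistics
# into training, one of the evaluation best practices the role calls for.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))

scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"ROC AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```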

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Join us as a Data Engineering Lead

- This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
- You'll be simplifying the bank by developing innovative data-driven solutions, aiming to be commercially successful through insight, and keeping our customers' and the bank's data safe and secure
- Participating actively in the data engineering community, you'll deliver opportunities to support our strategic direction while building your network across the bank
- We're recruiting for multiple roles across a range of levels, up to and including experienced managers

What you'll do

We'll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You'll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers.

We'll also expect you to be:

- Working with Data Scientists and Analytics Labs to translate analytical model code into well-tested, production-ready code
- Helping to define common coding standards and model performance monitoring best practices
- Owning and delivering the automation of data engineering pipelines through the removal of manual stages
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
- Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists
- Leading and developing solutions for streaming data ingestion and transformations in line with our streaming strategy

The skills you'll need

To be successful in this role, you'll need to be an expert-level programmer and data engineer with a qualification in Computer Science or Software Engineering. You'll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large-scale data.

We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure.

You'll also demonstrate:

- Knowledge of core computer science concepts such as common data structures and algorithms, profiling and optimisation
- An understanding of machine learning, information retrieval or recommendation systems
- Good working knowledge of CI/CD tools
- Knowledge of programming languages used in data engineering, such as Python or PySpark, SQL, Java, and Scala
- An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow
- Knowledge of messaging, event or streaming technology such as Apache Kafka
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMSs, ETL pipelines, Python, Hadoop and SQL

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We're Hiring: Senior Associate - Field Force Operations at Chryselys

Location: Hyderabad
Job Type: Full-time

About Us

Chryselys is a pharma analytics & business consulting company that delivers data-driven insights, leveraging AI-powered, cloud-native platforms to achieve high-impact transformations. We specialize in digital technologies and advanced data science techniques that provide strategic and operational insights.

Who we are:

- People - Our team of industry veterans, advisors and senior strategists have diverse backgrounds and have worked at top-tier companies.
- Quality - Our goal is to deliver the value of a big-five consulting company without the big-five cost.
- Technology - Our solutions are business-centric and built on cloud-native technologies.

Role Overview

As a Field Force Operations Senior Associate at Chryselys, you will leverage your expertise in commercial model design, sales force sizing, territory alignment, and deployment to optimize field force operations and processes. You will work closely with cross-functional teams, including client stakeholders and analytics experts, to define execution KPIs, maximize sales impact, and deliver actionable insights through advanced reporting and dashboards. Your role will also involve segmentation and targeting, incentive compensation processes, and planning for call activities and non-personal promotions. With hands-on experience in tools like Qlik, Power BI, and Tableau, along with technologies such as SQL, you will ensure impactful storytelling and effective stakeholder management while supporting clients across the U.S. and Europe.

Key Responsibilities:

- Capabilities and experience in field force operations and processes related to commercial model design and structure, sales force sizing and optimization, and territory alignment and deployment.
- Good understanding of commercial operations and analytics as a domain.
- Expertise with SF/FF datasets for creating dashboards and reports for multiple user personas.
- Ability to define FF execution and measurement KPIs to maximize sales impact.
- Understanding and expertise in call activity planning and non-personal promotions.
- Good knowledge of segmentation & targeting and incentive compensation processes.
- Hands-on experience with tools like Qlik/Power BI/Tableau and technologies like Python/SQL.
- Stakeholder management abilities and storytelling skills.
- Experience working with pharma clients across the US and Europe.

What You Bring:

- Education: Bachelor's or master's degree in data science, statistics, computer science, engineering, or a related field with a strong academic record.
- Experience: 2-5 years of experience in field force operations, particularly in the pharmaceutical or healthcare industry, working with key datasets.
- Skills:
  - Strong experience with SQL and cloud-based data processing environments such as AWS (Redshift, Athena, S3).
  - Demonstrated ability to build data visualizations and communicate insights through tools like Power BI, Tableau, Qlik, QuickSight, Javelin or similar.
  - Strong analytical skills, with experience in analogue analysis.
  - Ability to manage multiple projects, prioritize tasks, and meet deadlines in a fast-paced environment.
  - Excellent communication and presentation skills, with the ability to explain complex data science concepts to non-technical stakeholders.
  - A strong problem-solving mindset, with the ability to adapt and innovate in a dynamic consulting environment.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

Remote

Linkedin logo

🚨 We’re Hiring | Senior Data Analyst 🚨

📍 Location: Trivandrum / Kochi / Remote
🕒 Notice Period: Immediate Joiners Only
💰 Budget: Up to ₹19 LPA
📊 Experience: 5+ Years
🌐 Preference: Keralites only

Are you a data-driven problem solver with a passion for analytics and business intelligence? We’re on the lookout for a Senior Data Analyst to join our growing Data & Analytics team!

✅ Must-Have Skills
🔹 SQL, Power BI, and Python
🔹 Experience with Amazon Athena and relational databases like SQL Server, Redshift, or Snowflake
🔹 Knowledge of data modeling, ETL, and data architecture
🔹 Strong data storytelling and visualization capabilities
🔹 Excellent communication and stakeholder management skills

🧠 Key Responsibilities
- Analyze large and complex datasets to drive actionable insights
- Build compelling dashboards and reports in Power BI
- Collaborate with business and technical teams
- Maintain data quality, consistency, and accuracy
- Mentor and guide a team of data engineers

🔎 We are giving preference to candidates from Kerala who are ready to join immediately and thrive in a fast-paced, collaborative environment.

#Hiring #SeniorDataAnalyst #PowerBI #SQL #Python #KeralitesPreferred #KeralaJobs #TrivandrumJobs #KochiJobs #ImmediateJoiners #RemoteJobs #DataAnalytics #BusinessIntelligence

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

This role is for one of Weekday's clients.

Min Experience: 8 years
Location: Bangalore, Mumbai
Job Type: Full-time

We are seeking a highly experienced and motivated Lead Data Engineer to join our data engineering team. This role is perfect for someone with 8-10 years of hands-on experience in designing and building scalable data infrastructure, data pipelines, and high-performance data platforms. You will lead a team of engineers, set data engineering standards, and work cross-functionally with data scientists, analysts, and software engineers to enable a data-driven culture within the organization.

Requirements

Key Responsibilities:
- Technical Leadership: Lead the design and development of robust, scalable, and high-performance data architectures, including batch and real-time data pipelines using modern technologies.
- Data Pipeline Development: Architect, implement, and maintain complex ETL/ELT workflows using tools like Apache Airflow, Spark, Kafka, or similar.
- Data Warehouse Management: Design and maintain cloud-based data warehouses and data lakes (e.g., Snowflake, Redshift, BigQuery, Delta Lake), ensuring optimized storage and query performance.
- Data Quality and Governance: Implement data validation, monitoring, and governance processes to ensure data accuracy, completeness, and security across all platforms.
- Collaboration: Work closely with stakeholders, including business analysts, data scientists, and application developers, to understand data needs and deliver effective solutions.
- Mentorship and Team Management: Guide and mentor junior and mid-level data engineers, fostering best practices in code, architecture, and agile delivery.
- Automation and CI/CD: Develop and manage data pipeline deployment processes using DevOps and CI/CD principles.

Required Skills & Qualifications:
- 8-10 years of proven experience in data engineering or a related field.
- Strong programming skills in Python, Scala, or Java.
- Expertise in building scalable and fault-tolerant ETL/ELT processes using frameworks such as Apache Spark, Kafka, Airflow, or similar.
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and tools like S3, Redshift, Snowflake, BigQuery, Glue, EMR, or Databricks.
- In-depth understanding of relational and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.).
- Strong SQL skills with the ability to write complex and optimized queries.
- Familiarity with data modeling, data warehousing concepts, and OLAP/OLTP systems.
- Experience in deploying data services using containerization (Docker, Kubernetes) and CI/CD tools like Jenkins, GitHub Actions, or similar.
- Excellent communication skills with a collaborative and proactive attitude.

Preferred Qualifications:
- Experience working in fast-paced, agile environments or startups.
- Exposure to machine learning pipelines, MLOps, or real-time analytics.
- Familiarity with data governance frameworks and data privacy regulations (GDPR, CCPA).

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

We are seeking a highly experienced Senior Analyst to help guide our global, regional, and functional commercial policy implementation, reporting & governance projects. The successful candidate will contribute by building metrics and analyzing processes, workflows, and systems with the objective of identifying opportunities for either improvement or automation. Our ideal candidate is comfortable working with all levels of management to gain an in-depth understanding of our strategy and improve customer experience. This role requires close collaboration with product, segment partners, product marketing, customer to cash, sales, marketing, technology, and finance areas. This position resides in the Commercial Excellence organization and reports to the Manager of Commercial Policy Reporting & Governance.

About The Role

In this role as a Senior Analyst, Commercial Policy Reporting & Governance, you will:
- Improve, execute, and effectively communicate significant analyses that identify meaningful trends and opportunities across the business.
- Participate in regular meetings with stakeholders & management, assessing and addressing issues to identify and implement improvements toward efficient operations.
- Provide strong and timely business analytic support to business partners and various organizational stakeholders.
- Develop actionable road maps for improving workflows and processes.
- Work effectively with partners across the business to develop processes for capturing project activity, creating metrics-driven dashboards for specific use cases and behaviors, and evaluating the data for process improvement recommendations.
- Collaborate with Project Leads, Managers, and Business partners to determine schedules and project timelines, ensuring alignment across all areas of the business.
- Drive commercial strategy and policy alignment with fast-changing attributes, while managing reporting, tracking, and governance best practices.
- Identify, assess, manage, and communicate risks while laying out mitigation plans and course corrections where appropriate.
- Provide insightful diagnostics and actionable insights to the leadership team in a proactive manner by spotting trends, questioning data, and asking questions to understand underlying drivers.
- Proactively identify trends for future governance & reporting needs while presenting ideas to CE Leadership for new areas of opportunity to drive value.
- Prepare, analyze, and summarize various weekly, monthly, and periodic operational results for use by various key stakeholders, creating reports, specifications, instructions, and flowcharts.
- Conduct the full lifecycle of analytics projects, from project requirements documentation to design and execution, including pulling, manipulating, and exporting data.

About You

You’re a fit for the role of Senior Analyst, Commercial Policy Reporting & Governance, if your background includes:
- Bachelor’s degree required, preferably in Computer Science, Mathematics, Business Management, or Economics.
- 4 to 6+ years of professional experience in a similar role.
- The role requires the candidate to work from 2 pm - 11 pm IST.
- Willingness to work in hybrid mode (work from office twice a week).
- Proven project management skills related to planning and overseeing projects from initial ideation through to completion.
- Proven ability to take complex and disparate data sets and create streamlined and efficient data lakes with a connected and routinized cadence.
- Advanced-level skills in the following systems: Power BI, Snowflake, Redshift, Salesforce.com, EDW, Excel, MS PowerPoint, and Alteryx or similar middleware data transformation tools.
- Familiarity with contract lifecycle management tools like Conga CLM, HighQ CLM, etc.
- Ability to quickly draw insights into trends in data and make recommendations to drive productivity and efficiency.
- Exceptional verbal, written, and visual communication skills.
- Experience managing multiple projects simultaneously within a matrix organization, adhering to deadlines in a fast-paced environment.
- Ability to deploy influencing techniques to drive cross-functional alignment and change across a broad audience.
- Flexibility with working hours to support the ever-changing demands of the business.

What’s in it For You?
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency.

Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them.

Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law.

More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Skill: Data Engineer
Role: T3, T2

Key Responsibilities - Data Engineer

Must have 5+ years of experience in the skills mentioned below.

Must Have: Big Data concepts, Python (core Python, able to write code), SQL, Shell Scripting, AWS S3

Good to Have: Event-driven/AWS SQS, Microservices, API Development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

Location: Hyderabad, Telangana, India | Category: Accounting / Finance Careers | Job Id: JREQ188357 | Job Type: Full time, Hybrid

We are seeking a highly experienced Senior Analyst to help guide our global, regional, and functional commercial policy implementation, reporting & governance projects. The successful candidate will contribute by building metrics and analyzing processes, workflows, and systems with the objective of identifying opportunities for either improvement or automation. Our ideal candidate is comfortable working with all levels of management to gain an in-depth understanding of our strategy and improve customer experience. This role requires close collaboration with product, segment partners, product marketing, customer to cash, sales, marketing, technology, and finance areas. This position resides in the Commercial Excellence organization and reports to the Manager of Commercial Policy Reporting & Governance.

About the Role

In this role as a Senior Analyst, Commercial Policy Reporting & Governance, you will:
- Improve, execute, and effectively communicate significant analyses that identify meaningful trends and opportunities across the business.
- Participate in regular meetings with stakeholders & management, assessing and addressing issues to identify and implement improvements toward efficient operations.
- Provide strong and timely business analytic support to business partners and various organizational stakeholders.
- Develop actionable road maps for improving workflows and processes.
- Work effectively with partners across the business to develop processes for capturing project activity, creating metrics-driven dashboards for specific use cases and behaviors, and evaluating the data for process improvement recommendations.
- Collaborate with Project Leads, Managers, and Business partners to determine schedules and project timelines, ensuring alignment across all areas of the business.
- Drive commercial strategy and policy alignment with fast-changing attributes, while managing reporting, tracking, and governance best practices.
- Identify, assess, manage, and communicate risks while laying out mitigation plans and course corrections where appropriate.
- Provide insightful diagnostics and actionable insights to the leadership team in a proactive manner by spotting trends, questioning data, and asking questions to understand underlying drivers.
- Proactively identify trends for future governance & reporting needs while presenting ideas to CE Leadership for new areas of opportunity to drive value.
- Prepare, analyze, and summarize various weekly, monthly, and periodic operational results for use by various key stakeholders, creating reports, specifications, instructions, and flowcharts.
- Conduct the full lifecycle of analytics projects, from project requirements documentation to design and execution, including pulling, manipulating, and exporting data.

About You

You’re a fit for the role of Senior Analyst, Commercial Policy Reporting & Governance, if your background includes:
- Bachelor’s degree required, preferably in Computer Science, Mathematics, Business Management, or Economics.
- 4 to 6+ years of professional experience in a similar role.
- The role requires the candidate to work from 2 pm - 11 pm IST.
- Willingness to work in hybrid mode (work from office twice a week).
- Proven project management skills related to planning and overseeing projects from initial ideation through to completion.
- Proven ability to take complex and disparate data sets and create streamlined and efficient data lakes with a connected and routinized cadence.
- Advanced-level skills in the following systems: Power BI, Snowflake, Redshift, Salesforce.com, EDW, Excel, MS PowerPoint, and Alteryx or similar middleware data transformation tools.
- Familiarity with contract lifecycle management tools like Conga CLM, HighQ CLM, etc.
- Ability to quickly draw insights into trends in data and make recommendations to drive productivity and efficiency.
- Exceptional verbal, written, and visual communication skills.
- Experience managing multiple projects simultaneously within a matrix organization, adhering to deadlines in a fast-paced environment.
- Ability to deploy influencing techniques to drive cross-functional alignment and change across a broad audience.
- Flexibility with working hours to support the ever-changing demands of the business.

What’s in it For You?
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency.

Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them.

Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Delhi, Delhi

On-site

Indeed logo

Full time | Work From Office

This Position is Currently Open
Department / Category: DEVELOPER
Listed on Jun 03, 2025
Work Location: NEW DELHI

Job Description: Databricks Developer

7+ years of relevant experience, including more than 3 years in data integration, pipeline development, and data warehousing, with a strong focus on AWS Databricks.

Job Responsibilities:
- Administer, manage, and optimize the Databricks environment to ensure efficient data processing and pipeline development
- Perform advanced troubleshooting, query optimization, and performance tuning in a Databricks environment
- Collaborate with development teams to guide, optimize, and refine data solutions within the Databricks ecosystem
- Ensure high performance in data handling and processing, including the optimization of Databricks jobs and clusters
- Engage with and support business teams to deliver data and analytics projects effectively
- Manage source control systems and utilize Jenkins for continuous integration
- Actively participate in the entire software development lifecycle, focusing on data integrity and efficiency within Databricks

Technical Skills:
- Proficiency in Databricks platform management and optimization
- Strong experience in AWS Cloud, particularly in data engineering and administration, with expertise in Apache Spark, S3, Athena, Glue, Kafka, Lambda, Redshift, and RDS
- Proven experience in data engineering performance tuning and analytical understanding in business and program contexts
- Solid experience in Python development, specifically in PySpark within the AWS Cloud environment, including experience with Terraform
- Knowledge of databases (Oracle, SQL Server, PostgreSQL, Redshift, MySQL, or similar) and advanced database querying
- Experience with source control systems (Git, Bitbucket) and Jenkins for build and continuous integration
- Understanding of continuous deployment (CI/CD) processes
- Experience with Airflow and additional Apache Spark knowledge is advantageous
- Exposure to ETL tools, including Informatica

Required Skills for the Databricks Developer Job: AWS Databricks, Databases, CI/CD, Source control systems

Our Hiring Process:
  1. Screening (HR Round)
  2. Technical Round 1
  3. Technical Round 2
  4. Final HR Round

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Location: Gurugram, Haryana, India | Job Id: GGN00002056 | Category: Information Technology | Job Type: Full-Time | Posted Date: 06/04/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description

United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Job overview and responsibilities

United Airlines is seeking talented people to join the Data Engineering Operations team. Key responsibilities include configuring and managing infrastructure, implementing continuous integration/continuous deployment (CI/CD) pipelines, and optimizing system performance. You will work to improve efficiency, enhance scalability, and ensure the reliability of systems through monitoring and proactive measures. We are seeking creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights. Individuals who have a natural curiosity and desire to solve problems are encouraged to apply. Collaboration, scripting, and proficiency in tools for version control and automation are critical skills for success in this role.

- Translate product strategy and requirements into suitable, maintainable, and scalable solution designs according to existing architecture guardrails
- Collaborate with development and operations teams to understand project requirements and design effective DevOps solutions
- Implement and maintain CI/CD pipelines for automated software builds, testing, and deployment
- Manage and optimize cloud-based infrastructure to ensure scalability, security, and performance
- Implement and maintain monitoring and alerting systems for proactive issue resolution
- Work closely with cross-functional teams to troubleshoot and resolve infrastructure-related issues
- Automate repetitive tasks and processes to improve efficiency and reduce manual intervention

Key Responsibilities:
- Design, deploy, and maintain cloud infrastructure on AWS
- Set up and manage Kubernetes clusters for container orchestration
- Design, implement, and manage scalable, secure, and highly available AWS infrastructure using Terraform
- Develop and manage Infrastructure as Code (IaC) modules and reusable components
- Collaborate with developers, architects, and other DevOps engineers to design cloud-native applications and deployment strategies
- Manage and optimize CI/CD pipelines using tools like GitHub Actions, GitLab CI, Jenkins, or similar
- Manage and optimize the Databricks platform
- Monitor infrastructure health and performance using AWS CloudWatch, Prometheus, Grafana, etc.
- Ensure cloud security best practices, including IAM policies, VPC configurations, data encryption, and secrets management
- Create and manage networking infrastructure such as VPCs, subnets, security groups, route tables, NAT gateways, etc.
- Handle deployment and configuration of services such as EC2, RDS, Glue, S3, ECS/EKS, Lambda, API Gateway, Kinesis, MWAA, DynamoDB, CloudFront, Route 53, SQS, SNS, Athena, and ELB/ALB
- Maintain logging, alerting, and monitoring systems to ensure reliability and performance

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications

What’s needed to succeed (Minimum Qualifications):
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of IT experience, including experience as a DevOps Engineer or in a similar role
- Experience with AWS infrastructure design, implementation, and support
- Proficiency in scripting languages (e.g., Bash, Python) and configuration management tools
- Experience with database systems like PostgreSQL, Redshift, and MySQL
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master’s in computer science or a related STEM field
- Strong experience with continuous integration & delivery using Agile methodologies
- DevOps experience in the transportation/airline industry
- Experience with logging and monitoring tools (e.g., Dynatrace, Datadog)
- Strong problem-solving and communication skills
- Experience with Harness tools
- Experience with microservices architecture and serverless applications
- Knowledge of database technologies (PostgreSQL, Redshift, MySQL)
- Knowledge of security best practices in a DevOps environment
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer)
- Databricks Platform certifications

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

Basic qualifications:
- 3+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines

Amazon’s Consumer Payments organization is seeking a highly quantitative, experienced Data Engineer to drive growth through analytics, automation of data pipelines, and enhancement of self-serve experiences. You will succeed in this role if you are an organized self-starter who can learn new technologies quickly and excel in a fast-paced environment. In this position, you will be a key contributor and sparring partner, developing analytics and insights that global executive management teams and business leaders will use to define global strategies and deep dive businesses. You will be part of the team focused on acquiring new merchants from around the world for payments around the world. The position is based in India but will interact with global leaders and teams in Europe, Japan, the US, and other regions.

You should be highly analytical, resourceful, customer focused, team oriented, and able to work independently under time constraints to meet deadlines. You will be comfortable thinking big and diving deep. A proven track record of taking on end-to-end ownership and successfully delivering results in a fast-paced, dynamic business environment is strongly preferred.

Responsibilities include but are not limited to:
- Design, develop, implement, test, and operate large-scale, high-volume, high-performance data structures for analytics and reporting.
- Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, AWS Redshift, and OLAP technologies; model data and metadata for ad hoc and pre-built reporting.
- Work with product tech teams to build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Interface with business customers, gathering requirements and delivering complete reporting solutions.
- Collaborate with Analysts, Business Intelligence Engineers, and Product Managers to implement algorithms that exploit rich data sets for statistical analysis and machine learning.
- Participate in strategic & tactical planning discussions, including annual budget processes.
- Communicate effectively with product, business, and tech teams, and other data teams.

- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description

Amazon's Last Mile Analytics & Quality (LMAQ) Maps team is building data-driven solutions to power the Last Mile delivery network that will serve hundreds of millions of customers worldwide. The Analytics team develops systems that model and optimize delivery operations through complex navigation and mapping datasets. The team specializes in processing and analyzing large-scale map and routing data across global markets. We work cross-functionally to seamlessly analyze and enhance last mile delivery network efficiency and service quality through sophisticated data processing pipelines.

Our team is seeking a passionate and data-driven Business Analyst with experience in handling large-scale datasets to lead our efforts in enhancing driver experience and operational efficiency through advanced business analytics. This role is inherently cross-functional: you will work closely with engineering, operations, product teams, and other stakeholders on last mile delivery challenges. Through close collaboration, and by conducting analysis using statistical techniques and data visualizations, you will drive these challenges to resolution.

The ideal candidate has a background in business analytics, experience with large-scale data processing, an understanding of logistics, project management skills, and a strong customer-centric approach to drive improvements in last-mile delivery. This job requires strong communication skills and the ability to work independently in an evolving environment. Passion and drive for customer service is a must.

Key job responsibilities
- Analyze complex business problems and develop data-driven solutions using SQL, Python, or R
- Handle and analyze large-scale navigation datasets, map datasets, and map attributes
- Run and automate ETL jobs for processing and integrating large-scale datasets
- Implement quality control measures for navigation and mapping data
- Develop dashboards and reports using tools like Tableau/Power BI to track key performance metrics
- Perform statistical analysis and create predictive models
- Design and implement data quality checks and validation processes
- Collaborate with stakeholders to identify business needs and opportunities
- Lead process improvement initiatives
- Translate business requirements into technical specifications
- Present findings and recommendations to leadership

Basic Qualifications
- Bachelor's degree or equivalent
- Experience defining requirements and using data and metrics to draw business insights
- Experience with SQL or ETL
- 1+ years of Excel or Tableau (data manipulation, macros, charts, and pivot tables) experience

Preferred Qualifications
- Experience with Amazon Redshift and other AWS technologies
- Experience using databases with large-scale data sets
- Experience with reporting and data visualization tools such as QuickSight, Tableau, Power BI, or other BI packages
- Experience writing business requirements documents, functional specifications, and use cases

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ - H84
Job ID: A2998722

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description

“When you attract people who have the DNA of pioneers and the DNA of explorers, you build a company of like-minded people who want to invent. And that’s what they think about when they get up in the morning: how are we going to work backwards from customers and build a great service or a great product.” – Jeff Bezos

Amazon.com’s success is built on a foundation of customer obsession. Have you ever thought about what it takes to successfully deliver millions of packages to Amazon customers seamlessly every day, like clockwork? To make that happen, behind those millions of packages, billions of decisions get made by machines and humans. What is the accuracy of the customer-provided address? Do we know the exact location of the address on the map? Is there a safe place? Can we make an unattended delivery? Would a signature be required? Is the address a commercial property? Do we know the open business hours of the address? What if the customer is not home? Is there an alternate delivery address? Does the customer have any special preference? What other addresses also have packages to be delivered on the same day? Are we optimizing the delivery associate’s route? Does the delivery associate know the locality well enough? Is there an access code to get inside the building? And the list simply goes on. At the core of all of it lies the quality of the underlying data that can help make those decisions in time.

The person in this role will be a strong influencer who will ensure goal alignment with Technology, Operations, and Finance teams. This role will serve as the face of the organization to global stakeholders. This position requires a results-oriented, high-energy, dynamic individual with both the stamina and mental quickness to work and thrive in a fast-paced, high-growth global organization. Excellent communication skills and the executive presence to get in front of VPs and SVPs across Amazon will be imperative.

Key Strategic Objectives:

Amazon is seeking an experienced leader to own the vision for quality improvement through global address management programs. As a Business Intelligence Engineer on the Amazon last mile quality team, you will be responsible for shaping the strategy and direction of customer-facing products that are core to the customer experience. As a key member of the last mile leadership team, you will continually raise the bar on both quality and performance. You will bring innovation, a strategic perspective, a passionate voice, and an ability to prioritize and execute on a fast-moving set of priorities, competitive pressures, and operational initiatives. You will partner closely with product and technology teams to define and build innovative and delightful experiences for customers. You must be highly analytical, able to work extremely effectively in a matrix organization, and have the ability to break complex problems down into steps that drive product development at Amazon speed. You will set the tempo for defect reduction through continuous improvement and drive accountability across multiple business units in order to deliver large-scale, high-visibility, high-impact projects. You will lead by example and be just as passionate about operational performance and predictability as about all other aspects of customer experience.

The successful candidate will be able to:
- Effectively manage customer expectations and resolve conflicts that balance client and company needs.
- Develop processes to effectively maintain and disseminate project information to stakeholders.
- Be successful in a delivery-focused environment, determining the right processes to make the team successful. This opportunity requires excellent technical, problem-solving, and communication skills. The candidate is not just a policy maker/spokesperson but drives to get things done.
- Possess superior analytical abilities and judgment. Use quantitative and qualitative data to prioritize and influence, show creativity, experimentation, and innovation, and drive projects with urgency in this fast-paced environment.
- Partner with key stakeholders to develop the vision and strategy for customer experience on our platforms. Influence product roadmaps based on this strategy along with your teams.
- Support the scalable growth of the company by developing and enabling the success of the Operations leadership team.
- Serve as a role model for Amazon Leadership Principles inside and outside the organization.
- Actively seek to implement and distribute best practices across the operation.

Basic Qualifications
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience writing complex SQL queries
- Experience with statistical analysis packages such as R, SAS, and Matlab
- Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - Amazon Dev Center India - Hyderabad
Job ID: A2998446

Posted 2 weeks ago

Apply

Exploring Redshift Jobs in India

The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Amazon Redshift, a data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with Redshift expertise can find opportunities across a wide range of industries throughout the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.

Career Path

In the field of Redshift, a typical career path may include roles such as:

  1. Junior Developer
  2. Data Engineer
  3. Senior Data Engineer
  4. Tech Lead
  5. Data Architect

Related Skills

Apart from expertise in Redshift, proficiency in the following skills can be beneficial:

  • SQL
  • ETL Tools
  • Data Modeling
  • Cloud Computing (AWS)
  • Python/R Programming

Interview Questions

  • What is Amazon Redshift and how does it differ from traditional databases? (basic)
  • How does data distribution work in Amazon Redshift? (medium)
  • Explain the difference between SORTKEY and DISTKEY in Redshift. (medium) (see the first sketch after this list)
  • How do you optimize query performance in Amazon Redshift? (advanced)
  • What is the COPY command in Redshift used for? (basic) (also covered in the first sketch after this list)
  • How do you handle large data sets in Redshift? (medium)
  • Explain the concept of Redshift Spectrum. (advanced) (see the second sketch after this list)
  • What is the difference between Redshift and Redshift Spectrum? (medium)
  • How do you monitor and manage Redshift clusters? (advanced)
  • Can you describe the architecture of Amazon Redshift? (medium)
  • What are the best practices for data loading in Redshift? (medium)
  • How do you handle concurrency in Redshift? (advanced)
  • Explain the concept of vacuuming in Redshift. (basic) (see the third sketch after this list)
  • What are Redshift's limitations and how do you work around them? (advanced)
  • How do you scale Redshift clusters for performance? (medium)
  • What are the different node types available in Amazon Redshift? (basic)
  • How do you secure data in Amazon Redshift? (medium)
  • Explain the concept of Redshift Workload Management (WLM). (advanced) (see the fourth sketch after this list)
  • What are the benefits of using Redshift over traditional data warehouses? (basic)
  • How do you optimize storage in Amazon Redshift? (medium)
  • How do you troubleshoot performance issues in Amazon Redshift? (advanced)
  • Can you explain the concept of columnar storage in Redshift? (basic) (see the fifth sketch after this list)
  • How do you automate tasks in Redshift? (medium)
  • What are the different types of Redshift nodes and their use cases? (basic)
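
To ground a few of these questions, here are short, hedged SQL sketches. First, the SORTKEY/DISTKEY and COPY questions: a minimal example of a table whose distribution and sort keys suit a join-and-filter workload, followed by a parallel bulk load from S3. The table, bucket, and IAM role names are placeholders, not references to any real system.

```sql
-- Distribute rows by customer_id so joins on that key stay node-local,
-- and sort by event_date so range filters can skip disk blocks.
CREATE TABLE sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    event_date  DATE,
    amount      DECIMAL(12, 2)
)
DISTKEY (customer_id)
SORTKEY (event_date);

-- COPY is Redshift's bulk-load path: it reads files from S3 in parallel
-- across slices, which is far faster than row-by-row INSERT statements.
COPY sales
FROM 's3://example-bucket/sales/2025/'                        -- placeholder prefix
IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleRedshiftRole' -- placeholder role
FORMAT AS CSV
IGNOREHEADER 1;
```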
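
Second, for the Redshift Spectrum question: a sketch of querying data in place on S3 through an external schema backed by the AWS Glue Data Catalog. The catalog database, role, and bucket are again hypothetical.

```sql
-- Spectrum queries data that stays in S3; nothing is loaded into the cluster.
CREATE EXTERNAL SCHEMA spectrum_demo
FROM DATA CATALOG
DATABASE 'demo_catalog_db'                                     -- placeholder Glue database
IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleSpectrumRole'; -- placeholder role

CREATE EXTERNAL TABLE spectrum_demo.clickstream (
    user_id    BIGINT,
    url        VARCHAR(2048),
    event_time TIMESTAMP
)
STORED AS PARQUET
LOCATION 's3://example-bucket/clickstream/';                   -- placeholder prefix

-- External tables can be joined with local Redshift tables in one query.
SELECT user_id, COUNT(*) AS clicks
FROM spectrum_demo.clickstream
GROUP BY user_id
ORDER BY clicks DESC
LIMIT 10;
```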
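
Third, for the vacuuming question: a sketch of routine table maintenance. SVV_TABLE_INFO is a standard Redshift system view; the sales table is the placeholder from the first sketch.

```sql
-- Deletes and updates leave dead rows behind; VACUUM reclaims the space
-- and restores sort order, while ANALYZE refreshes planner statistics.
VACUUM FULL sales;
ANALYZE sales;

-- SVV_TABLE_INFO shows which tables need attention (unsorted %, row counts).
SELECT "table", unsorted, tbl_rows
FROM svv_table_info
ORDER BY unsorted DESC;
```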
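
Fourth, for the Workload Management (WLM) question: a sketch that inspects queue behavior through the STV_WLM_QUERY_STATE system table, which reports queries currently queued or executing per service class (WLM queue).

```sql
-- Queries sitting in a queued state signal that a WLM queue's slot count
-- or memory allocation may need tuning for the current concurrency level.
SELECT query, service_class, state, queue_time, exec_time
FROM stv_wlm_query_state
ORDER BY queue_time DESC;
```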
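
Finally, for the columnar storage question: a sketch showing per-column compression encodings, which the columnar layout makes possible because each column is stored, and therefore compressed, independently. The table and encoding choices are illustrative only.

```sql
-- Each column lives in its own blocks, so each can get the encoding that
-- suits its data: AZ64 for numeric/temporal columns, LZO for free text.
CREATE TABLE events (
    event_id   BIGINT       ENCODE az64,
    event_time TIMESTAMP    ENCODE az64,
    payload    VARCHAR(256) ENCODE lzo
);

-- On a populated table, ANALYZE COMPRESSION recommends encodings
-- based on a sample of the stored data.
ANALYZE COMPRESSION events;
```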

Conclusion

As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!
