5.0 years
0 Lacs
Chennai
On-site
Role: AWS Data Engineer
Work timing: 2 pm to 11 pm
Work Mode: Hybrid
Work Location: Chennai & Hyderabad
Primary Skills: Data Engineering & AWS

Detailed JD
We are seeking a developer with strong experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources, who can:
- Optimize data models for performance and efficiency
- Write SQL queries to support data analysis and reporting
- Design, implement, and maintain the data architecture for all AWS data services
- Work with stakeholders to identify business needs and requirements for data-related projects
- Design and implement ETL processes to load data into the data warehouse

Responsibility
We are seeking a highly skilled Senior AWS Developer to join our team as a Senior Consultant. With a primary focus on Pega and SQL, the ideal candidate will also have experience with Agile methodologies. As a Senior AWS Developer, you will be responsible for optimizing data models for performance and efficiency, writing SQL queries to support data analysis and reporting, and designing and implementing ETL processes to load data into the data warehouse. You will also work with stakeholders to identify business needs and requirements for data-related projects and design and maintain the data architecture for all AWS data services. The ideal candidate will have at least 5 years of work experience and be comfortable working in a hybrid setting.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa.
We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
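The serverless pipeline described above (Glue, Lambda, DMS feeding Redshift) usually includes handlers that clean records before loading. A minimal sketch of what such a Lambda handler might look like; the event shape, field names, and cleaning rules here are invented for illustration, and a real event's structure depends on the trigger (S3, Kinesis, etc.):

```python
import json

def handler(event, context):
    """Hypothetical Lambda: normalize raw records before a warehouse load.

    Assumes an event carrying a list of dicts under "records"; this shape
    is illustrative only, not an actual AWS event format.
    """
    cleaned = []
    for rec in event.get("records", []):
        # Drop rows missing a primary key; trim and lowercase string fields.
        if rec.get("id") is None:
            continue
        cleaned.append({k: v.strip().lower() if isinstance(v, str) else v
                        for k, v in rec.items()})
    return {"statusCode": 200, "body": json.dumps({"loaded": len(cleaned)})}

# Local invocation with made-up records: one valid row, one missing its key.
result = handler({"records": [{"id": 1, "city": " Chennai "}, {"city": "x"}]}, None)
print(result["body"])  # {"loaded": 1}
```

In a real deployment this function would be wired to a trigger and write to Redshift via a driver or the COPY pattern rather than returning a count.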
Posted 1 week ago
0 years
0 Lacs
India
Remote
Job Title: Data Governance Specialist

Job Description:
A Data Governance Specialist will be responsible for several technical tasks, including developing and maintaining Collibra workflows and creating and maintaining integrations between data systems, data catalogs, and data quality tools. Additional tasks related to managing metadata, master data, and the business glossary will also be required.

Key Responsibilities:
- Develop and maintain Collibra workflows to support data governance initiatives.
- Create and maintain integrations between data systems and data governance tools.
- Write and maintain data quality rules to measure data quality.
- Work with vendors to troubleshoot and resolve technical issues related to workflows and integrations.
- Work with other teams to ensure adherence to DG policies and standards.
- Assist in implementing data governance initiatives around data quality, master data, and metadata management.

Qualifications:
- Strong programming skills.
- Knowledge of system integration and use of middleware solutions.
- Proficiency in SQL and relational databases.
- Understanding of data governance, including data quality, master data, and metadata management.
- Willingness to learn new tools and skills.

Preferred Qualifications:
- Proficient with Java or Groovy.
- Proficient with Mulesoft or other middleware.
- Proficient with Collibra DIP, Collibra Data Quality, and DQLabs.
- Experience with AWS Redshift and Databricks.
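The "data quality rules" this role writes often reduce to SQL checks such as column completeness or key uniqueness. A small sketch of one such rule against an in-memory SQLite table; the table, column, and sample data are invented, and tools like Collibra Data Quality or DQLabs express equivalent rules in their own rule languages:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (3, "c@x.com"), (4, None)])

# Completeness rule: fraction of rows with a non-null email.
# COUNT(email) skips NULLs while COUNT(*) counts every row.
completeness, = conn.execute(
    "SELECT 1.0 * COUNT(email) / COUNT(*) FROM customer"
).fetchone()
print(f"email completeness: {completeness:.0%}")  # 2 of 4 rows -> 50%
```

A governance tool would compare this measured value against a threshold (say, 95%) and flag the dataset when the rule fails.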
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers and software developers. This role is based out of our Bangalore corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-centric Business Analyst.

Key job responsibilities
This role primarily focuses on deep-dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges.
- Design, develop and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. that will support our analytical and business needs
- Conduct in-depth research into drivers of the Localization business
- Analyze key metrics to uncover trends and root causes of issues
- Suggest and build new metrics and analyses that enable a better perspective on the business
- Capture the right metrics to influence stakeholders and measure success
- Develop domain expertise and apply it to operational problems to find solutions
- Work across teams with different stakeholders to prioritize and deliver data and reporting
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation

Basic Qualifications
- Experience defining requirements and using data and metrics to draw business insights
- Experience with SQL or ETL
- Experience with reporting and data visualization tools such as QuickSight, Tableau, Power BI or other BI packages
- Analytical skills: the ability to start from ambiguous problem statements, identify and access relevant data, make appropriate assumptions, perform insightful analysis and draw conclusions relevant to the business problem.
Preferred Qualifications
- Experience with data visualization using Tableau or similar tools
- Experience in Python
- Exposure to ETL and AWS services such as Redshift, S3, etc.
- Expert-level proficiency in writing complex, highly-optimized SQL queries across large data sets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A2949159
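The "complex, highly-optimized SQL" this role calls for is typically aggregation over a fact table, of the kind a QuickSight or Tableau dashboard sits on. A toy illustration using a window function over an in-memory SQLite table; the table name, columns, and figures are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (day TEXT, region TEXT, revenue REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("2024-01-01", "south", 100.0),
    ("2024-01-02", "south", 150.0),
    ("2024-01-03", "south", 120.0),
])

# Daily revenue plus a running total per region: the shape of query
# behind a trend-line metric on a BI dashboard.
rows = conn.execute("""
    SELECT day,
           revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM orders
    ORDER BY day
""").fetchall()
for day, rev, total in rows:
    print(day, rev, total)
```

On a real warehouse such as Redshift, the optimization work is mostly in distribution/sort keys and scan pruning; the SQL shape stays the same.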
Posted 1 week ago
4.0 - 8.0 years
6 - 9 Lacs
Ahmedabad
On-site
Lead, Software Engineering
Ahmedabad, India; Hyderabad, India | Information Technology | 315765

Job Description
About The Role: Grade Level (for internal use): 11

The Team: The usage reporting team gathers raw usage data from disparate products and produces unified datasets across Market Intelligence departmental lines. We deliver essential intelligence for both public and internal reporting purposes.

The Impact: As Lead Developer for the usage reporting team you will play a key role in delivering essential insights for both public and private users of the S&P Global Market Intelligence platforms. Our data provides the basis for strategy and insight that our team members depend on to deliver essential intelligence for our clients across the world.

What’s in it for you:
- Work with a variety of subject matter experts to develop and improve data offerings
- Exposure to a wide variety of datasets and stakeholders when tackling daily challenges
- Oversee the complete SDLC pipeline, from initial architecture and design through development and support for data pipelines

Responsibilities:
- Produce technical design documents and conduct technical walkthroughs.
- Build and maintain data pipelines in T-SQL, Python, Java, Spark, and SSIS.
- Be part of an agile team that designs, develops, and maintains the enterprise data systems and other related software applications.
- Participate in design sessions for new product features, data models, and capabilities.
- Collaborate with key stakeholders to develop system architectures, API specifications, and implementation requirements.

What We’re Looking For:
- 4-8 years of experience as a Senior Developer with strong experience in Python, Java, Spark, and T-SQL.
- 4-10 years of experience with public cloud platforms (AWS, GCP).
- Experience with frameworks such as Apache Spark, SSIS, Kafka, and Kubernetes.
- 4-10 years of data warehousing experience (Redshift, SSAS Cube, BigQuery).
- A strong self-starter: an independent, self-motivated software engineer.
Strong leadership skills and proven ability to collaborate effectively with engineering leadership and key stakeholders.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 315765 Posted On: 2025-05-29 Location: Ahmedabad, Gujarat, India
Posted 1 week ago
3.0 years
0 Lacs
Andhra Pradesh
On-site
Software Engineering Associate Advisor - HIH - Evernorth

About Evernorth:
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Summary:
Evernorth, a leading Health Services company, is looking for exceptional software engineers/developers in our Data & Analytics organization. The Full Stack Engineer is responsible for delivering a business need end-to-end: understanding the requirements, developing the solution, and deploying the software into production. This role requires you to be fluent in some of the critical technologies, proficient in others, and hungry to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are Ownership and Accountability. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, driving the adoption of CI/CD tools and supporting the improvement of the toolsets and processes.

Behaviors of a Full Stack Engineer:
Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They have ownership over their work tasks, embrace interacting with all levels of the team, and raise challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Key Characteristics:
- Write referenceable and modular code, with experience in all phases of the SDLC
- Hands-on experience developing or using CI/CD solutions for applications
- Be inquisitive, analytical, and investigative, all while keeping the customer in mind
- Be a planner: time management and the ability to see around the corner are essential
- Take ownership and accountability, with a desire to simplify and automate
- Have a quality mindset: not just code quality, but ensuring ongoing data quality by monitoring data to identify problems before they have business impact

Qualifications:
- 3+ years being part of Agile teams (Scrum or Kanban)
- 5+ years of hands-on experience in Python (Python is mandatory)
- 2+ years in web development frameworks such as React, Angular, other Node.js-based systems, Django, React Native, etc.
- 4+ years of REST APIs
- 4+ years of scripting (Python, Shell, etc.)
- 4+ years of experience with any enterprise-level database (Postgres, Db2, TDV, Redshift, MySQL, or any other cloud DB)
- Experience with GitHub / Jenkins
- Excellent analytical and troubleshooting skills
- Strong communication skills
- Fluent in BDD (Behavior-Driven Development) and TDD (Test-Driven Development) methodologies
- Experience with AWS / Azure cloud technologies, Terraform, or CloudFormation is a plus

Location & Hours of Work:
Hyderabad / General Shift (11:30 AM - 8:30 PM IST / 1:00 AM - 10:00 AM EST / 2:00 AM - 11:00 AM EDT)

Equal Opportunity Statement:
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary
The Senior Data Engineer leads complex data engineering projects, designing data architectures that align with business requirements. This role focuses on optimizing data workflows, managing data pipelines, and ensuring the smooth operation of data systems.

Minimum Qualifications
8 years of overall IT experience, with a minimum of 5 years of work experience in the tech skills below.

Tech Skills
- Strong experience in Python scripting and PySpark for data processing
- Proficiency in SQL, dealing with big data over Informatica ETL
- Proven experience in data quality and optimization of a data lake in Iceberg format, with a strong understanding of the architecture
- Experience in AWS Glue jobs
- Experience in the AWS cloud platform and its data services: S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS, EventBridge
- Expertise in Bash shell scripting
- Strong understanding of healthcare data systems and experience leading data engineering teams
- Experience in Agile environments
- Excellent problem-solving skills and attention to detail
- Effective communication and collaboration skills

Responsibilities
- Leads development of data pipelines and architectures that handle large-scale data sets
- Designs, constructs, and tests data architecture aligned with business requirements
- Provides technical leadership for data projects, ensuring best practices and high-quality data solutions
- Collaborates with product, finance, and other business units to ensure data pipelines meet business requirements
- Works with DBT (Data Build Tool) for transforming raw data into actionable insights
- Oversees development of data solutions that enable predictive and prescriptive analytics
- Ensures the technical quality of solutions, managing data as it moves across environments
- Aligns data architecture to Healthfirst solution architecture
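The pipeline work described (PySpark transforms feeding a data lake) commonly follows one pattern: small, independently testable transform steps composed into a run. A dependency-free sketch in plain Python; the step names, validity rule, and record layout are invented, and in a real Glue job each step would be a DataFrame transformation instead of a list function:

```python
from functools import reduce

def drop_invalid(rows):
    # Keep only rows that carry a member id; a placeholder validity rule.
    return [r for r in rows if r.get("member_id")]

def add_age_band(rows):
    # Derive a coarse age band, a typical healthcare-data enrichment step.
    def band(age):
        return "65+" if age >= 65 else "18-64" if age >= 18 else "<18"
    return [{**r, "age_band": band(r["age"])} for r in rows]

def run_pipeline(rows, steps):
    # Apply each step in order, the way an ETL job chains its transforms.
    return reduce(lambda acc, step: step(acc), steps, rows)

raw = [{"member_id": "m1", "age": 70}, {"member_id": None, "age": 30}]
out = run_pipeline(raw, [drop_invalid, add_age_band])
print(out)  # one surviving row, with age_band "65+"
```

Keeping each step a pure function of its input is what makes the data quality checks the posting emphasizes easy to bolt on between stages.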
Posted 1 week ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Data Engineer - SQL, Python
Location: Chennai (Work from Office - 5 Days a Week)
Experience: 7+ Years
Employment Type: Full-time

About the role:
We are looking for a Data Engineer experienced in SQL, dbt, and data modeling, with familiarity with legacy systems. This role requires a hands-on builder with an eye for optimization and system reliability.

Key Responsibilities:
- Develop data pipelines using SQL and dbt for data transformation and integration.
- Migrate and modernize data from legacy systems like SQL Server, Teradata, HANA, or Hadoop.
- Collaborate with cross-functional teams to understand data needs and deliver effective solutions.
- Monitor and optimize data flow performance.
- Maintain data quality, governance, and version control across environments.

Requirements:
- 7+ years of experience in data engineering with SQL and dbt Core.
- Hands-on experience with at least one legacy platform (SQL Server, Teradata, HANA, or Hadoop).
- Strong understanding of data modeling concepts (dimensional, star/snowflake schema).
- Experience or familiarity with ClickHouse is a huge plus; Superset and Tableau are also valued.
- Excellent coding, debugging, and problem-solving skills.

Bonus points:
- Experience with cloud data warehouses (Snowflake, BigQuery, Redshift).

Immediate joiners are preferred and eligible for a joining bonus. Interested candidates, please share your updated resume at anamika@enroutecorp.in
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Manager, Data Visualization, Power Platform Solutions

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location: from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview
As a Manager of the Microsoft Power Platform, you will focus on designing and developing comprehensive solutions across Power Apps, Power BI, Power Automate, and Power Virtual Agents to drive actionable insights and facilitate intuitive information consumption for internal business stakeholders.
The ideal candidate will demonstrate competency in creating user-centric applications, dashboards, automated workflows, and chatbots that empower stakeholders with data-driven insights and support informed decision-making. Our Quantitative Sciences team uses big data to evaluate the safety and efficacy claims of potential medical breakthroughs. We leverage deep scientific knowledge, rigorous statistical analysis, and high-quality data to assess the quality and reliability of clinical studies, thereby supporting data-driven decision-making in clinical trials.

What Will You Do In This Role
- Design and develop user-centric solutions utilizing the full suite of Power Platform tools (Power BI, Power Apps, Power Automate) with complex data sources.
- Identify and define key business metrics and KPIs in partnership with business stakeholders.
- Define and develop scalable data models with alignment and support from data engineering and IT teams.
- Lead UI/UX workshops to develop user stories, wireframes, and intuitive applications and workflows.
- Collaborate with data engineering, data science, and IT teams to deliver comprehensive business solutions including dashboards, applications, and automated workflows.
- Apply best practices in solution design across Power Platform components and continuously improve the user experience for business stakeholders.
- Provide thought leadership on Power Platform best practices to the broader Data & Analytics organization.
- Identify opportunities to apply Power Platform technologies to streamline and enhance manual/legacy processes and reporting deliveries.
- Provide training and coaching to internal stakeholders to enable a self-service operating model across the Power Platform.
- Co-create information governance and apply data privacy best practices to Power Platform solutions.
- Continuously innovate on best practices and technologies within the Power Platform by reviewing external resources and marketplace trends.
What Should You Have
- 5 years of relevant experience designing and implementing solutions using Power Platform tools.
- Experience and knowledge of Power Platform technologies including Power BI, Power Apps, and Power Automate.
- Experience with other data visualization technologies such as Qlik, Spotfire, and Tableau is a plus.
- Experience and knowledge of ETL processes, data modeling techniques, and platforms (Alteryx, Informatica, Dataiku, etc.).
- Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.).
- Experience in leveraging and managing third-party vendors and contractors.
- Self-motivation, proactivity, and the ability to work independently with minimal direction.
- Excellent interpersonal and communication skills.
- Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively.
- Demonstrated ability to collaborate with and lead diverse groups of colleagues and positively manage ambiguity.
- Experience in the Pharma and/or Biotech industry is a plus.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid

Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs

Job Posting End Date: 06/15/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.

Requisition ID: R342329
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
India
On-site
Precisely is the leader in data integrity. We empower businesses to make more confident decisions based on trusted data through a unique combination of software, data enrichment products and strategic services. What does this mean to you? For starters, it means joining a company focused on delivering outstanding innovation and support that helps customers increase revenue, lower costs and reduce risk. In fact, Precisely powers better decisions for more than 12,000 global organizations, including 93 of the Fortune 100. Precisely's 2,500 employees are unified by four company core values that are central to who we are and how we operate: Openness, Determination, Individuality, and Collaboration. We are committed to career development for our employees and offer opportunities for growth, learning and building community. With a "work from anywhere" culture, we celebrate diversity in a distributed environment, with a presence in 30 countries and 20 offices across 5 continents. Learn more about why it's an exciting time to join Precisely!

Intro and Job Overview
As a Senior Software Engineer II, you will join a team working with next-gen technologies on geospatial solutions in order to identify areas for future growth, new customers and new markets in the geocoding and data integrity space. You will be working on the distributed computing platform to migrate the existing geospatial dataset creation process and bring more value to Precisely's customers and grow market share.

Responsibilities and Duties
- Work on the distributed computing platform to migrate the existing geospatial data processes, including SQL scripts and Groovy scripts.
- Apply strong concepts in object-oriented programming and development languages (Java, SQL, Groovy/Gradle/Maven).
- Work closely with domain/technical experts and drive the overall modernization of the existing processes.
- Drive and maintain the AWS infrastructure and other DevOps processes.
- Participate in design and code reviews within a team environment to eliminate errors early in the development process.
- Participate in problem determination and debugging of software product issues, using technical skills and tools to isolate the cause of the problem in an efficient and timely manner.
- Provide documentation needed to thoroughly communicate software functionality.
- Present technical features of the product to customers and stakeholders as required.
- Ensure timelines and deliverables are met.
- Participate in the Agile development process.

Requirements and Qualifications
- UG: B.Tech/B.E. or PG: M.S./M.Tech in Computer Science, Engineering or a related discipline
- At least 5-7 years of experience implementing and managing geospatial solutions
- Expert-level programming in Java and Python; Groovy experience is preferred
- Expert at writing optimized SQL queries, procedures, and database objects to support data extraction and manipulation
- Strong concepts in object-oriented programming and development languages (Java, SQL, Groovy/Gradle/Maven)
- Expert in script automation with Gradle and Maven
- Problem solving and troubleshooting: proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively
- Experience with SQL, data warehousing and data engineering concepts
- Experience with AWS-platform big data technologies (IAM, EC2, S3, EMR, Redshift, Lambda, Aurora, SNS, etc.)
- Strong analytical, problem-solving, data analysis and research skills
- Good knowledge of continuous build integration (Jenkins and GitLab pipelines)
- Experience with agile development and working with agile engineering teams
- Excellent interpersonal skills
- Knowledge of microservices and cloud-native frameworks
- Knowledge of Machine Learning / AI
- Knowledge of Python
The personal data that you provide as a part of this job application will be handled in accordance with relevant laws. For more information about how Precisely handles the personal data of job applicants, please see the Precisely Global Applicant and Candidate Privacy Notice.
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary We are looking for an experienced and proactive ETL Lead to oversee and guide our ETL testing and data validation efforts. This role requires a deep understanding of ETL processes, strong technical expertise in tools such as SQL, Oracle, MongoDB, AWS, and Python/Pyspark, and proven leadership capabilities. The ETL Lead will be responsible for ensuring the quality, accuracy, and performance of our data pipelines while mentoring a team of testers and collaborating with cross-functional stakeholders. Job Description Key Responsibilities: Lead the planning, design, and execution of ETL testing strategies across multiple projects. Oversee the development and maintenance of test plans, test cases, and test data for ETL processes. Ensure data integrity, consistency, and accuracy across all data sources and destinations. Collaborate with data engineers, developers, business analysts, and project managers to define ETL requirements and testing scope. Mentor and guide a team of ETL testers, providing technical direction and support. Review and approve test deliverables and ensure adherence to best practices and quality standards. 
Identify and resolve complex data issues, bottlenecks, and performance challenges. Drive continuous improvement in ETL testing processes, tools, and methodologies. Provide regular status updates, test metrics, and risk assessments to stakeholders. Stay current with emerging trends and technologies in data engineering and ETL testing. Requirements 6+ years of experience in ETL testing, with at least 2 years in a lead or senior role. Strong expertise in ETL concepts, data warehousing, and data validation techniques. Hands-on experience with Oracle, MongoDB, AWS services (e.g., S3, Redshift, Glue), and Python/Pyspark scripting. Advanced proficiency in SQL and other query languages. Proven ability to lead and mentor a team of testers. Excellent problem-solving, analytical, and debugging skills. Strong communication and stakeholder management abilities. Experience with Agile/Scrum methodologies is a plus. Ability to manage multiple priorities and deliver high-quality results under tight deadlines. Disclaimer This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. 
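The data-integrity duties above (ensuring consistency between data sources and destinations) often reduce to a reconciliation check after a load. The sketch below is a hedged, minimal example: the `orders` table, its columns, and the fingerprint approach are invented for illustration, with SQLite standing in for Oracle or Redshift.

```python
import sqlite3
import hashlib

def table_fingerprint(conn, table, key):
    """Row count plus a hash of the ordered rows, for ETL reconciliation."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY {key}").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

def validate_load(src_conn, tgt_conn, table, key):
    """True when the target table matches the source exactly."""
    return table_fingerprint(src_conn, table, key) == table_fingerprint(tgt_conn, table, key)

src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
print(validate_load(src, tgt, "orders", "id"))  # matches after a clean load
tgt.execute("UPDATE orders SET amount = 21.0 WHERE id = 2")  # simulate a bad load
print(validate_load(src, tgt, "orders", "id"))
```

In practice a lead would wire checks like this into the test suite per table, alongside null-rate and referential-integrity assertions.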
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 7-10 Years
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are looking for a Data Engineer SDE-3 who can take ownership of designing, developing, and maintaining scalable and reliable data pipelines. You will play a critical role in shaping the data infrastructure that powers business insights, product intelligence, and scalable learning platforms at PW. Roles Open: Data Engineer SDE-2 and Data Engineer SDE-3 Location: Noida & Bangalore Key Responsibilities: Design and implement scalable, efficient ETL/ELT pipelines to ingest, transform, and process data from multiple sources. Architect and maintain robust data lake and warehouse solutions, aligning with business and analytical needs. Own the development and optimization of distributed data processing systems using Spark, AWS EMR, or similar technologies. Collaborate with cross-functional teams (data science, analytics, product) to gather requirements and implement data-driven solutions. Ensure high levels of data quality, security, and availability across systems. Evaluate emerging technologies and tools for data processing and workflow orchestration. Build reusable components, libraries, and frameworks to enhance engineering efficiency and reliability. Drive performance tuning, cost optimization, and automation of data infrastructure. Mentor junior engineers, review code, and set standards for development practices. Required Skills & Qualifications: 5+ years of professional experience in data engineering or backend systems with a focus on scalable systems. Strong hands-on experience with Python or Scala, and writing efficient, production-grade code. Deep understanding of data engineering concepts: data modeling, data warehousing, data lakes, streaming vs. batch processing, and metadata management. Solid experience with AWS (S3, Redshift, EMR, Glue, Lambda) or equivalent cloud platforms. Experience working with orchestration tools like Apache Airflow (preferred) or similar (Azkaban, Luigi).
Proven expertise in working with big data tools such as Apache Spark, and managing Kubernetes clusters. Proficient in SQL and working with both relational (Postgres, Redshift) and NoSQL (MongoDB) databases. Ability to understand API-driven architecture and integrate with backend services as part of data pipelines. Strong problem-solving skills, with a proactive attitude towards ownership and continuous improvement.
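The batch ETL work this posting describes can be sketched, in miniature, as an extract-transform-load pipeline. This is only an illustrative sketch in plain Python (the CSV layout, field names, and quality rules are invented); a production pipeline would run the same shape of steps on Spark or EMR with an orchestrator such as Airflow scheduling them.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse raw CSV into dict records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records):
    """Transform: cast types, drop malformed rows, derive a field."""
    out = []
    for r in records:
        try:
            score = float(r["score"])
        except (KeyError, ValueError):
            continue  # data-quality rule: skip rows without a numeric score
        out.append({"user": r["user"].strip().lower(), "passed": score >= 0.5})
    return out

def load(records, sink):
    """Load: append to the destination (a list stands in for a warehouse table)."""
    sink.extend(records)
    return len(records)

raw = "user,score\nAlice,0.9\nBob,oops\nCara,0.4\n"
table = []
loaded = load(transform(extract(raw)), table)
print(loaded, table)
```

Keeping each stage a pure function of its input is what makes the pipeline testable and easy to re-run on a failed batch.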
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Are you ready for the next step in your engineering career? Would you enjoy working on our cutting-edge products? About The Team The Marketing Data Hub team is responsible for managing the organization’s marketing data systems, ensuring seamless integration, scalability, and accessibility of marketing data across the enterprise. The team enables strategic decision-making by delivering reliable, optimized, and actionable data solutions for marketing initiatives, campaigns, and performance analysis. About The Role This role is designed for a skilled software engineer with expertise in database management and development, focusing on Amazon Redshift and data optimization within a marketing data hub system. In addition to Redshift, familiarity with other database platforms (e.g., PostgreSQL, MySQL, Oracle, or SQL Server) is beneficial. You will design and implement robust data pipelines, optimize query performance, and support data migration tasks while ensuring data accuracy and consistency. You will collaborate closely with stakeholders, work on cross-functional teams, and contribute to the continuous improvement of data systems. Responsibilities Develop, enhance, and maintain database solutions for the Marketing Data Hub using Amazon Redshift as the primary platform. Design and implement optimized data pipelines and ETL processes to support marketing data workflows. Write and optimize complex SQL queries for data manipulation, reporting, and analytics. Collaborate with stakeholders to understand business requirements and translate them into scalable database solutions. Support data migration efforts and ensure smooth data integration between various systems, including Redshift and other databases like PostgreSQL, MySQL, or Oracle. Monitor and troubleshoot database performance, identifying bottlenecks and implementing improvements. Ensure data quality, accuracy, and integrity through rigorous validation processes. 
Document database schemas, configurations, and processes to ensure transparency and maintain compliance. Stay updated with industry trends and recommend improvements to database technologies and practices. Requirements Minimum two years of experience in database design, development, and management with a strong focus on Amazon Redshift. Advanced SQL skills with the ability to write and optimize complex queries. Experience designing and managing data pipelines and ETL processes. Solid understanding of data modeling, data warehousing, and performance tuning in Redshift. Familiarity with additional database platforms such as PostgreSQL, MySQL, Oracle, or SQL Server. Knowledge of data migration and integration strategies across different database technologies. Familiarity with cloud-based solutions and platforms (AWS preferred). Experience working in collaborative Agile environments and handling production support tasks. Strong problem-solving and troubleshooting skills with a keen attention to detail. Excellent communication skills to collaborate with internal teams and stakeholders effectively. Familiarity with AWS services such as S3, Lambda, and Glue. Experience with Python or other scripting languages for automation and database scripting. Knowledge of marketing systems or data workflows in marketing environments. Understanding of data visualization tools (e.g., Tableau, Power BI) for reporting purposes. Bachelor’s degree in computer science, Information Systems, or a related field. Proven experience working with Amazon Redshift and other large-scale database systems. Demonstrated ability to optimize database systems and improve performance. Work in a Way that Works for You We promote a healthy work/life balance with numerous wellbeing initiatives, shared parental leave, study assistance, and sabbaticals to help you meet your immediate responsibilities and long-term goals. Working for You Comprehensive Health Insurance for you, your immediate family, and parents. 
Enhanced Health Insurance Options at competitive rates. Group Life Insurance for Financial Security. Group Accident Insurance for extra protection. Flexible Working Arrangement for a harmonious work-life balance. Employee Assistance Program for personal and work-related support. Medical Screening for your well-being. Modern Family Benefits include maternity, paternity, and adoption support. Long Service Awards recognize dedication and commitment. New Baby Gift celebrating parenthood. Various Paid Time Off options including Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays. About Business LexisNexis Legal & Professional® provides legal, regulatory, and business information and analytics that help customers increase their productivity, improve decision-making, achieve better outcomes, and advance the rule of law around the world. As a digital pioneer, the company was the first to bring legal and business information online with its Lexis® and Nexis® services.
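One example of the complex-query work this role describes is a "latest record per key" query over marketing events. The sketch below is illustrative only: the `events` table and its columns are invented, and SQLite stands in for Amazon Redshift, but the `ROW_NUMBER()` window pattern is the same in Redshift, PostgreSQL, and SQLite.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (customer TEXT, ts INTEGER, channel TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("c1", 1, "email"), ("c1", 3, "web"), ("c2", 2, "sms"),
])

# Latest touchpoint per customer: rank each customer's events newest-first,
# then keep only the top-ranked row.
latest = conn.execute("""
    SELECT customer, channel FROM (
        SELECT customer, channel,
               ROW_NUMBER() OVER (PARTITION BY customer ORDER BY ts DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(latest)
```

On Redshift the same query benefits from a sort key on `(customer, ts)`, which keeps the window ordering cheap.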
Posted 1 week ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
LotusFlare is a provider of cloud-native SaaS products based in the heart of Silicon Valley. Founded by the team that helped Facebook reach over one billion users, LotusFlare was founded to make affordable mobile communications available to everyone on Earth. Today, LotusFlare focuses on designing, building, and continuously evolving a digital commerce and monetization platform that delivers valuable outcomes for enterprises. Our platform, Digital Network Operator® (DNO™) Cloud, is licensed to telecommunications services providers and supports millions of customers globally. LotusFlare has also designed and built the leading eSIM travel product - Nomad. Nomad provides global travelers with high-speed, affordable data connectivity in over 190 countries. Nomad is available as an iOS or Android app or via getnomad.app. Description: DevOps Support Engineers at LotusFlare guarantee the quality of the software solutions produced by LotusFlare by monitoring, responding to incidents and testing and quality checking. With shared accountability and code ownership, DevOps Support Engineers take on-call responsibilities and incident management work. Through these activities, LotusFlare developers write code that better fits into their applications and infrastructure, helping them proactively deepen the reliability of services being deployed. DevOps Support Engineers will test the functionality of the code to bring out every flaw and to improve on the underperformance of every standalone feature. This role is always on the lookout for opportunities to improve any and every feature to bring customer satisfaction. Partnered alongside the best engineers in the industry on the coolest stuff around, the code and systems you work on will be in production and used by millions of users all around the world. Our team comprises engineers with varying levels of experience and backgrounds, from new grads to industry veterans.
Relevant industry experience is important (Site Reliability Engineer (SRE), Systems Engineer, Software Engineer, DevOps Engineer, Network Engineer, Systems Administrator, Linux Administrator, Database Administrator, or similar role), but ultimately less so than your demonstrated abilities and attitude. Responsibilities: Monitoring backend services (cloud-based infrastructure) Supporting, troubleshooting, and investigating issues and incidents (supporting developers and the infra team with system metrics analysis, logs, traffic, configuration, deployment changes, etc.) Supporting and improving monitoring/alerting systems (searching, testing, and deploying new functionality for existing tools) Creating new features for automating the troubleshooting and investigation process Creating new tools to improve the support process Drafting reports and summarizing information after investigations and incidents Requirements: At least 1 year of work experience with similar responsibilities Strong knowledge and practical experience with the Linux (Ubuntu) command line and administration Understanding of network protocols and troubleshooting (TCP/IP, UDP) Strong scripting skills (Bash, Python) Critical thinking and problem solving Understanding of containerization (Docker, containers) Experience with troubleshooting API-driven services Experience with Kubernetes Experience with Git Background in release management processes English - professional written and verbal skills Good to have: Prometheus, Grafana, Kibana (query language) Experience with Nginx/OpenResty Experience with telco protocols (CAMEL, MAP, Diameter) is an advantage Software development/scripting skills Basic knowledge of Cassandra, PostgreSQL Experience with AWS cloud services (EC2, Redshift, S3, RDS, ELB/ALB, ElastiCache, Direct Connect, Route 53, Elastic IPs, etc.)
CI/CD: Jenkins Terraform Recruitment Process: HR Interview followed by 4-5 Levels of Technical Interviews About: At LotusFlare, we attract and keep amazing people by offering two key things: Purposeful Work: Every team member sees how their efforts make a tangible, positive difference for our customers and partners Growth Opportunities: We provide the chance to develop professionally while mastering cutting-edge practices in cloud-native enterprise software From the beginning, our mission has been to simplify technology to create better experiences for customers. Using an “experience down” approach, which prioritizes the customer's journey at every stage of development, our Digital Network Operator™ Cloud empowers communication service providers to achieve valuable business outcomes. DNO Cloud enables communication service providers to innovate freely, reduce operational costs, monetize network assets, engage customers on all digital channels, drive customer acquisition, and increase retention. With headquarters in Santa Clara, California, and five major offices worldwide, LotusFlare serves Deutsche Telekom, T-Mobile, A1, Globe Telecom, Liberty Latin America, Singtel, and other leading enterprises around the world. Website: www.lotusflare.com LinkedIn: https://www.linkedin.com/company/lotusflare Instagram: https://www.instagram.com/lifeatlotusflare/ Twitter: https://twitter.com/lotus_flare
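The monitoring and alerting-automation responsibilities above might look, in miniature, like the log-scanning check below. This is a hypothetical sketch: the log format and threshold are invented, and a real deployment would feed such signals into Prometheus/Grafana rather than return a list.

```python
from collections import Counter

def error_rate_alerts(log_lines, threshold=0.5):
    """Flag services whose share of ERROR lines exceeds the threshold."""
    totals, errors = Counter(), Counter()
    for line in log_lines:
        # Assumed log shape: "<LEVEL> <service> <message>"
        level, service, _ = line.split(" ", 2)
        totals[service] += 1
        if level == "ERROR":
            errors[service] += 1
    return sorted(s for s in totals if errors[s] / totals[s] > threshold)

logs = [
    "INFO api request served",
    "ERROR api upstream timeout",
    "ERROR api upstream timeout",
    "INFO billing invoice issued",
]
print(error_rate_alerts(logs))
```

A rate-based check like this is less noisy than alerting on every error line, since a busy service with a handful of failures stays below the threshold.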
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a Data Engineer, you will play a crucial role in designing, building, and maintaining the data infrastructure and systems required for efficient and reliable data processing. You will collaborate with cross-functional teams, including data scientists and analysts, to ensure the availability, integrity, and accessibility of data for various business needs. This role requires a strong understanding of data management principles, database technologies, data integration, and data warehousing concepts.
Key Responsibilities Develop and maintain data warehouse solutions, including data modeling, schema design, and indexing strategies Optimize data processing workflows for improved performance, reliability, and scalability Identify and integrate diverse data sources, both internal and external, into a centralized data platform Implement and manage data lakes, data marts, or other storage solutions as required Ensure data privacy and compliance with relevant data protection regulations Define and implement data governance policies, standards, and best practices Transform raw data into usable formats for analytics, reporting, and machine learning purposes Perform data cleansing, normalization, aggregation, and enrichment operations to enhance data quality and usability Collaborate with data analysts and data scientists to understand data requirements and implement appropriate data transformations What You'll Bring Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field Proficiency in SQL and experience with relational databases (e.g., Snowflake, MySQL, PostgreSQL, Oracle) 3+ years of experience in data engineering or a similar role Hands-on programming skills in languages such as Python or Java is a plus Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP) and related services (e.g., S3, Redshift, BigQuery) is good to have Knowledge of data modeling and database design principles Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus Strong problem-solving and analytical skills with attention to detail Experience with HR data analysis and HR domain knowledge is preferred Who You'll Work With As part of the People Analytics team, you will modernize HR platforms, capabilities & engagement, automate/digitize core HR processes and operations and enable greater efficiency. 
You will collaborate with the global people team and colleagues across BCG to manage the life cycle of all BCG employees. The People Management Team (PMT) is comprised of several centers of expertise including HR Operations, People Analytics, Career Development, Learning & Development, Talent Acquisition & Branding, Compensation, and Mobility. Our centers of expertise work together to build out new teams and capabilities by sourcing, acquiring and retaining the best, diverse talent for BCG’s Global Services Business. We develop talent and capabilities, while enhancing managers’ effectiveness, and building affiliation and engagement in our new global offices. The PMT also harmonizes process efficiencies, automation, and global standardization. Through analytics and digitalization, we are always looking to expand our PMT capabilities and coverage. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer – C10/Officer (India) The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines.
Proficiency in at least one of the data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Basics of job schedulers like Autosys. Basics of entitlement management Certification on any of the above topics would be an advantage.
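The "Data Quality & Controls" exposure listed above (validation, cleansing, enrichment) can be illustrated with a small accept/reject control step. All field names and rules here are invented for illustration; a real pipeline would route the rejects to a quarantine table for remediation.

```python
def apply_controls(records):
    """Split records into accepted and rejected, applying simple data controls."""
    accepted, rejected = [], []
    for rec in records:
        issues = []
        if not rec.get("account_id"):
            issues.append("missing account_id")
        if not isinstance(rec.get("balance"), (int, float)):
            issues.append("non-numeric balance")
        if issues:
            # Keep the bad record with its reasons, rather than silently dropping it.
            rejected.append({**rec, "issues": issues})
        else:
            # Cleansing step: normalize balances to two decimal places.
            accepted.append({**rec, "balance": round(float(rec["balance"]), 2)})
    return accepted, rejected

good, bad = apply_controls([
    {"account_id": "A1", "balance": 10.567},
    {"account_id": "", "balance": "n/a"},
])
print(len(good), len(bad), bad[0]["issues"])
```

Recording *why* a row was rejected is the control part: it turns a silent data loss into an auditable event.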
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary Business Technology Analyst – Global Employer Services Technology Center Deloitte Tax Services India Private Limited (“Deloitte Tax in India”) commenced operations in June 2004. Since then, nearly all of the Deloitte Tax LLP (“Deloitte Tax”) U.S. service lines and regions have obtained support services through Deloitte Tax in India. We provide support through the tax transformation taking place in the marketplace. We offer a broad range of fully integrated tax services by combining technology and tax technical resources to uncover insights and smarter solutions for navigating an increasingly complex global environment. We provide opportunities to transform tax operations using contemporary technologies in the market. Individuals work to transform their current state of tax to the next generation of tax functions. Are you ready to take the next step in your career to find new methods and processes to assist clients in improving their tax operations using new technologies? If the answer is “Yes,” come join Global Employer Services Technology Center (GESTC) The Team Organizations today are faced with an increasingly complex global talent landscape. The workforce is more agile, diversified and on demand, leading organizations to re-evaluate their talent models and how they deploy teams globally. An ever-changing geo-political landscape and new tax digital strategies create opportunities for Deloitte to ensure we provide innovative solutions to keep our clients compliant. Global Employer Services (GES) is a market leading ~USD 1.3 billion business with a prestigious client portfolio delivering mobility, reward and compliance services enabled through technology solutions. We are offering a unique opportunity to join our GES Technology team of ~200 professionals worldwide. 
This high-performing, successful team creates innovative new technology products to enable GES services where you will have the platform to drive, influence and contribute to the success of our business. Job purpose: The Data Analytics Application Developer (SQL, SSIS) partners with customers and the teams that achieve the goals of clients/customers. You will be working with cutting-edge technology, databases, and visualization via dashboards. The important skills for this position are SQL, Microsoft SSIS, data extraction, data modeling, data transformation, and DBA skills. The successful candidate will have a high level of attention to detail, the ability to execute and deliver project deliverables on budget and on time, and the ability to multi-task in a dynamic environment. This position requires significant customer contact, and you must possess excellent communication, consulting, critical thinking, quantitative analysis, and probing skills to effectively manage client expectations. Applicants should be able to function in a close team environment and communicate within the team.
Key job responsibilities: Developing and maintaining reporting and analytical tools, including dashboards Working with several large, complex SQL databases Experience working in SSRS and writing complex stored procedures Knowledge of Bold reports will be advantageous Experience working on Redshift and Aurora will be beneficial Wrangling data from multiple sources to create integrated views that can be used to drive decision making Participating in the design and execution of qualitative or quantitative analyses to help clients with relevant insights Partnering with the technology teams to deliver a robust reporting platform Working with business owners to identify information needs and develop reports/dashboards Performing unit and system level testing on applications Education/Background: BTech/BSc in computer science or information technology Key skills desired 2 to 3 years of experience working with SSRS Strong knowledge of relational databases such as SQL Server, Oracle Good to have knowledge on any analytics tool (QlikView, QlikSense, Tableau) Knowledge of HTML, XML, JSON, Postman, REST API, MS Excel is a plus. Ability to develop large scale web/database applications Ability to simultaneously work on multiple projects effectively Ability to communicate clearly with business users and project manager Ability to innovate and provide functional applications with intuitive interfaces Ability to interact with individuals at all levels of the organization Ability to share knowledge and work effectively in a team Consistently meet client expectations and project deadlines Good interpersonal, organizational skills Strong commitment to client service excellence Work Location: Hyderabad Shift Timings: 11:00 AM to 8:00 PM || 2:00 PM to 11:00 PM #CA-GSD #CA-HPN Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte.
Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304048
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential. The Team Deloitte’s practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Learn more about Analytics and Information Management Practice. Role Purpose Our purpose in the B&PB Data Engineering team is to develop and deliver best-in-class data assets and manage data domains for our Business and Private Banking customers and colleagues—seamlessly and reliably every time. We are passionate about simplicity and meeting the needs of our stakeholders, while continuously innovating to drive value. As a Data Engineer, you will bring strong expertise in data handling and curation to the team. You will be responsible for building reusable datasets and improving the way we work within BPB. This role requires a strong team player mindset and a focus on contributing to the team’s overall success.
Job Title: Analyst Data Engineer Division: Business and Private Banking (BPB) Team Name: Data and Analytics, Data Analytics & Strategy Execution Reporting to (People Leader Position): Manager, Data Engineering Location: Gurgaon Core Responsibilities Manage ETL jobs and ensure data requirements from BPB reporting and business teams are met. Assist with operational data loads and support ongoing data ingestion processes. Translate business requirements into technical specifications. Work alongside senior data engineers to deliver scalable, efficient data solutions. Key Role Responsibilities Actively participate in the development, testing, deployment, monitoring, and refinement of data services. Manage and resolve incidents/problems; apply fixes and resolve systematic issues. Collaborate with stakeholders to triage issues and implement solutions that restore productivity. Risk Proactively manage risk in accordance with all policy and compliance requirements. Perform appropriate controls and adhere to all relevant processes and procedures. Promptly escalate any events, issues, or breaches as they are identified. Understand and own the risk responsibilities associated with the role. Accountabilities Build effective working relationships with BPB teams to ensure alignment with the overall Data Analytics Strategy. Deliver ETL pipelines that meet business reporting and data needs. Orchestrate and automate data workflows to ensure timely and reliable dataset delivery. Translate business goals into technical data engineering requirements. People Accountability: Individual Contributor Number of Direct Reports: 0 Essential Capabilities Individuals with a minimum of 1–2 years of experience in a similar data engineering or technical role. A tertiary qualification in Computer Science or a related discipline. Critical thinkers who use networks, knowledge, and data to drive better outcomes for the business and customers.
Continuous improvers who challenge the status quo and advocate for better solutions. Team players who value diverse skills and perspectives. Customer-focused individuals who define problems and develop solutions based on stakeholder needs. Required Technical Skills: Experience with design, build, and implementation of data engineering pipelines using: SQL Python Airflow Databricks (or Snowflake) Experience with cloud-based data solutions (preferably AWS). Familiarity with on-premises data environments such as Oracle. Strong development and performance tuning skills with RDBMS platforms including: Oracle Teradata Snowflake Redshift Our purpose Deloitte is led by a purpose: To make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary Business Technology Analyst – Global Employer Services Technology Center Deloitte Tax Services India Private Limited (“Deloitte Tax in India”) commenced operations in June 2004. Since then, nearly all of the Deloitte Tax LLP (“Deloitte Tax”) U.S. service lines and regions have obtained support services through Deloitte Tax in India. We provide support through the tax transformation taking place in the marketplace. We offer a broad range of fully integrated tax services by combining technology and tax technical resources to uncover insights and smarter solutions for navigating an increasingly complex global environment. We provide opportunities to transform tax operations using contemporary technologies in the market. Individuals work to transform their current state of tax to the next generation of tax functions. Are you ready to take the next step in your career to find new methods and processes to assist clients in improving their tax operations using new technologies? If the answer is “Yes,” come join the Global Employer Services Technology Center (GESTC). The Team Organizations today are faced with an increasingly complex global talent landscape. The workforce is more agile, diversified and on demand, leading organizations to re-evaluate their talent models and how they deploy teams globally. An ever-changing geo-political landscape and new tax digital strategies create opportunities for Deloitte to ensure we provide innovative solutions to keep our clients compliant. Global Employer Services (GES) is a market-leading ~USD 1.3 billion business with a prestigious client portfolio delivering mobility, reward and compliance services enabled through technology solutions. We are offering a unique opportunity to join our GES Technology team of ~200 professionals worldwide.
This high-performing, successful team creates innovative new technology products to enable GES services where you will have the platform to drive, influence and contribute to the success of our business. Job purpose: The Data Analytics Application Developer (SQL, SSIS) partners with customers and the teams that achieve the goals of clients/customers. You will be working with cutting-edge technology, databases, and visualization via dashboards. The important skills for this position are SQL, Microsoft SSIS, data extraction, data modeling, data transformation, and DBA skills. The successful candidate will have a high level of attention to detail, the ability to execute and deliver project deliverables on budget and on time, and the ability to multi-task in a dynamic environment. This position requires significant customer contact, and you must possess excellent communication, consulting, critical thinking, quantitative analysis, and probing skills to effectively manage client expectations. Applicants should be able to function in a close team environment and communicate within the team. You will also be responsible for managing a team, assigning work, and reporting work status back to the Product team.
Key job responsibilities: Developing and maintaining reporting and analytical tools, including dashboards Working with several large, complex SQL databases Experience working in SSRS and writing complex stored procedures Knowledge of Bold reports will be advantageous Experience working on Redshift and Aurora will be beneficial Wrangling data from multiple sources to create integrated views that can be used to drive decision making Participating in the design and execution of qualitative or quantitative analyses to help clients with relevant insights Partnering with the technology teams to deliver a robust reporting platform Working with business owners to identify information needs and develop reports/dashboards Performing unit and system level testing on applications Experience managing a team Setting tasks for the team Reporting work status to Product team Reviewing reports developed by other team members Education/Background: BTech/BSc in computer science or information technology Key skills desired 3 to 5 years of experience working with SSRS Strong knowledge of relational databases such as SQL Server, Oracle Good to have knowledge on any analytics tool (QlikView, QlikSense, Tableau) Knowledge of HTML, XML, JSON, Postman, REST API, MS Excel is a plus.
Ability to develop large scale web/database applications Ability to simultaneously work on multiple projects effectively Ability to communicate clearly with business users and project manager Ability to innovate and provide functional applications with intuitive interfaces Ability to interact with individuals at all levels of the organization Ability to share knowledge and work effectively in a team Consistently meet client expectations and project deadlines Good interpersonal, organizational skills Strong commitment to client service excellence Work Location: Hyderabad Shift Timings: 11:00 AM to 8:00 PM || 2:00 PM to 11:00 PM #CA-GSD Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. 
Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304050
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: AWS Data Engineer Work timing: 2 pm to 11 pm Work Mode: Hybrid Work Location: Chennai & Hyderabad Primary Skills: Data Engineer & AWS Detailed JD Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources. Can optimize data models for performance and efficiency. Able to write SQL queries to support data analysis and reporting. Design, implement, and maintain the data architecture for all AWS data services. Work with stakeholders to identify business needs and requirements for data-related projects. Design and implement ETL processes to load data into the data warehouse. Responsibility We are seeking a highly skilled Senior AWS Developer to join our team as a Senior Consultant. With a primary focus on Pega and SQL, the ideal candidate will also have experience with Agile methodologies. As a Senior AWS Developer, you will be responsible for optimizing data models for performance and efficiency, writing SQL queries to support data analysis and reporting, and designing and implementing ETL processes to load data into the data warehouse. You will also work with stakeholders to identify business needs and requirements for data-related projects and design and maintain the data architecture for all AWS data services. The ideal candidate will have at least 5 years of work experience and be comfortable working in a hybrid setting.
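To give a concrete flavor of the "SQL queries to support data analysis and reporting" part of this role, the sketch below runs an aggregate reporting query against an in-memory SQLite database. The `orders` table and its columns are purely hypothetical; SQLite stands in for Athena or Redshift, where the same SQL shape would be run against warehouse tables.

```python
import sqlite3

# Hypothetical reporting table; in practice this would be an Athena or
# Redshift table populated by a Glue/DMS pipeline.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("south", 120.0), ("south", 80.0), ("north", 50.0)],
)

# A simple revenue-by-region aggregate supporting a report or dashboard.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('south', 200.0), ('north', 50.0)]
```

The same `GROUP BY` / `ORDER BY` pattern carries over unchanged to warehouse engines; only the connection layer differs.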
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description Amazon has an exciting opportunity for a Business Intelligence Engineer to join our online retail team. The Retail team operates as a merchant in Amazon; the team owns functions like merchandising, marketing, inventory management, vendor management and program management as core functions. In this pivotal role, you’ll be supporting these functions with business intelligence you derive from our vast array of data and will play a role in the long term growth and success of Amazon in the APAC region. You will be working with stakeholders from Pricing Program to contribute to Amazon’s Pricing strategies, partnering with Vendor and Inventory managers to help improve product cost structures, supporting the marketing team to build their strategies by using extremely large volumes of complex data. You will be exploring datasets, writing complex SQL queries, building data pipelines and data visualization solutions with AWS QuickSight. You will also be building new Machine Learning models to predict the outcomes of key inputs. Key job responsibilities As a BI Engineer in the APAC Retail BI team, you will build constructive partnerships with key stakeholders that enable your business understanding and ability to develop true business insights and recommendations. You’ll have the opportunity to work with other BI experts locally and internationally to learn and develop best practices, always applying a data-driven approach. Amazon is widely known for our obsession over customers. In this role your stakeholders will be counting on you to help us understand customer behaviour and improve our offerings. This role does include periodic reporting responsibilities, but it’s really much more diverse than that. If this role is right for you, you will enjoy the challenge of pivoting between ad-hoc pieces of analysis, reporting enhancement, new builds as well as working on long-term strategic projects to enhance the BI & Analytics capabilities in Amazon.
Basic Qualifications 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, QuickSight, or similar tools Experience with one or more industry analytics visualization tools (e.g. Excel, Tableau, QuickSight, MicroStrategy, PowerBI) and statistical methods (e.g. t-test, Chi-squared) Experience with a scripting language (e.g., Python, Java, or R) Preferred Qualifications Master's degree or advanced technical degree Knowledge of data modeling and data pipeline design Experience with statistical analysis, correlation analysis Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ASCSPL - Karnataka Job ID: A3001827
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager Information Systems What You Will Do Let’s do this. Let’s change the world. In this vital role you will develop an insight-driven sensing capability with a focus on revolutionizing decision making. In this role you will lead the technical delivery for this capability as part of a team of data engineers and software engineers. The team will rely on your leadership to own and refine the vision, feature prioritization, partner alignment, and experience leading solution delivery while building this ground-breaking new capability for Amgen. You will drive the software engineering side of the product release and will deliver for the outcomes. Roles & Responsibilities: Lead delivery of overall product and product features from concept to end-of-life management of the product team comprising technical engineers, product owners and data scientists to ensure that business, quality, and functional goals are met with each product release Drives excellence and quality for the respective product releases, collaborating with partner teams.
Impacts quality, efficiency and effectiveness of own team. Has significant input into priorities. Incorporate and prioritize feature requests into product roadmap; able to translate roadmap into execution Design and implement usability, quality, and delivery of a product or feature Plan releases and upgrades with no impacts to business Hands-on expertise in driving quality and best-in-class Agile engineering practices Encourage and motivate the product team to deliver innovative and exciting solutions with an appropriate sense of urgency Manages progress of work and addresses production issues during sprints Communication with partners to make sure goals are clear and the vision is aligned with business objectives Direct management and staff development of team members What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree and 8 to 10 years of Information Systems experience OR Bachelor’s degree and 10 to 14 years of Information Systems experience OR Diploma and 14 to 18 years of Information Systems experience Thorough understanding of modern web application development and delivery, Gen AI application development, data integration and enterprise data fabric concepts, methodologies, and technologies, e.g. AWS technologies, Databricks Demonstrated experience in building strong teams with consistent practices. Demonstrated experience in navigating matrix organizations and leading change. Prior experience writing business case documents and securing funding for product team delivery; financial/spend management for small to medium product teams is a plus. In-depth knowledge of Agile process and principles. Define success metrics for developer productivity; on a monthly/quarterly basis analyze how the product team is performing against established KPIs.
Functional Skills: Leadership: Influences through Collaboration: Builds direct and behind-the-scenes support for ideas by working collaboratively with others. Strategic Thinking: Anticipates downstream consequences and tailors influencing strategies to achieve positive outcomes. Transparent Decision-Making: Clearly articulates the rationale behind decisions and their potential implications, continuously reflecting on successes and failures to enhance performance and decision-making. Adaptive Leadership: Recognizes the need for change and actively participates in technical strategy planning. Preferred Qualifications: Strong influencing skills; able to influence stakeholders and balance priorities. Prior experience in vendor management. Prior hands-on experience leading full stack development using infrastructure cloud services (AWS preferred) and cloud-native tools and design patterns (Containers, Serverless, Docker, etc.) Experience with developing solutions on AWS technologies such as S3, EMR, Spark, Athena, Redshift and others Familiarity with cloud security (AWS / Azure / GCP) Conceptual understanding of DevOps tools (Ansible / Chef / Puppet / Docker / Jenkins) Professional Certifications AWS Certified Solutions Architect (preferred) Certified DevOps Engineer (preferred) Certified Agile Leader or similar (preferred) Soft Skills: Strong desire for continuous learning to pick up new tools/technologies. High attention to detail is essential, with critical thinking ability. Should be an active contributor on technological communities/forums Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking and analysis skills and best practices. Influences and energizes others toward the common vision and goal.
Maintains excitement for a process and drives to new directions of meeting the goal even when odds and setbacks render one path impassable Established habit of proactive thinking and behavior and the desire and ability to self-start/learn and apply new technologies Excellent organizational and time-management skills Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills. Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. 
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position: Data Analyst for Pure Storage Experience: 5+ years Work Mode: Hybrid, Bangalore initially Note: Candidate should have hands-on experience with live projects. A live project means an actual SDLC project, not support/maintenance work. We need a data analyst with Power BI, with QlikView or Tableau hands-on experience. Responsibilities: Perform advanced analytics like cohort analysis, scenario analysis, time series analysis, and predictive analysis, creating powerful visualizations to communicate the results. Articulate assumptions, analyses, and interpretations of data in a variety of modes. Design data models that define how the tables, columns, and data elements from different sources are connected and stored, based on our reporting and analytics requirements. Work closely with BI engineers to develop efficient, highly performant, scalable reporting and analytics solutions. Query data from warehouses like Snowflake using SQL. Validate and QA data to ensure consistent data accuracy and quality. Troubleshoot data issues and conduct root cause analysis when reporting data is in question. Requirements 5+ years relevant experience in Data Analytics, BI Analytics, or BI Engineering, preferably with a globally recognized organization. Expert-level skills in writing complex SQL queries to create views in warehouses like Snowflake, Redshift, SQL Server, Oracle, and BigQuery. Advanced skills in designing and creating data models and dashboards in BI tools like Tableau, Domo, Looker, etc. Intermediate level skills in analytical tools like Excel, Google Sheets, or Power BI (complex formulas, lookups, pivots, etc.) Bachelor’s/Advanced degree in Data Analytics, Data Science, Information Systems, Computer Science, Applied Math, Statistics, or a similar field of study. Willingness to work with internal team members and stakeholders in other time zones.
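To make "cohort analysis" from the responsibilities above concrete, here is a minimal sketch with made-up signup and activity data (all user IDs and months are hypothetical); in practice the inputs would come from a warehouse such as Snowflake via SQL, and the output would feed a dashboard:

```python
from collections import defaultdict

# Hypothetical inputs: each user's signup month, and (user, active_month)
# activity events pulled from a warehouse query.
signup = {"u1": "2024-01", "u2": "2024-01", "u3": "2024-02"}
activity = [("u1", "2024-01"), ("u1", "2024-02"),
            ("u2", "2024-01"), ("u3", "2024-02")]

# Group users into (signup_month, active_month) cells of the cohort table.
cohorts = defaultdict(set)
for user, month in activity:
    cohorts[(signup[user], month)].add(user)

# Count distinct active users per cell -- the classic cohort retention grid.
retention = {cell: len(users) for cell, users in sorted(cohorts.items())}
print(retention)
# {('2024-01', '2024-01'): 2, ('2024-01', '2024-02'): 1, ('2024-02', '2024-02'): 1}
```

The same grouping is usually expressed as a SQL `GROUP BY signup_month, active_month` with `COUNT(DISTINCT user_id)`; the Python version just makes the mechanics explicit.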
Interested candidates can connect with me at https://www.linkedin.com/in/amit-mukherjee-head-talent-acquisition-professional/ or can share their resume at amitm@acrocorp.com
Posted 1 week ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform Experience in developing streaming pipelines Experience working with Hadoop / AWS ecosystem components to implement scalable solutions to meet the ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and other cloud computing services Preferred Education Bachelor's Degree Required Technical And Professional Expertise Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; Minimum 3 years of experience on Cloud Data Platforms on AWS; Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB Good to excellent SQL skills Exposure to streaming solutions and message brokers like Kafka Preferred Technical And Professional Experience Certification in AWS and Databricks, or Cloudera Spark Certified Developer
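The ingest → process → transform pipeline pattern this role describes can be sketched without a cluster. The pure-Python example below (a stand-in for the Spark/PySpark version; the raw event records and field names are invented for illustration) chains the same stages over iterators, processing records one at a time as a small "stream":

```python
import json

# Hypothetical raw event lines, standing in for files/streams that a
# Spark job would ingest.
RAW = [
    '{"user": "a", "amount": "10"}',
    '{"user": "b", "amount": "oops"}',   # malformed value, to be filtered
    '{"user": "a", "amount": "5"}',
]

def ingest(lines):
    """Parse each raw line into a record (Spark: spark.read / readStream)."""
    for line in lines:
        yield json.loads(line)

def transform(records):
    """Cast and filter records (Spark: DataFrame select/filter)."""
    for rec in records:
        try:
            yield {"user": rec["user"], "amount": float(rec["amount"])}
        except ValueError:
            continue  # drop records whose amount does not parse

def load(records):
    """Aggregate per user (Spark: groupBy().sum(), then write to a sink)."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

totals = load(transform(ingest(RAW)))
print(totals)  # {'a': 15.0} -- the malformed record for 'b' was dropped
```

In Spark the same three stages become lazy DataFrame operations executed across partitions; the generator chain here mirrors that laziness on a single machine.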
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
What you will do
In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations.
- Collaborate with Business SMEs, Data Engineers, and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Conversational AI product team
- Become a domain authority in Conversational AI technology capabilities by researching, deploying, and sustaining features built according to Amgen's Quality System
- Lead the voice-of-the-customer assessment to define business processes and product needs
- Work with Product Managers and customers to define scope and value for new developments
- Collaborate with Engineering and Product Management to prioritize release scopes and refine the product backlog
- Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well defined
- Work closely with UX to align technical requirements, scenarios, and business process maps with User Experience designs
- Develop and implement effective product demonstrations for internal and external partners
- Maintain accurate documentation of configurations, processes, and changes
- Understand end-to-end data pipeline design and dataflow
- Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team
- Implement and supervise the performance of Extract, Transform, and Load (ETL) jobs
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
We are seeking a highly skilled and experienced Specialist IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders.
Basic Qualifications:
- Master's degree with 4-6 years of experience in Information Systems, or
- Bachelor's degree with 6-8 years of experience in Information Systems, or
- Diploma with 10-12 years of experience in Information Systems
- Experience writing user requirements and acceptance criteria
- Affinity for working in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable
Preferred Qualifications:
Must-Have Skills:
- Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience writing user requirements and acceptance criteria in agile project management systems such as JIRA
- Experience managing product features for PI planning and developing product roadmaps and user journeys
Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) on AWS or similar cloud platforms
- Experience with design patterns, data structures, and test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis
Posted 1 week ago
The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, Amazon Web Services' managed data warehouse service, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with Redshift expertise can find opportunities across a wide range of industries in the country.
The average salary range for Redshift professionals in India varies with experience and location. Entry-level positions can expect INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.
In the Redshift field, a typical career path may include roles such as:
- Junior Developer
- Data Engineer
- Senior Data Engineer
- Tech Lead
- Data Architect
Apart from Redshift expertise, proficiency in the following skills is beneficial:
- SQL
- ETL tools
- Data modeling
- Cloud computing (AWS)
- Python/R programming
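Of the skills listed above, SQL is the one exercised constantly in Redshift work, since Redshift speaks largely PostgreSQL-compatible SQL. A typical reporting query can be sketched using Python's built-in sqlite3 as a stand-in warehouse; Redshift-specific table design features such as DISTKEY and SORTKEY are omitted, and the tables and data are invented for illustration.

```python
import sqlite3

# In-memory database standing in for a Redshift cluster.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (sale_id INTEGER, product_id INTEGER, amount REAL);
CREATE TABLE products (product_id INTEGER, category TEXT);
INSERT INTO sales VALUES (1, 10, 99.0), (2, 11, 25.0), (3, 10, 50.0);
INSERT INTO products VALUES (10, 'electronics'), (11, 'books');
""")

# A typical reporting query: revenue per category, highest first.
query = """
SELECT p.category, SUM(s.amount) AS revenue
FROM sales s
JOIN products p ON p.product_id = s.product_id
GROUP BY p.category
ORDER BY revenue DESC
"""
rows = list(conn.execute(query))
for category, revenue in rows:
    print(category, revenue)
```

Joins, aggregation, and ordering like this transfer directly to Redshift; what changes at scale is the physical layout (distribution and sort keys) chosen during data modeling.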
As demand for Redshift professionals continues to rise in India, job seekers should keep honing their skills in this area to stay competitive. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!