
1802 Redshift Jobs - Page 31

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Role: Power BI Analyst
Location: On-site
Employment Type: Full-time

Role Summary:
We are seeking an experienced Senior BI Analyst to join our data analytics team, with a strong focus on migrating legacy Qlik dashboards to Power BI. This role requires deep expertise in Power BI and SQL, and preferably experience in the healthcare domain. Familiarity with Snowflake as a data warehouse platform is a strong plus.

Key Responsibilities:
- Lead the migration of dashboards and reports from QlikView/Qlik Sense to Power BI, ensuring consistency in data logic, design, and user experience.
- Design, build, and optimize scalable, interactive Power BI dashboards to support key business decisions.
- Write complex SQL queries for data extraction, transformation, and validation.
- Collaborate with business users, analysts, and data engineers to gather requirements and deliver analytics solutions.
- Leverage data modeling and DAX to build robust and reusable datasets in Power BI.
- Perform data validation and QA to ensure accuracy during and after migration (see the sketch after this posting).
- Work closely with Snowflake-based datasets or assist in transitioning data sources to Snowflake where applicable.
- Translate healthcare data metrics into actionable insights and visualizations.

Required Skills:
- 4+ years of experience in Business Intelligence or Data Analytics roles
- Strong expertise in Power BI, including DAX, Power Query, custom visuals, and row-level security
- Hands-on experience with QlikView or Qlik Sense, especially in migration scenarios
- Advanced proficiency in SQL: complex joins, performance tuning, and stored procedures
- Exposure to Snowflake or similar cloud data platforms (e.g., Redshift, BigQuery)
- Experience working with healthcare datasets (claims, clinical, EMR/EHR data, etc.) is a strong advantage
- Strong analytical and problem-solving mindset
- Effective communication and stakeholder management skills
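The migration QA this role describes comes down to reconciling figures between the legacy Qlik output and the SQL source that will feed Power BI. Below is a minimal sketch of such a check in Python with pandas; the connection string, table, and column names (claims_summary, claim_amount, the Qlik CSV export) are hypothetical illustrations, not part of the posting.

```python
# Minimal sketch: cross-check aggregates between a legacy Qlik extract and
# the warehouse query that Power BI will use. All names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("snowflake://user:pass@account/db/schema")  # placeholder DSN

# Aggregate straight from the warehouse (what the Power BI dataset will see).
warehouse = pd.read_sql(
    "SELECT region, SUM(claim_amount) AS total FROM claims_summary GROUP BY region",
    engine,
)

# The same figures exported from the legacy Qlik dashboard.
legacy = pd.read_csv("qlik_export_claims.csv")

# Flag any region whose totals drift beyond a small tolerance.
merged = warehouse.merge(legacy, on="region", suffixes=("_new", "_old"))
mismatches = merged[(merged["total_new"] - merged["total_old"]).abs() > 0.01]
print(mismatches if not mismatches.empty else "All regions reconcile.")
```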

Posted 1 week ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Minimum Required Experience: 2 years

Skills: Data Pipelines, Databases, Problem Solving & Coding, Architecture, Data Warehouse

Description:
We are looking for an experienced Data Engineer with experience building large-scale data pipelines and data lake ecosystems. Our daily work involves solving interesting and exciting problems against high engineering standards. Although you will be part of the backend team, you will work with cross-functional teams across the organization. This role demands strong hands-on skills in multiple programming languages, especially Python, and knowledge of technologies like Kafka, AWS Glue, CloudFormation, and ECS (a minimal Kafka consumer sketch follows this posting). You will spend most of your time facilitating seamless streaming, tracking, and collation of huge data sets. This is a back-end role, but not limited to it. You will work closely with producers and consumers of the data and build optimal solutions for the organization. We value patience and a deep understanding of data. Also, we believe in extreme ownership!

● Design and build systems to efficiently move data across multiple systems and make it available for teams like Data Science, Data Analytics, and Product.
● Design, construct, test, and maintain data management systems.
● Understand the data and business metrics required by the product and architect systems that make that data available in a usable/queryable manner.
● Ensure that all systems meet business/company requirements as well as industry best practices.
● Keep abreast of new technologies in our domain.
● Recommend ways to constantly improve data reliability and quality.

● Bachelors/Masters, preferably in Computer Science or a related technical field.
● 2-5 years of relevant experience.
● Deep knowledge of and working experience with the Kafka ecosystem.
● Good programming experience, preferably in Python, Java, or Go, and a willingness to learn more.
● Experience working with large-scale data platforms.
● Strong knowledge of microservices, data warehouse, and data lake systems in the cloud, especially AWS Redshift, S3, and Glue.
● Strong hands-on experience writing complex and efficient ETL jobs.
● Experience with version management systems (preferably Git).
● Strong analytical thinking and communication.
● Passion for finding and sharing best practices and driving discipline for superior data quality and integrity.
● Intellectual curiosity to find new and unusual ways to solve data management issues.
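As a concrete illustration of the streaming-to-lake work described above, here is a minimal sketch of a Kafka-to-S3 micro-batch loader in Python using kafka-python and boto3; the topic, consumer group, and bucket names are invented for illustration.

```python
# Minimal sketch: consume events from Kafka and land micro-batches in S3.
# Topic, group, and bucket names are hypothetical.
import json
import boto3
from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "clickstream-events",                      # hypothetical topic
    bootstrap_servers="broker:9092",
    group_id="data-lake-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 1000:                     # flush in micro-batches
        key = f"raw/clickstream/offset={message.offset}.json"
        s3.put_object(Bucket="my-data-lake", Key=key,
                      Body="\n".join(json.dumps(r) for r in batch))
        batch = []
```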

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Lead Data Engineer
Location: Pune (On-Site)
Experience Level: 8+ Years
Employment Type: Full-time

Job Summary:
We are seeking a highly skilled Lead Data Engineer with a strong background in Python, Apache Spark, and Apache Airflow to join our growing data team. You will lead the design, development, and deployment of scalable data pipelines and systems, ensuring high data quality and reliability for advanced analytics, AI/ML, and business intelligence solutions.

Key Responsibilities:
- Lead the end-to-end development of scalable, high-performance data pipelines using Python, Apache Spark, and Apache Airflow (a minimal DAG sketch follows this posting).
- Collaborate with data scientists, analysts, and other engineering teams to define data architecture and infrastructure strategies.
- Develop and maintain ETL/ELT workflows to ingest data from diverse structured and unstructured sources.
- Ensure data quality, governance, lineage, and observability across the data ecosystem.
- Optimize Spark jobs for performance and cost-efficiency on distributed systems.
- Design and implement data models, data lakes, and data warehouses (preferably on AWS, GCP, or Azure).
- Mentor and guide junior engineers on coding best practices, architectural decisions, and performance tuning.
- Monitor production workflows, troubleshoot failures, and implement preventive measures for data reliability.
- Drive the adoption of best practices in data engineering, code review, CI/CD, and infrastructure as code.

Required Qualifications:
- 8+ years of experience in software/data engineering with a strong focus on data pipeline development.
- Expertise in Python programming for data processing, scripting, and orchestration.
- Hands-on experience with Apache Spark (PySpark, Spark SQL) for distributed data processing.
- Strong knowledge of Apache Airflow for workflow orchestration and scheduling.
- Proficiency with SQL and experience working with relational and NoSQL databases.
- Experience working with cloud platforms (AWS, Azure, or GCP) and tools like S3, Redshift, BigQuery, Databricks, etc.
- Experience with CI/CD pipelines, Docker, and version control (e.g., Git).
- Strong analytical and problem-solving skills with attention to detail.

Preferred Qualifications:
- Experience with data lakehouse architecture and tools like Delta Lake, Iceberg, or Hudi.
- Familiarity with Kafka, Snowflake, or dbt is a plus.
- Exposure to data governance tools and frameworks.
- Understanding of ML pipelines, feature stores, or MLOps is a bonus.
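To make the Spark-plus-Airflow orchestration concrete, here is a minimal Airflow 2.x-style DAG that submits a PySpark job through the Spark provider; the DAG id, script path, and connection id are hypothetical.

```python
# Minimal sketch: a daily Spark job orchestrated by Airflow.
# DAG id, application path, and connection id are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = SparkSubmitOperator(
        task_id="transform_sales",
        application="/opt/jobs/transform_sales.py",  # hypothetical PySpark script
        conn_id="spark_default",
        conf={"spark.sql.shuffle.partitions": "200"},
    )
```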

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Responsibilities
- Designs and establishes secure and performant data architectures, enhancements, updates, and programming changes for portions and subsystems of data pipelines, repositories, or models for structured/unstructured data.
- Analyzes design and determines coding, programming, and integration activities required based on general objectives and knowledge of the overall architecture of the product or solution.
- Writes and executes complete testing plans, protocols, and documentation for the assigned portion of the data system or component; identifies and debugs issues with code and integration into the data system architecture, and creates solutions.
- Collaborates and communicates with the project team regarding project progress and issue resolution.
- Represents the data engineering team in all phases of larger and more complex development projects.
- Provides guidance and mentoring to less experienced staff members.

Knowledge & Skills
- Using data engineering tools, languages, and frameworks to mine, cleanse, and explore data.
- Fluent in NoSQL and relational systems.
- Fluent in complex, distributed, and massively parallel systems.
- Strong analytical and problem-solving skills with the ability to represent complex algorithms in software.
- Designing data systems/solutions to manage complex data.
- Strong understanding of database technologies and management systems.
- Strong understanding of cloud-based systems/services.
- Database architecture testing methodology, including execution of test plans, debugging, and testing scripts and tools.
- Excellent written and verbal communication skills; mastery of English and the local language.
- Ability to effectively communicate product architectures and design proposals, and negotiate options at management levels.
- Expertise in Spark, Python, Scala, and the AWS tech stack (a minimal PySpark sketch follows this posting).
- Databases: Redshift, MongoDB, EMR.
- Proficient in deploying artifacts using CI/CD.
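For the Spark-on-AWS stack this posting lists, here is a minimal PySpark cleanse step; the S3 paths and column names are hypothetical, and the exact path scheme (s3 vs. s3a) depends on the runtime.

```python
# Minimal sketch: cleanse semi-structured orders data with PySpark and write
# columnar output for downstream warehouse queries. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-cleanse").getOrCreate()

orders = spark.read.json("s3://raw-zone/orders/")          # semi-structured input
clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)                           # drop invalid rows
)

# Partitioned Parquet keeps scans cheap for Redshift Spectrum / EMR consumers.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/orders/"
)
```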

Posted 1 week ago

Apply

7.0 years

0 Lacs

India

On-site


Job Summary
We are seeking an experienced and highly skilled Senior Data Engineer to design, build, and maintain robust, scalable data infrastructure. You will play a key role in architecting data solutions, optimizing pipelines, ensuring data quality, and enabling data-driven decision-making across the organization. This role is ideal for someone who is passionate about data engineering, has strong problem-solving abilities, and can mentor junior team members while collaborating with cross-functional teams.

Key Responsibilities
- Design and implement scalable and reliable data pipelines (batch and streaming) using technologies such as Spark, Kafka, and Airflow (see the streaming sketch after this posting).
- Architect and maintain data warehouse/lakehouse solutions (e.g., Snowflake, BigQuery, Redshift, or Databricks).
- Build ETL/ELT workflows to ingest, transform, and validate data from various internal and external sources.
- Collaborate with data scientists, analysts, and software engineers to ensure data needs are met efficiently and securely.
- Optimize database performance and data workflows for reliability and low latency.
- Ensure data quality, governance, lineage, and security through best practices and tooling.
- Mentor junior engineers and contribute to engineering best practices, code reviews, and process improvements.
- Work with DevOps or Platform teams to automate deployment, monitoring, and alerting of data infrastructure.
- Maintain documentation of data architecture, systems, and processes.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 7+ years of experience in Data Engineering or a related role.
- Strong proficiency in SQL and at least one programming language such as Python, Scala, or Java.
- Deep experience with data pipeline orchestration tools (e.g., Apache Airflow, Luigi, dbt).
- Hands-on experience with cloud platforms like AWS, GCP, or Azure, particularly with services like S3, Lambda, EMR, Glue, BigQuery, or Snowflake.
- Solid understanding of data modeling, data warehousing, and distributed computing systems (e.g., Hadoop, Spark).
- Experience with CI/CD practices and tools (e.g., GitHub Actions, Jenkins, Terraform).
- Strong understanding of data governance, including data privacy, compliance, and security best practices.
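Here is a minimal sketch of the streaming side of such a pipeline (Kafka into a lake via Spark Structured Streaming); broker, topic, and paths are hypothetical, and it assumes the spark-sql-kafka package is available on the cluster.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Parquet landing zone.
# Broker, topic, and S3 paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "user-events")
    .load()
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://lake/bronze/user_events/")
    .option("checkpointLocation", "s3://lake/_checkpoints/user_events/")
    .trigger(processingTime="1 minute")   # micro-batch cadence
    .start()
)
query.awaitTermination()
```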

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site


- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, Chi-squared; see the sketch after this posting)
- Experience with a scripting language (e.g., Python, Java, or R)

Amazon has an exciting opportunity for a Business Intelligence Engineer to join our online retail team. The Retail team operates as a merchant in Amazon; it owns functions like merchandising, marketing, inventory management, vendor management, and program management as core functions. In this pivotal role, you'll support these functions with business intelligence you derive from our vast array of data and will play a role in the long-term growth and success of Amazon in the APAC region. You will work with stakeholders from the Pricing Program to contribute to Amazon's pricing strategies, partner with vendor and inventory managers to help improve product cost structures, and support the marketing team in building their strategies using extremely large volumes of complex data. You will explore datasets, write complex SQL queries, and build data pipelines and data visualization solutions with AWS QuickSight. You will also build new machine learning models to predict the outcomes of key inputs.

Key job responsibilities
As a BI Engineer in the APAC Retail BI team, you will build constructive partnerships with key stakeholders that enable your business understanding and ability to develop true business insights and recommendations. You'll have the opportunity to work with other BI experts locally and internationally to identify, learn, and develop best practices, always applying a data-driven approach. Amazon is widely known for our obsession over customers. In this role your stakeholders will be counting on you to help us understand customer behaviour and improve our offerings. This role does include periodic reporting responsibilities, but it's really much more diverse than that. If this role is right for you, you will enjoy the challenge of pivoting between ad-hoc pieces of analysis, reporting enhancements, and new builds, as well as working on long-term strategic projects to enhance the BI & Analytics capabilities in Amazon.

Preferred qualifications:
- Master's degree or advanced technical degree
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
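The statistical methods named above (e.g., the t-test) are a one-liner in Python. Here is a minimal sketch applied to an A/B-style pricing question; the data values are purely illustrative.

```python
# Minimal sketch: two-sample t-test on hypothetical daily conversion rates.
from scipy import stats

conversion_a = [0.051, 0.048, 0.055, 0.049, 0.052]   # illustrative control values
conversion_b = [0.058, 0.061, 0.057, 0.060, 0.059]   # illustrative treatment values

t_stat, p_value = stats.ttest_ind(conversion_a, conversion_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the observed difference between the
# two groups is unlikely to be explained by chance alone.
```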

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Support the full data engineering lifecycle, including research, proofs of concept, design, development, testing, deployment, and maintenance of data management solutions
- Utilize knowledge of various data management technologies to drive data engineering projects
- Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record, hydrate the client data warehouse, and power analytics across numerous health care domains
- Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
- Eliminate unwarranted complexity and unneeded interdependencies
- Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges (see the sketch after this posting)
- Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
- Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
- Effectively create data transformations that address business requirements and other constraints
- Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
- Support the implementation of a modern data framework that facilitates business intelligence reporting and advanced analytics
- Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation, and data movement
- Leverage DevOps tools to enable code versioning and code deployment
- Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
- Leverage processes and diagnostic tools to troubleshoot, maintain, and optimize solutions and respond to customer and production issues
- Continuously support technical debt reduction, process transformation, and overall optimization
- Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
- Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or reassignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Epic certifications in one or more of the following modules: Caboodle, EpicCare, Grand Central, Healthy Planet, HIM, Prelude, Resolute, Tapestry, or Reporting Workbench
- 4+ years of experience creating source-to-target mappings and ETL designs for integration of new/modified data streams into the data warehouse/data marts
- 2+ years of experience with Cerner Millennium / HealtheIntent and experience using Cerner CCL
- 2+ years of experience working with Health Catalyst product offerings, including data warehousing solutions, knowledge bases, and analytics solutions
- Experience in Unix, PowerShell, or other batch scripting languages
- Depth of experience and a proven track record creating and maintaining sophisticated data frameworks for healthcare organizations
- Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g., Power BI, Qlik, Tableau, MicroStrategy, etc.)
- Experience supporting analytical capabilities inclusive of reporting, dashboards, extracts, BI tools, analytical web applications, and other similar products
- Experience contributing to cross-functional efforts with proven success in creating healthcare insights
- Experience and credibility interacting with analytics and technology leadership teams
- Exposure to Azure, AWS, or Google Cloud ecosystems
- Exposure to Amazon Redshift, Amazon S3, Hadoop HDFS, Azure Blob, or similar big data storage and management components
- Desire to continuously learn and seek new options and approaches to business challenges
- A willingness to leverage best practices, share knowledge, and improve the collective work of the team
- Ability to effectively communicate concepts verbally and in writing
- Willingness to support limited travel, up to 10%

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
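The data-quality detection and audit work described above often starts with simple automated checks on staged data. Below is a minimal pandas sketch; the staging file and column names (member_id, claim_id, billed_amount, service_date) are hypothetical.

```python
# Minimal sketch: pre-load data-quality checks on a staged claims extract.
# File path and column names are hypothetical.
import pandas as pd

claims = pd.read_parquet("staging/claims.parquet")

issues = {
    "null_member_ids": int(claims["member_id"].isna().sum()),
    "duplicate_claims": int(claims.duplicated(subset=["claim_id"]).sum()),
    "negative_amounts": int((claims["billed_amount"] < 0).sum()),
    "future_service_dates": int(
        (pd.to_datetime(claims["service_date"]) > pd.Timestamp.today()).sum()
    ),
}
for check, count in issues.items():
    print(f"{check}: {count}")
# Non-zero counts would feed a data audit log and block the load until the
# root cause is identified and fixed.
```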

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Working with Us
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible.

Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Position: Senior Data Engineer
Location: Hyderabad, India

At Bristol Myers Squibb, we are inspired by a single vision - transforming patients' lives through science. In oncology, hematology, immunology, and cardiovascular disease - and one of the most diverse and promising pipelines in the industry - each of our passionate colleagues contribute to innovations that drive meaningful change. We bring a human touch to every treatment we pioneer. Join us and make a difference.

Position Summary
At BMS, digital innovation and Information Technology are central to our vision of transforming patients' lives through science. To accelerate our ability to serve patients around the world, we must unleash the power of technology. We are committed to being at the forefront of transforming the way medicine is made and delivered by harnessing the power of computer and data science, artificial intelligence, and other technologies to promote scientific discovery, faster decision making, and enhanced patient care. If you want an exciting and rewarding career that is meaningful, consider joining our diverse team!

As a Data Engineer based out of our BMS Hyderabad site, you are part of the Data Platform team, supporting the larger data engineering community that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data.

Key Responsibilities
- Design, build, and maintain ETL pipelines and data products, drive their evolution, and utilize the most suitable data architecture for our organization's data needs.
- Deliver high-quality data products and analytics-ready data solutions.
- Work with an end-to-end ownership mindset; innovate and drive initiatives through completion.
- Develop and maintain data models to support our reporting and analysis needs.
- Optimize data storage and retrieval to ensure efficient performance and scalability.
- Collaborate with data architects, data analysts and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements.
- Ensure data quality and integrity through data validation and testing.
- Implement and maintain security protocols to protect sensitive data.
- Stay up-to-date with emerging trends and technologies in data engineering and analytics.
- Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams and the Data Community lead to shape and adopt data and technology strategy.
- Serve as the Subject Matter Expert on Data & Analytics Solutions; stay knowledgeable about evolving trends in data platforms and product-based implementation.
- Mentor other team members effectively to unlock their full potential.
- Be comfortable working in a fast-paced environment with minimal oversight.

Qualifications & Experience
- 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment.
- Breadth of experience in technology capabilities spanning the full life cycle of data management, including data lakehouses, master/reference data management, data quality, and analytics/AI-ML.
- In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem (a minimal Glue job sketch follows this posting).
- Hands-on experience developing and delivering data and ETL solutions with technologies like AWS data services (Redshift, Athena, Lake Formation, etc.); Cloudera Data Platform and Tableau experience is a plus.
- 5+ years of experience in data engineering or software development.
- Ability to create and maintain optimal data pipeline architecture and assemble large, complex data sets that meet functional and non-functional business requirements.
- Ability to identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Strong programming skills in languages and libraries such as Python, R, PyTorch, PySpark, Pandas, Scala, etc.
- Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc.
- Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Functional knowledge of, or prior experience in, the Life Sciences Research and Development domain is a plus.
- Experience and expertise in establishing agile, product-oriented teams that work effectively with teams in the US and other global BMS sites.
- Prior experience working in an Agile/product-based environment.
- Initiates challenging opportunities that build strong capabilities for self and team.
- Demonstrates a focus on improving processes, structures, and knowledge within the team; leads in analyzing current states, delivers strong recommendations for understanding complexity in the environment, and executes to bring complex solutions to completion.

Why You Should Apply
Around the world, we are passionate about making an impact on the lives of patients with serious diseases. Empowered to apply our individual talents and diverse perspectives in an inclusive culture, our shared values of passion, innovation, urgency, accountability, inclusion, and integrity bring out the highest potential of each of our colleagues.

Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives.

Our company is committed to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace adjustments and ongoing support in their roles. Applicants can request an accommodation prior to accepting a job offer. If you require reasonable accommodation in completing this application, or any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers
With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol
BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters.

BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/

Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
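Since the role centers on AWS Glue, here is a minimal Glue PySpark job skeleton of the kind such pipelines use; the catalog database, table, and bucket names are hypothetical.

```python
# Minimal sketch: an AWS Glue PySpark job reading from the Glue Data Catalog,
# deduplicating, and writing curated Parquet. Catalog and bucket names are
# hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read via the catalog so schema evolution is handled centrally.
source = glue_context.create_dynamic_frame.from_catalog(
    database="clinical_raw", table_name="trial_results"   # hypothetical entries
)
df = source.toDF().dropDuplicates(["subject_id", "visit_id"])

# Curated zone consumable by Redshift Spectrum / Athena.
df.write.mode("overwrite").parquet("s3://curated-zone/trial_results/")
job.commit()
```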

Posted 1 week ago

Apply

10.0 years

0 Lacs

Mysore, Karnataka, India

On-site


Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have the most seamless experience possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to do that by providing the brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store.

We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do
- Think like our customers: you will work with product and engineering leaders to define data solutions that support customers' business practices.
- Design/develop/extend our data pipeline services and architecture to implement your solutions: you will be collaborating on some of the most important and complex parts of our system that form the foundation for the business value our organization provides.
- Foster team growth: provide mentorship to junior team members and evangelize expertise to those on other teams.
- Improve the quality of our solutions: help to build enduring trust within our organization and amongst our customers by ensuring high quality standards of the data we manage.
- Own your work: you will take responsibility to shepherd your projects from idea through delivery into production.
- Bring new ideas to the table: some of our best innovations originate within the team.

Technologies We Use
- Languages: SQL, Python
- Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
- Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
- Others: Tableau (as a business intelligence solution)

Qualifications
- Bachelor's/Master's degree in Computer Science or a relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python and SQL
- Strong proficiency working with data processing technologies such as Spark, Flink, and Airflow
- Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Snowflake, etc.)
- Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
- Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of columnar/row-oriented data structures (e.g., Parquet, ORC, Avro; see the sketch after this posting)
- Solid understanding of Apache Iceberg or other open table formats
- Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
- Solid understanding of AWS, Linux, and infrastructure concepts
- Proven ability to diagnose and address data abnormalities in systems
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Experience building data warehouses using conformed dimensional models
- Experience building data lakes and/or leveraging data lake solutions (e.g., Trino, Dremio, Druid, etc.)
- Experience working with business intelligence solutions (e.g., Tableau)
- Experience working with ML/agentic AI pipelines (e.g., LangChain, LlamaIndex, etc.)
- Understanding of Domain-Driven Design concepts and the accompanying microservice architecture
- Passion for data, analytics, or machine learning
- Focus on value: shipping software that matters to the company and the customer

Bonus Points
- Experience working with vector databases
- Experience working within a retail or ecommerce environment
- Proficiency in other programming languages such as Scala, Java, Golang, etc.
- Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility
- Provide mentorship to team members on adopted patterns and best practices.
- Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
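To illustrate the columnar formats named above, here is a minimal PyArrow sketch that writes a Parquet file and reads back only selected columns; the field names are illustrative.

```python
# Minimal sketch: write a Parquet file and read back a column subset with
# PyArrow. Field names are illustrative.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "product_id": [101, 102, 103],
    "site": ["acme.com", "acme.com", "shopco.com"],
    "price": [19.99, 5.49, 12.00],
})
pq.write_table(table, "prices.parquet")

# Column pruning: only the requested columns are deserialized, which is what
# makes columnar layouts cheap to scan at analytics scale.
prices = pq.read_table("prices.parquet", columns=["product_id", "price"])
print(prices.to_pydict())
```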

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Embark on a transformative journey as a Software Engineer – Integration (Linux) at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

Operational Support Systems (OSS) Platform Engineering is a new team within the newly formed Network OSS & Tools functional unit in the Network Product domain at Barclays. The Barclays OSS Platform Engineering team is responsible for the design, build and run of the underlying OSS infrastructure and toolchain, across cloud and on-prem, on which the core systems and tools required to run the Barclays Global Network reside.

Skills
To be successful in this role as a Linux-focused Software Engineer – Integration, you should possess the following skillsets:
- Strong Linux proficiency and expertise with containerization and Kubernetes, with programming expertise in one of the high-level languages like Python, Java or Golang, and NetDevOps automation.
- Hands-on expertise with IaC, cloud platforms, CI/CD pipelines for data, containerization and orchestration, and SRE principles.
- Strong knowledge and demonstrable hands-on experience with middleware technologies (Kafka, API gateways, etc.) and data engineering tools/frameworks like Apache Spark, Airflow, Flink and the Hadoop ecosystem.

Some other highly valued skills include:
- Expertise building ELT pipelines and cloud/storage integrations - data lake/warehouse integrations (Redshift, BigQuery, Snowflake, etc.).
- Solid understanding of DevOps tooling, GitOps, CI/CD, config management, Jenkins, build pipelines and source control systems.
- Working knowledge of cloud infrastructure services: compute, storage, networking, hybrid connectivity, monitoring/logging, security and IAM.
- SRE experience.
- Expertise building and defining KPIs (SLIs/SLOs) using open-source tooling like ELK, Prometheus and various other instrumentation, telemetry and log analytics (a minimal instrumentation sketch follows this posting).

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills.

This role is based in our Pune office.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Vice President Expectations
- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
- OR, for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
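For the SLI/SLO work the posting highlights, here is a minimal Python sketch that exposes request and latency SLIs to Prometheus via prometheus_client; metric names and the port are hypothetical, and the simulated work stands in for a real service loop.

```python
# Minimal sketch: expose SLIs (request count by outcome, latency histogram)
# for Prometheus to scrape. Metric names and the port are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("pipeline_requests_total", "Requests handled", ["outcome"])
LATENCY = Histogram("pipeline_request_seconds", "Request latency")

start_http_server(9100)   # serves /metrics for the Prometheus scraper

while True:
    with LATENCY.time():                          # record latency per request
        time.sleep(random.uniform(0.01, 0.2))     # stand-in for real work
    outcome = "success" if random.random() > 0.01 else "error"
    REQUESTS.labels(outcome=outcome).inc()
# An SLO such as "99.9% success over 30 days" is then expressed in PromQL as
# a ratio over pipeline_requests_total, with alerting on error-budget burn.
```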

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Embark on a transformative journey as a Software Engineer – Full Stack Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

Operational Support Systems (OSS) Platform Engineering is a new team within the newly formed Network OSS & Tools functional unit in the Network Product domain at Barclays. The Barclays OSS Platform Engineering team is responsible for the design, build and run of the underlying OSS infrastructure and toolchain, across cloud and on-prem, on which the core systems and tools required to run the Barclays Global Network reside.

To be successful in this role as a Software Engineer – Full Stack Developer, you should possess the following skillsets:
- Demonstrable expertise across front-end and back-end skillsets.
- Java proficiency and the Spring ecosystem (Spring MVC, Data JPA, Security, etc.) with strong SQL and NoSQL integration expertise.
- React.js and JavaScript expertise: Material UI, Ant Design, and state management expertise (Redux, Zustand or Context API).
- Strong knowledge of runtimes (virtualisation, containers and Kubernetes) and expertise with test-driven development using frameworks like Cypress, Playwright, Selenium, etc.
- Strong knowledge of CI/CD pipelines and tooling: GitHub Actions, Jenkins, GitLab CI or similar.
- Monitoring and observability: logging/tracing and alerting, with knowledge of SRE integrations into open-source tooling like Grafana/ELK, etc.

Some other highly valued skills include:
- Expertise building ELT pipelines and cloud/storage integrations - data lake/warehouse integrations (Redshift, BigQuery, Snowflake, etc.).
- Expertise with security (OAuth2, CSRF/XSS protection), secure coding practice, and performance optimization: JVM tuning, performance profiling, caching, lazy loading, rate limiting (a minimal rate-limiter sketch follows this posting) and high availability with large datasets.
- Expertise in public, private and hybrid cloud technologies (DC, AWS, Azure, GCP, etc.) and across broad network domains (physical and wireless): WAN/SD-WAN/LAN/WLAN, etc.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills.

This role is based in our Pune office.

Purpose of the role
To lead and manage engineering teams, providing technical guidance, mentorship, and support to ensure the delivery of high-quality software solutions, driving technical excellence, fostering a culture of innovation, and collaborating with cross-functional teams to align technical decisions with business objectives.

Accountabilities
- Lead engineering teams effectively, fostering a collaborative and high-performance culture to achieve project goals and meet organizational objectives.
- Oversee timelines, team allocation, risk management and task prioritization to ensure the successful delivery of solutions within scope, time, and budget.
- Mentor and support team members' professional growth, conduct performance reviews, provide actionable feedback, and identify opportunities for improvement.
- Evaluate and enhance engineering processes, tools, and methodologies to increase efficiency, streamline workflows, and optimize team productivity.
- Collaborate with business partners, product managers, designers, and other stakeholders to translate business requirements into technical solutions and ensure a cohesive approach to product development.
- Enforce technology standards, facilitate peer reviews, and implement robust testing practices to ensure the delivery of high-quality solutions.

Vice President Expectations
- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
- OR, for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
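The rate limiting mentioned above is language-agnostic; the role's stack is Java, but for consistency with the other examples on this page here is a minimal token-bucket sketch in Python. All names and parameters are illustrative.

```python
# Minimal, language-agnostic sketch of token-bucket rate limiting (shown in
# Python; the posting's stack is Java/Spring). Parameters are illustrative.
import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate = rate_per_sec          # tokens added per second
        self.capacity = capacity          # burst ceiling
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate_per_sec=5, capacity=10)
print([bucket.allow() for _ in range(12)])  # first ~10 pass, rest throttled
```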

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Embark on a transformative journey as a Software Engineer – Integration (Cloud) at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

Operational Support Systems (OSS) Platform Engineering is a new team within the newly formed Network OSS & Tools functional unit in the Network Product domain at Barclays. The Barclays OSS Platform Engineering team is responsible for the design, build and run of the underlying OSS infrastructure and toolchain, across cloud and on-prem, on which the core systems and tools required to run the Barclays Global Network reside.

To be successful as a Software Engineer – Integration (Cloud), you should possess the following skillsets:
- Deep expertise in cloud platforms (AWS, Azure or GCP), infrastructure design and cost optimization.
- Expertise in containerization and orchestration using Docker and Kubernetes (deployments, service mesh, etc.).
- Hands-on expertise with platform engineering and productization (for consumption by other applications as tenants) of open-source monitoring/logging tools (Prometheus, Grafana, ELK and similar) and cloud-native equivalents.
- Strong knowledge and demonstrable hands-on experience with middleware technologies (Kafka, API gateways, etc.) and data engineering tools/frameworks like Apache Spark, Airflow, Flink and the Hadoop ecosystem.

Some other highly valued skills include:
- Expertise building ELT pipelines and cloud/storage integrations - data lake/warehouse integrations (Redshift, BigQuery, Snowflake, etc.).
- Solid understanding of DevOps tooling, GitOps, CI/CD, config management, Jenkins, build pipelines and source control systems.
- Working knowledge of cloud infrastructure services: compute, storage, networking, hybrid connectivity, monitoring/logging, security and IAM.
- SRE experience.
- Expertise building and defining KPIs (SLIs/SLOs) using open-source tooling like ELK, Prometheus and various other instrumentation, telemetry and log analytics (a minimal error-budget sketch follows this posting).

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills.

This role is based in our Pune office.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Vice President Expectations
- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
- OR, for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
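To complement the SLI instrumentation shown earlier on this page, here is the SLO arithmetic behind an error budget; all figures are illustrative assumptions, not values from the posting.

```python
# Minimal sketch: turn an availability SLO into an error budget and check
# how much of it has been consumed. All figures are illustrative.
SLO = 0.999                      # 99.9% availability target
window_minutes = 30 * 24 * 60    # 30-day rolling window (43,200 minutes)

budget_minutes = (1 - SLO) * window_minutes   # allowed downtime: 43.2 minutes
observed_downtime = 12.5                      # hypothetical measured outage minutes

burn = observed_downtime / budget_minutes
print(f"Error budget: {budget_minutes:.1f} min; consumed: {burn:.0%}")
# Standard SRE practice layers alerting on fast burn rates (e.g. consuming
# more than a few percent of the budget per hour) on top of this calculation.
```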

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Embark on a transformative journey as a Software Engineer – Integration (Linux) at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Operational Support Systems (OSS) Platform Engineering is a new team within the newly formed Network OSS & Tools functional unit in the Network Product domain at Barclays. The Barclays OSS Platform Engineering team is responsible for the design, build and run of the underlying OSS infrastructure and toolchain, across cloud and on-prem, on which the core systems and tools required to run the Barclays Global Network reside.

Skills: To be successful in this role as a Software Engineer – Integration (Linux), you should possess the following skillsets: Strong Linux proficiency and expertise with containerization and Kubernetes, with programming expertise in a high-level language such as Python, Java or Golang, plus NetDevOps automation. Hands-on expertise with IaC, cloud platforms, CI/CD pipelines for data, containerization & orchestration, and SRE principles. Strong knowledge and demonstrable hands-on experience with middleware technologies (Kafka, API gateways, etc.) and data engineering tools/frameworks such as Apache Spark, Airflow, Flink and the Hadoop ecosystem.

Some Other Highly Valued Skills Include: Expertise building ELT pipelines and cloud/storage integrations - data lake/warehouse integrations (Redshift, BigQuery, Snowflake, etc.). Solid understanding of DevOps tooling, GitOps, CI/CD, config management, Jenkins, build pipelines and source control systems. Working knowledge of cloud infrastructure services: compute, storage, networking, hybrid connectivity, monitoring/logging, security and IAM. SRE experience. Expertise building and defining KPIs (SLIs/SLOs) using open-source tooling like ELK, Prometheus and various other instrumentation, telemetry, and log analytics.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in our Pune office.

Purpose of the role: To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities: Development and delivery of high-quality software solutions by using industry-aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance. Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions. Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.
Vice President Expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Alternatively, an individual contributor will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Embark on a transformative journey as a Software Engineer - Full Stack Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Operational Support Systems (OSS) Platform Engineering is a new team within the newly formed Network OSS & Tools functional unit in the Network Product domain at Barclays. The Barclays OSS Platform Engineering team is responsible for the design, build and run of the underlying OSS infrastructure and toolchain, across cloud and on-prem, on which the core systems and tools required to run the Barclays Global Network reside.

To be successful in this role as a Software Engineer - Full Stack Developer, you should possess the following skillsets: Demonstrable expertise with both front-end and back-end skillsets. Java proficiency and the Spring ecosystem (Spring MVC, Data JPA, Security, etc.) with strong SQL and NoSQL integration expertise. React.js and JavaScript expertise: Material UI, Ant Design, and state management expertise (Redux, Zustand or Context API). Strong knowledge of runtimes (virtualisation, containers and Kubernetes) and expertise with test-driven development using frameworks like Cypress, Playwright, Selenium, etc. Strong knowledge of CI/CD pipelines and tooling: GitHub Actions, Jenkins, GitLab CI or similar. Monitoring and observability - logging/tracing and alerting, with knowledge of SRE integrations into open-source tooling like Grafana/ELK, etc.

Some Other Highly Valued Skills Include: Expertise building ELT pipelines and cloud/storage integrations - data lake/warehouse integrations (Redshift, BigQuery, Snowflake, etc.). Expertise with security (OAuth2, CSRF/XSS protection), secure coding practices and performance optimization - JVM tuning, performance profiling, caching, lazy loading, rate limiting and high availability with large datasets. Expertise in public, private and hybrid cloud technologies (DC, AWS, Azure, GCP, etc.) and across broad network domains (physical and wireless) - WAN/SD-WAN/LAN/WLAN, etc.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in our Pune office.

Purpose of the role: To lead and manage engineering teams, providing technical guidance, mentorship, and support to ensure the delivery of high-quality software solutions, driving technical excellence, fostering a culture of innovation, and collaborating with cross-functional teams to align technical decisions with business objectives.

Accountabilities: Lead engineering teams effectively, fostering a collaborative and high-performance culture to achieve project goals and meet organizational objectives. Oversee timelines, team allocation, risk management and task prioritization to ensure the successful delivery of solutions within scope, time, and budget. Mentor and support team members' professional growth, conduct performance reviews, provide actionable feedback, and identify opportunities for improvement. Evaluate and enhance engineering processes, tools, and methodologies to increase efficiency, streamline workflows, and optimize team productivity.
Collaboration with business partners, product managers, designers, and other stakeholders to translate business requirements into technical solutions and ensure a cohesive approach to product development. Enforcement of technology standards, facilitation of peer reviews, and implementation of robust testing practices to ensure the delivery of high-quality solutions.

Vice President Expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Alternatively, an individual contributor will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Description: Amazon is seeking a Business Analyst to join the Abuse Prevention vertical of the India Returns team. The team's vertical is focused on eliminating abuse and misuse associated with customer returns and rejects, thereby improving the India business P&L. This role would drive analytics support for product managers and business intelligence engineers on key strategic priorities. The position represents an exciting opportunity to be a part of a high-paced environment. The ideal candidate will be detail-oriented, well versed in SQL (Python is a plus), and driven to provide insightful and timely data-based insights. The candidate should have strong analytical and communication skills. There is always room to make things better, so this candidate should also have the ability to invent and simplify. Lastly, the candidate should have an ability to work effectively with cross-functional teams.

Key job responsibilities: This person will own the production and delivery of a suite of analytics reports and dashboards used by the team to make key business decisions. This will involve: Building the data structures, transformation processes and load jobs in Redshift, plus data processing and presentation in Excel. Debugging report issues and unblocking workflows. Communicating with the Product team and customers to provide status updates. Publishing detailed automated dashboards. Creating the reports requires extracting and transforming data from tables and loading it into tables with an optimized data structure.

Basic Qualifications: Bachelor's degree in mathematics, engineering, statistics, computer science or a related field. 1+ years of business analysis (dealing with large, complex data) experience. Demonstrated ability with data warehousing, database administration, and database migration. Strong experience in dashboarding using Tableau/Power BI/Excel/PowerPivots. Strong communication skills and a team player. Demonstrated ability to manage and prioritize workload.

Preferred Qualifications: Knowledge of scripting for automation (e.g., VB Script, Python, Perl, Ruby) is a plus.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A3000618
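For a sense of what such a Redshift load job can look like in practice, here is a minimal sketch; the connection details, table names, S3 path, and IAM role are hypothetical placeholders, not Amazon's actual pipeline.

```python
# Minimal sketch of a Redshift ETL job: COPY raw data from S3 into a staging
# table, then transform it into a table optimized for reporting. All names,
# credentials, and the S3 path are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="***",
)
conn.autocommit = True

with conn.cursor() as cur:
    # Bulk-load raw return events from S3 (COPY is Redshift's fast-load path).
    cur.execute("""
        COPY staging.return_events
        FROM 's3://example-bucket/returns/2025/06/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS PARQUET;
    """)
    # Transform and load into a structure suited to dashboard queries.
    cur.execute("""
        INSERT INTO reporting.daily_returns
        SELECT customer_id,
               CAST(event_ts AS DATE) AS event_date,
               COUNT(*) AS returns
        FROM staging.return_events
        GROUP BY customer_id, CAST(event_ts AS DATE);
    """)

conn.close()
```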

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT THE ROLE: We are seeking a highly skilled, hands-on Senior QA & Test Automation Specialist (Test Automation Engineer) with strong experience in data validation, ETL testing, test automation, and QA process ownership. This role combines deep technical execution with a solid foundation in QA best practices, including test planning, defect tracking, and test lifecycle management. You will be responsible for designing and executing manual and automated test strategies for complex real-time and batch data pipelines, contributing to the design of automation frameworks, and ensuring high-quality data delivery across our AWS and Databricks-based analytics platforms. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.

Roles & Responsibilities: Collaborate with the QA Manager to design and implement end-to-end test strategies for data validation, semantic layer testing, and GraphQL API validation. Perform manual validation of data pipelines, including source-to-target data mapping, transformation logic, and business rule verification. Develop and maintain automated data validation scripts using Python and PySpark for both real-time and batch pipelines. Contribute to the design and enhancement of reusable automation frameworks, with components for schema validation, data reconciliation, and anomaly detection. Validate semantic layers (e.g., Looker, dbt models) and GraphQL APIs, ensuring data consistency, compliance with contracts, and alignment with business expectations. Write and manage test plans, test cases, and test data for structured, semi-structured, and unstructured data. Track, manage, and report defects using tools like JIRA, ensuring thorough root cause analysis and timely resolution. Collaborate with Data Engineers, Product Managers, and DevOps teams to integrate tests into CI/CD pipelines and enable shift-left testing practices. Ensure comprehensive test coverage for all aspects of the data lifecycle, including ingestion, transformation, delivery, and consumption. Participate in QA ceremonies (standups, planning, retrospectives) and continuously contribute to improving the QA process and culture. Experience building or maintaining test data generators. Contributions to internal quality dashboards or data observability systems. Awareness of metadata-driven testing approaches and lineage-based validations. Experience working with agile testing methodologies such as Scaled Agile. Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest.

Must-Have Skills: 6-9 years of experience in QA roles, with at least 3+ years of strong exposure to data pipeline testing and ETL validation. Strong in SQL, Python, and optionally PySpark – comfortable with writing complex queries and validation scripts. Practical experience with manual validation of data pipelines and source-to-target testing. Experience in validating GraphQL APIs, semantic layers (Looker, dbt, etc.), and schema/data contract compliance. Familiarity with data integration tools and platforms such as Databricks, AWS Glue, Redshift, Athena, or BigQuery. Strong understanding of test planning, defect tracking, bug lifecycle management, and QA documentation. Experience working in Agile/Scrum environments with standard QA processes. Knowledge of test case and defect management tools (e.g., JIRA, TestRail, Zephyr).
Strong understanding of QA methodologies, test planning, test case design, and defect lifecycle management. Deep hands-on expertise in SQL, Python, and PySpark for testing and automating validation. Proven experience in manual and automated testing of batch and real-time data pipelines. Familiarity with data processing and analytics stacks: Databricks, Spark, AWS (Glue, S3, Athena, Redshift). Experience with bug tracking and test management tools like JIRA, TestRail, or Zephyr. Ability to troubleshoot data issues independently and collaborate with engineering for root cause analysis. Experience integrating automated tests into CI/CD pipelines (e.g., Jenkins, GitHub Actions). Experience validating data from various file formats such as JSON, CSV, Parquet, and Avro. Strong ability to validate and automate data quality checks: schema validation, null checks, duplicates, thresholds, and transformation validation. Hands-on experience with API testing using Postman, pytest, or custom automation scripts.

Good-to-Have Skills: Experience with data governance tools such as Apache Atlas, Collibra, or Alation. Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch.

Education and Professional Certifications: Bachelor's/Master's degree in computer science and engineering preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
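To illustrate the kind of automated data-quality checks listed above (nulls, duplicates, thresholds), here is a minimal PySpark sketch; the dataset path and column names are hypothetical.

```python
# A minimal PySpark data-quality sketch covering null, duplicate, and
# threshold checks. The dataset path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")  # hypothetical path

# Null check on a required key column.
null_keys = df.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows with NULL order_id"

# Duplicate check on the primary key.
dupes = df.groupBy("order_id").count().filter(F.col("count") > 1).count()
assert dupes == 0, f"{dupes} duplicated order_id values"

# Threshold check on a numeric column.
negative = df.filter(F.col("amount") < 0).count()
assert negative == 0, f"{negative} rows with negative amount"
```

In practice, checks like these would run inside a test framework such as PyTest and report into a quality dashboard rather than assert inline.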

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Role Description: We are seeking a highly skilled, hands-on Senior QA & Test Automation Specialist (Test Automation Engineer) with strong experience in data validation, ETL testing, test automation, and QA process ownership. This role combines deep technical execution with a solid foundation in QA best practices, including test planning, defect tracking, and test lifecycle management. You will be responsible for designing and executing manual and automated test strategies for complex real-time and batch data pipelines, contributing to the design of automation frameworks, and ensuring high-quality data delivery across our AWS and Databricks-based analytics platforms. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.

Roles & Responsibilities: Collaborate with the QA Manager to design and implement end-to-end test strategies for data validation, semantic layer testing, and GraphQL API validation. Perform manual validation of data pipelines, including source-to-target data mapping, transformation logic, and business rule verification. Develop and maintain automated data validation scripts using Python and PySpark for both real-time and batch pipelines. Contribute to the design and enhancement of reusable automation frameworks, with components for schema validation, data reconciliation, and anomaly detection. Validate semantic layers (e.g., Looker, dbt models) and GraphQL APIs, ensuring data consistency, compliance with contracts, and alignment with business expectations. Write and manage test plans, test cases, and test data for structured, semi-structured, and unstructured data. Track, manage, and report defects using tools like JIRA, ensuring thorough root cause analysis and timely resolution. Collaborate with Data Engineers, Product Managers, and DevOps teams to integrate tests into CI/CD pipelines and enable shift-left testing practices. Ensure comprehensive test coverage for all aspects of the data lifecycle, including ingestion, transformation, delivery, and consumption. Participate in QA ceremonies (standups, planning, retrospectives) and continuously contribute to improving the QA process and culture. Experience building or maintaining test data generators. Contributions to internal quality dashboards or data observability systems. Awareness of metadata-driven testing approaches and lineage-based validations. Experience working with agile testing methodologies such as Scaled Agile. Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest.

Must-Have Skills: 6-9 years of experience in QA roles, with at least 3+ years of strong exposure to data pipeline testing and ETL validation. Strong in SQL, Python, and optionally PySpark – comfortable with writing complex queries and validation scripts. Practical experience with manual validation of data pipelines and source-to-target testing. Experience in validating GraphQL APIs, semantic layers (Looker, dbt, etc.), and schema/data contract compliance. Familiarity with data integration tools and platforms such as Databricks, AWS Glue, Redshift, Athena, or BigQuery. Strong understanding of test planning, defect tracking, bug lifecycle management, and QA documentation. Experience working in Agile/Scrum environments with standard QA processes. Knowledge of test case and defect management tools (e.g., JIRA, TestRail, Zephyr).
Strong understanding of QA methodologies, test planning, test case design, and defect lifecycle management. Deep hands-on expertise in SQL, Python, and PySpark for testing and automating validation. Proven experience in manual and automated testing of batch and real-time data pipelines. Familiarity with data processing and analytics stacks: Databricks, Spark, AWS (Glue, S3, Athena, Redshift). Experience with bug tracking and test management tools like JIRA, TestRail, or Zephyr. Ability to troubleshoot data issues independently and collaborate with engineering for root cause analysis. Experience integrating automated tests into CI/CD pipelines (e.g., Jenkins, GitHub Actions). Experience validating data from various file formats such as JSON, CSV, Parquet, and Avro. Strong ability to validate and automate data quality checks: schema validation, null checks, duplicates, thresholds, and transformation validation. Hands-on experience with API testing using Postman, pytest, or custom automation scripts.

Good-to-Have Skills: Experience with data governance tools such as Apache Atlas, Collibra, or Alation. Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch.

Education and Professional Certifications: Bachelor's/Master's degree in computer science and engineering preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Posted 1 week ago

Apply

8.0 - 12.0 years

32 - 37 Lacs

Hyderabad

Work from Office

Naukri logo

Job Overview: As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construct database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).
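As a purely illustrative example of the extensible physical models this role produces, here is a minimal star-schema sketch; the fact and dimension names are hypothetical, not PepsiCo's actual model.

```python
# A minimal star-schema sketch: two conformed dimensions and one fact table.
# Conformed dimensions can be reused by future facts, which is the kind of
# "extensible philosophy" described above. All names are hypothetical.
ddl_statements = [
    """
    CREATE TABLE dim_product (
        product_key  BIGINT PRIMARY KEY,
        product_code VARCHAR(32),
        category     VARCHAR(64)
    );
    """,
    """
    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,  -- e.g. 20250601
        calendar_date DATE,
        fiscal_period VARCHAR(16)
    );
    """,
    """
    CREATE TABLE fact_shipments (
        product_key BIGINT REFERENCES dim_product (product_key),
        date_key    INT    REFERENCES dim_date (date_key),
        units       INT,
        net_revenue DECIMAL(14, 2)
    );
    """,
]

for stmt in ddl_statements:
    print(stmt.strip())  # in a real pipeline these would be executed via a DB connection
```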

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Overview: As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build & operations and driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems.

Responsibilities: Be a founding member of the data engineering team. Help to attract talent to the team by networking with your peers, representing PepsiCo HBS at conferences and other events, and discussing our values and best practices when interviewing candidates. Own data pipeline development end-to-end, spanning data modeling, testing, scalability, operability and ongoing metrics. Ensure that we build high-quality software by reviewing peer code check-ins. Define best practices for product development, engineering, and coding as part of a world-class engineering team. Collaborate in architecture discussions and architectural decision-making that is part of continually improving and expanding these platforms. Lead feature development in collaboration with other engineers; validate requirements/stories, assess current system capabilities, and decompose feature requirements into engineering tasks. Focus on delivering high-quality data pipelines and tools through careful analysis of system capabilities and feature requests, peer reviews, test automation, and collaboration with other engineers. Develop software in short iterations to quickly add business value. Introduce new tools/practices to improve data and code quality; this includes researching/sourcing third-party tools and libraries, as well as developing tools in-house to improve workflow and quality for all data engineers. Support data pipelines developed by your team through good exception handling and monitoring, and, when needed, by debugging production issues.

Qualifications: 6-9 years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience in SQL optimization and performance tuning. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with data profiling and data quality tools like Apache Griffin, Deequ, or Great Expectations. Current skills in the following technologies: Python; orchestration platforms: Airflow, Luigi, Databricks, or similar; relational databases: Postgres, MySQL, or equivalents; MPP data systems: Snowflake, Redshift, Synapse, or similar; cloud platforms: AWS, Azure, or similar; version control (e.g., GitHub) and familiarity with deployment and CI/CD tools.
Fluent with Agile processes and tools such as Jira or Pivotal Tracker. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus.
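Since Airflow is called out among the orchestration platforms, a minimal DAG sketch follows; the DAG id, schedule, and task bodies are hypothetical placeholders, not a PepsiCo pipeline.

```python
# A minimal Airflow 2.x DAG sketch: two Python tasks run daily in sequence.
# The DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source system")  # placeholder for real extract logic

def load():
    print("write curated data to the lake")  # placeholder for real load logic

with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```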

Posted 1 week ago

Apply

6.0 - 11.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Interview Drive – Saturday Hiring Event

We are hiring passionate and skilled professionals for the following positions. If you're ready to take the next step in your career, join us for an exclusive interview drive this Saturday in Pune!

📌 1. Python Developer
Experience: 6 to 11 Years
Location: Pune
Key Skills Required: Strong experience in Python-based application development. Proficiency in Pandas or Django frameworks. Hands-on experience with REST APIs. Experience working with AWS services – S3, Glue, Lambda, Redshift. Good problem-solving and debugging skills. Ability to work in an agile environment.

📌 2. Python Lead
Experience: 9 to 15 Years
Location: Pune
Key Skills Required: Expertise in Python application development using Pandas/Django. Strong understanding of and experience with REST APIs. Proficiency in AWS services – S3, Glue, Lambda, Redshift. Design experience at the application or system level. Proven ability to lead and mentor a development team. Excellent communication and leadership skills.

📅 Date: This Saturday (7th June 2025)
📍 Location: Pune (details will be shared upon shortlisting)
📩 Interested candidates can share their updated resumes at rupali@nexionpro.com
⏳ Shortlisted profiles will receive confirmation with interview slot details.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, and will work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. The role provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. It works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices, is responsible for repeatable, lean and maintainable enterprise BI design across organizations, and partners effectively with the client team. We value leadership not only in the conventional sense: within a team, we expect people to be leaders. Candidates should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities: Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation – e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for data integration. Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes. Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else that is data-related at the project or business unit levels. Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best-practice standards. Toolsets include, but are not limited to: SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications: 10 years of industry implementation experience with data integration tools such as AWS services – Redshift, Athena, Lambda, Glue, S3, ETL, etc. 5-8 years of management experience required. 5-8 years of consulting experience preferred. Minimum of 5 years of data architecture, data modelling or similar experience. Bachelor's degree or equivalent experience; Master's degree preferred. Strong background in data warehousing, OLTP systems, data integration and SDLC. Strong experience in orchestration, with working experience in cloud-native / 3rd-party ETL data load orchestration tools such as Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar. Understanding and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.). Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data. Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP). Strong experience in Agile processes (Scrum cadences, roles, deliverables), with working experience in Azure DevOps, JIRA or similar and experience in CI/CD using one or more code management platforms. Strong Databricks experience required to create notebooks in PySpark. Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.). Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.). 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience: Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc. Experience in providing estimates for data integration projects, including testing, documentation, and implementation. Ability to analyse business requirements as they relate to the data movement and transformation processes, including research, evaluation and recommendation of alternative solutions. Ability to provide technical direction to other team members, including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two together. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results. Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM. Can create documentation and presentations such that they "stand on their own". Can advise sales on evaluation of data integration efforts for new or existing client work.
Can contribute to internal/external data integration proofs of concept. Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered. Ability to work independently on projects as well as collaborate effectively across teams. Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success. Strong team-building, interpersonal, analytical, problem identification and resolution skills. Experience working with multi-level business communities. Can effectively utilise SQL and/or available BI tools to validate/elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues. Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships/rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Job Description

Job Title: Data Engineer - PySpark, Python, SQL, Git, AWS Services (Glue, Lambda, Step Functions, S3, Athena)

Job Description: We are seeking a talented Data Engineer with expertise in PySpark, Python, SQL, Git, and AWS to join our dynamic team. The ideal candidate will have a strong background in data engineering, data processing, and cloud technologies. You will play a crucial role in designing, developing, and maintaining our data infrastructure to support our analytics.

Responsibilities: Develop and maintain ETL pipelines using PySpark and AWS Glue to process and transform large volumes of data efficiently. Collaborate with analysts to understand data requirements and ensure data availability and quality. Write and optimize SQL queries for data extraction, transformation, and loading. Utilize Git for version control, ensuring proper documentation and tracking of code changes. Design, implement, and manage scalable data lakes on AWS, using S3 or other relevant services for efficient data storage and retrieval. Develop and optimize high-performance, scalable databases using Amazon DynamoDB. Use Amazon QuickSight to create interactive dashboards and data visualizations. Automate workflows using AWS cloud services like EventBridge and Step Functions. Monitor and optimize data processing workflows for performance and scalability. Troubleshoot data-related issues and provide timely resolution. Stay up to date with industry best practices and emerging technologies in data engineering.

Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field; a Master's degree is a plus. Strong proficiency in PySpark and Python for data processing and analysis. Proficiency in SQL for data manipulation and querying. Experience with version control systems, preferably Git. Familiarity with AWS services, including S3, Redshift, Glue, Step Functions, EventBridge, CloudWatch, Lambda, QuickSight, DynamoDB, Athena, CodeCommit, etc. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively within a team. Ability to manage multiple tasks and prioritize effectively in a fast-paced environment.

Preferred Skills: Knowledge of data warehousing concepts and data modeling. Familiarity with big data technologies like Hadoop and Spark. AWS certifications related to data engineering.

Join our team and contribute to our mission of turning data into actionable insights. If you're a motivated data engineer with expertise in PySpark, Python, SQL, Git, and AWS, we want to hear from you. Apply now to be part of our innovative and dynamic data engineering team.
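As a rough illustration of the PySpark/Glue ETL work described above, consider this minimal Glue job sketch; the catalog database, table, column, and S3 output path are hypothetical.

```python
# A minimal AWS Glue job sketch: read a catalog table, derive a partition
# column, and write partitioned Parquet to S3. Catalog names, the timestamp
# column, and the output path are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw events from the Glue Data Catalog and convert to a DataFrame.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"  # hypothetical catalog entries
)
df = dyf.toDF().withColumn("event_date", F.to_date("event_ts"))

# Write partitioned Parquet back to S3 for downstream Athena/Redshift use.
df.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/processed/events/"
)

job.commit()
```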

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

About the role:- This role will be part of a team that develops software that processes data captured every day from over a quarter of a million computer and mobile devices worldwide, measuring panelists' activities as they surf the Internet via browsers or use mobile apps downloaded from Apple's and Google's stores. The Nielsen software meter used to capture this usage data has been optimized to be unobtrusive, yet gathers many biometric data points that the backend system can use to identify who is using the device and to detect fraudulent behavior. The Software Engineer is ultimately responsible for delivering technical solutions, from project onboarding through post-launch support, including development and testing. You will be expected to coordinate, support and work with multiple delocalized project teams in multiple regions. As a Software Engineer with our Digital Meter Processing team, you will further develop the backend system that processes massive amounts of data every day, across 3 different AWS regions. Your role will involve implementing and maintaining robust, scalable solutions that leverage a Java-based system running in an AWS environment. You will play a key role in shaping the technical direction of our projects and mentoring other team members.

Responsibilities:- System Deployment: Build new features in the existing backend processing pipelines. CI/CD Implementation: Leverage CI/CD pipelines for automated build, test, and deployment processes; ensure continuous integration and delivery of features, improvements, and bug fixes. Code Quality and Best Practices: Adhere to coding standards, best practices, and design principles; participate in code reviews and provide constructive feedback to maintain high code quality. Performance Optimization: Identify and address performance bottlenecks in reading, processing and writing data to the backend data stores. Team Collaboration: Follow best practices and collaborate with cross-functional teams to ensure a cohesive and unified approach to software development. Security and Compliance: Implement security best practices for all tiers of the system; ensure compliance with industry standards and regulations related to AWS platform security.

Key Skills:- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. Proven experience (minimum 2 years) in Java development and scripting languages such as Python in an AWS cloud environment. Good experience with SQL and a database system such as Postgres. Good understanding of CI/CD principles and tools; GitLab a plus. Good problem-solving and debugging skills. Good communication and collaboration skills, with the ability to communicate complex technical concepts and align the organization on decisions.
Utilizes team collaboration to contribute to innovative solutions efficiently.

Other desirable skills:- Knowledge of networking principles and security best practices. AWS certifications. Experience with data warehouses, ETL, and/or data lakes is very helpful. Experience with Redshift, Airflow, Python, Lambda, Prometheus, Grafana, and OpsGenie is a bonus.

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Responsibilities: Lead simultaneous development for multiple business verticals. Design & develop highly scalable, reliable, secure, and fault-tolerant systems. Ensure that exceptional standards are maintained in all aspects of engineering. Collaborate with other engineering teams to learn and share best practices. Take ownership of technical performance metrics and strive actively to improve them. Mentor junior members of the team and contribute to code reviews.

Requirements: A passion for solving tough engineering/data challenges. Well versed with cloud computing platforms AWS/GCP. Experience with SQL technologies (MySQL, PostgreSQL). Experience working with NoSQL technologies (MongoDB, Elasticsearch). Excellent programming skills in Python/Java/Golang. Big data streaming services (Kinesis, Kafka, RabbitMQ). Distributed cache systems (Redis, Memcached). Advanced data solutions (BigQuery, Redshift, DynamoDB, Cassandra). Automated testing frameworks and CI/CD pipelines. Infrastructure orchestration (Docker/Kubernetes/Nginx). Cloud-native tech like Lambda, ASG, CDN, ELB, SNS/SQS, S3, Route53, SES.

Skills:- MySQL, PostgreSQL, NoSQL Databases, MongoDB, Elasticsearch, Amazon Web Services (AWS), Google Cloud Platform (GCP), Python, Java, Go Programming (Golang), Apache Kafka, RabbitMQ, Redis, DynamoDB, Cassandra, CI/CD and AWS Lambda

Posted 2 weeks ago

Apply

6.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

The Impact of a Sr. Software Engineer, Data at Coupa: The Sr. Software Engineer, Data is a key role at Coupa, responsible for designing, building, and maintaining the data infrastructure that powers our business. The individual will work closely with cross-functional teams, including Data Scientists, Product Managers, and Software Engineers, to develop data pipelines, transform raw data into usable formats, and ensure data quality and consistency across our platform. The Sr. Software Engineer, Data will be responsible for designing and implementing robust data architectures that can handle large and complex datasets, and for creating and maintaining data warehouses, data lakes, and other data storage solutions. Suitable candidates will have a strong background in data engineering, with experience in data modeling and ETL development. They will also have experience in programming languages such as Python and Java, as well as in cloud-based data storage and processing technologies such as AWS, Azure, or GCP. The impact of a skilled Sr. Software Engineer, Data will be significant, ensuring that our platform is powered by reliable and accurate data, and enabling us to deliver innovative solutions to our customers and partners. Their work will contribute to the overall success and growth of the company, enabling Coupa to continue to lead the market in cloud-based spend management solutions.

What You'll Do: Create and maintain optimal data pipeline architecture. Optimize Spark clusters for efficiency and performance by implementing robust monitoring systems to identify bottlenecks using data and metrics, and provide actionable recommendations for continuous improvement. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems.

What You Will Bring to Coupa: Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Experience with processing large workloads and complex code on Spark clusters. Proven experience in setting up monitoring for Spark clusters and driving optimization based on insights and findings. Experience in designing and implementing scalable data warehouse solutions to support analytical and reporting needs. Experience with API development and design with REST or GraphQL.
Experience building and optimizing big data data pipelines, architectures, and data sets Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement Strong analytic skills related to working with unstructured datasets Build processes supporting data transformation, data structures, metadata, dependency, and workload management Working knowledge of message queuing, stream processing, and highly scalable big data data stores Strong project management and organizational skills Experience supporting and working with cross-functional teams in a dynamic environment We are looking for a candidate with 6-10 years of experience in a Senior Software Engineer - Data role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field Experience using the following software/tools: Experience with object-oriented/object function scripting languages: Python, Java, C++, .net, etc. Expertise in Python is a must Experience with big data tools: Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, RDS, Redshift Working knowledge of stream-processing systems: Storm, Spark-Streaming, etc.
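To make the workflow-management tooling named above concrete, here is a minimal Airflow 2.x-style DAG sketch for a nightly S3-to-Redshift load. The DAG id, schedule, paths, and task logic are illustrative assumptions, not a description of Coupa's actual pipelines.

```python
# A minimal Airflow DAG sketch for a nightly S3-to-Redshift load.
# All names and paths below are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3() -> None:
    # Placeholder: pull data from a source system and stage it in S3.
    print("Staged raw data to s3://example-bucket/staging/")


def load_into_redshift() -> None:
    # Placeholder: issue a Redshift COPY against the staged files.
    print("COPY analytics.events FROM 's3://example-bucket/staging/' ...")


with DAG(
    dag_id="nightly_events_load",  # assumed DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_into_redshift", python_callable=load_into_redshift)

    extract >> load  # run the load only after extraction succeeds
```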

Posted 2 weeks ago

Apply

Exploring Redshift Jobs in India

The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, a data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with Redshift expertise can find opportunities across a wide range of industries throughout the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for Redshift professionals in India varies with experience and location. Entry-level positions typically pay in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.

Career Path

In the Redshift field, a typical career path may progress through roles such as:

  1. Junior Developer
  2. Data Engineer
  3. Senior Data Engineer
  4. Tech Lead
  5. Data Architect

Related Skills

Apart from expertise in Redshift, proficiency in the following skills can be beneficial:

  • SQL
  • ETL Tools
  • Data Modeling
  • Cloud Computing (AWS)
  • Python/R Programming
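Since several of these skills are usually exercised together, a small, hypothetical extract-transform-load sketch in Python (pandas for the transform, boto3 for AWS) is shown below; the file path and bucket name are assumptions for illustration only.

```python
# A hypothetical ETL sketch: clean a CSV and stage it in S3 for a
# later Redshift COPY. File path and bucket name are assumptions.
import io

import boto3         # pip install boto3
import pandas as pd  # pip install pandas


def etl_orders(csv_path: str, bucket: str) -> None:
    # Extract: read a raw CSV export from the source system.
    df = pd.read_csv(csv_path)

    # Transform: drop incomplete rows and normalize column names.
    df = df.dropna(subset=["order_id", "amount"])
    df.columns = [c.strip().lower() for c in df.columns]

    # Load: stage the cleaned file in S3, ready for a Redshift COPY.
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key="clean/orders.csv",
        Body=buf.getvalue(),
    )


if __name__ == "__main__":
    etl_orders("raw_orders.csv", "example-analytics-bucket")  # assumed names
```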

Interview Questions

  • What is Amazon Redshift and how does it differ from traditional databases? (basic)
  • How does data distribution work in Amazon Redshift? (medium)
  • Explain the difference between SORTKEY and DISTKEY in Redshift. (medium) (see the sketch after this list)
  • How do you optimize query performance in Amazon Redshift? (advanced)
  • What is the COPY command in Redshift used for? (basic)
  • How do you handle large data sets in Redshift? (medium)
  • Explain the concept of Redshift Spectrum. (advanced)
  • What is the difference between Redshift and Redshift Spectrum? (medium)
  • How do you monitor and manage Redshift clusters? (advanced)
  • Can you describe the architecture of Amazon Redshift? (medium)
  • What are the best practices for data loading in Redshift? (medium)
  • How do you handle concurrency in Redshift? (advanced)
  • Explain the concept of vacuuming in Redshift. (basic)
  • What are Redshift's limitations and how do you work around them? (advanced)
  • How do you scale Redshift clusters for performance? (medium)
  • What are the different node types available in Amazon Redshift? (basic)
  • How do you secure data in Amazon Redshift? (medium)
  • Explain the concept of Redshift Workload Management (WLM). (advanced)
  • What are the benefits of using Redshift over traditional data warehouses? (basic)
  • How do you optimize storage in Amazon Redshift? (medium)
  • How do you troubleshoot performance issues in Amazon Redshift? (advanced)
  • Can you explain the concept of columnar storage in Redshift? (basic)
  • How do you automate tasks in Redshift? (medium)
  • What are the different types of Redshift nodes and their use cases? (basic)
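Several of the questions above, notably SORTKEY vs. DISTKEY, the COPY command, and vacuuming, can be rehearsed hands-on with a short script. The sketch below uses psycopg2 against a hypothetical cluster; the connection details, table definition, S3 path, and IAM role are all illustrative assumptions.

```python
# A hypothetical rehearsal script: DISTKEY/SORTKEY in DDL, COPY from S3,
# then VACUUM/ANALYZE. Connection details, table, S3 path, and IAM role
# are illustrative assumptions only.
import psycopg2  # pip install psycopg2-binary

DDL = """
CREATE TABLE IF NOT EXISTS sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    sale_date   DATE,
    amount      DECIMAL(12, 2)
)
DISTKEY (customer_id)  -- co-locate each customer's rows on one slice for joins
SORTKEY (sale_date);   -- lets range-restricted scans skip unneeded blocks
"""

COPY_CMD = """
COPY sales
FROM 's3://example-bucket/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-role'
FORMAT AS CSV
IGNOREHEADER 1;
"""


def main() -> None:
    # Hypothetical connection details; substitute your own cluster endpoint.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="admin",
        password="...",
    )
    conn.autocommit = True  # VACUUM cannot run inside a transaction block
    with conn.cursor() as cur:
        cur.execute(DDL)
        cur.execute(COPY_CMD)
        cur.execute("VACUUM sales;")   # reclaim space and re-sort rows
        cur.execute("ANALYZE sales;")  # refresh query planner statistics
    conn.close()


if __name__ == "__main__":
    main()
```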

Conclusion

As demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
