
3632 Redshift Jobs - Page 5

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You should have 6-8 years of hands-on experience with Big Data technologies such as PySpark (DataFrame API and Spark SQL), Hadoop, and Hive, along with strong hands-on experience in Python and Bash scripting and a solid understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis, and research skills are crucial for this role, as is a demonstrable ability to think creatively and independently rather than relying solely on readily available tools. Excellent communication, presentation, and interpersonal skills are a must for effective collaboration within the team. Hands-on experience with cloud-provided Big Data services such as IAM, Glue, EMR, Redshift, S3, and Kinesis is required. Experience orchestrating with Airflow or another job scheduler is highly beneficial, and familiarity with migrating workloads from on-premise to cloud and between clouds is also desired.

In this role, you will develop efficient ETL pipelines based on business requirements while adhering to development standards and best practices. Your responsibilities will include integration testing of the different pipelines in the AWS environment, providing estimates for development, testing, and deployment across environments, and participating in code peer reviews to ensure compliance with best practices. Creating cost-effective AWS pipelines using services such as S3, IAM, Glue, EMR, and Redshift is a key aspect of this position. The expected experience range is 6 to 8 years in relevant fields. The job reference number for this position is 13024.
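For illustration only, here is a minimal PySpark sketch of the kind of ETL step such a pipeline might contain. The bucket paths, column names, and business rule are hypothetical, not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical S3 locations; real paths, schemas, and business rules would
# come from the actual pipeline requirements.
SOURCE_PATH = "s3://example-raw-bucket/orders/"
TARGET_PATH = "s3://example-curated-bucket/orders_daily/"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw data, filter it, and aggregate per day.
orders = spark.read.parquet(SOURCE_PATH)
daily = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write the curated output partitioned by date, ready to be loaded into
# Redshift (for example via COPY) or queried in place.
daily.write.mode("overwrite").partitionBy("order_date").parquet(TARGET_PATH)
```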

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Senior Data Engineering Architect at Iris Software, you will play a crucial role in leading enterprise-level data engineering projects on public cloud platforms like AWS, Azure, or GCP. Your responsibilities will include engaging with client managers to understand their business needs, conceptualizing solution options, and finalizing strategies with stakeholders. You will also be involved in team building, delivering Proof of Concepts (PoCs), and enhancing competencies within the organization. Your role will focus on building competencies in Data & Analytics, including Data Engineering, Analytics, Data Science, AI/ML, and Data Governance. Staying updated with the latest tools, best practices, and trends in the Data and Analytics field will be essential to drive innovation and excellence in your work. To excel in this position, you should hold a Bachelor's or Master's degree in a Software discipline and have extensive experience in Data architecture and implementing large-scale Data Lake/Data Warehousing solutions. Your background in Data Engineering should demonstrate leadership in solutioning, architecture, and successful project delivery. Strong communication skills in English, both written and verbal, are essential for effective collaboration with clients and team members. Proficiency in tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, Snowflake, and databases, along with programming skills in Spark, Spark SQL, PySpark, and Python, are mandatory competencies for this role. Joining Iris Software offers a range of perks and benefits designed to support your financial, health, and overall well-being. From comprehensive health insurance and competitive salaries to flexible work arrangements and continuous learning opportunities, we are dedicated to providing a supportive and rewarding work environment where your success and happiness are valued. If you are inspired to grow your career in Data Engineering and thrive in a culture that values talent and personal growth, Iris Software is the place for you. Be part of a dynamic team where you can be valued, inspired, and encouraged to be your best professional and personal self.,

Posted 2 days ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What We Expect From You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Bachelor's degree and 2 to 4 years of Computer Science, IT or related field experience, OR a Diploma and 4 to 7 years of Computer Science, IT or related field experience.

Preferred Qualifications:
Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, Spark SQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning on big data processing. Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools. Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores. Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development.

Good-to-Have Skills: Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores). Understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms. Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.

Professional Certifications: AWS Certified Data Engineer preferred. Databricks certification preferred.

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting.

As an Associate Data Engineer at Amgen, you will be involved in the development and maintenance of data infrastructure and solutions. You will collaborate with a team of data engineers to design and implement data pipelines, perform data analysis, and ensure data quality. Your strong technical skills, problem-solving abilities, and attention to detail will contribute to the effective management and utilization of data for insights and decision-making.
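As a rough illustration of the SQL-on-Spark skills listed above, here is a short Spark SQL sketch. The table and column names are hypothetical placeholders, not Amgen datasets.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sql-example").getOrCreate()

# Hypothetical input; in practice this would be a Delta/parquet table
# registered in a catalog (for example on Databricks).
events = spark.read.parquet("s3://example-bucket/clinical_events/")
events.createOrReplaceTempView("clinical_events")

# Extract and aggregate with plain SQL, the kind of analysis the role describes.
summary = spark.sql("""
    SELECT site_id,
           COUNT(*)                   AS event_count,
           COUNT(DISTINCT patient_id) AS patient_count
    FROM clinical_events
    WHERE event_date >= '2024-01-01'
    GROUP BY site_id
    ORDER BY event_count DESC
""")
summary.show(truncate=False)
```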

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 5-8 years. Responsible for building agentic workflows using modern LLM orchestration frameworks to automate and optimize complex business processes in the Travel domain. This is an individual contributor (IC) role, owning end-to-end development of intelligent agents and services that power customer experiences, recommendations and backend automation. Design and implement agentic and autonomous workflows using frameworks such as LangGraph, LangChain and CrewAI. Translate business problems in the Travel domain into intelligent LLM-powered workflows. Own at least two AI use-case implementations from design to production deployment. Build and expose RESTful and GraphQL APIs to support internal and external consumers. Develop and maintain robust Python-based microservices using FastAPI or Django. Collaborate with product managers, data engineers and backend teams to design seamless AI-driven user experiences. Deploy and maintain workflows and APIs on AWS with best practices in scalability and security.

Nice to have: Experience with Big Data technologies (Hadoop, Teradata, Snowflake, Spark, Redshift, Kafka, etc.) for data processing. Experience with data management processes on AWS is a huge plus. AWS certification. Hands-on experience building applications with LangGraph, LangChain and CrewAI. Experience working with AWS services - Lambda, API Gateway, S3, ECS, DynamoDB. Proven track record of implementing at least two AI/LLM-based use cases in production. Strong problem-solving skills with the ability to deconstruct complex problems into actionable AI workflows. Experience building scalable, production-grade APIs using FastAPI or Django. Strong command over Python and software engineering best practices. Solid understanding of multithreading, IO operations, and scalability patterns in backend systems.
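For context, a minimal FastAPI sketch of the kind of REST microservice such a role might build. The endpoint, models, and `run_trip_planner` stub are hypothetical; the actual agentic workflow (LangGraph/LangChain/CrewAI) would replace the placeholder function.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="travel-recommendation-service")  # hypothetical service name

class TripRequest(BaseModel):
    destination: str
    budget_usd: float
    days: int

class TripPlan(BaseModel):
    destination: str
    summary: str

def run_trip_planner(req: TripRequest) -> str:
    # Placeholder for the agentic workflow (e.g. a LangGraph/LangChain graph);
    # the real implementation would invoke the orchestration framework here.
    return f"{req.days}-day itinerary for {req.destination} within ${req.budget_usd:.0f}"

@app.post("/v1/trip-plan", response_model=TripPlan)
def create_trip_plan(req: TripRequest) -> TripPlan:
    # Expose the LLM-powered workflow behind a simple REST endpoint.
    return TripPlan(destination=req.destination, summary=run_trip_planner(req))
```

Run locally with `uvicorn module_name:app --reload`, assuming the file is importable as `module_name`.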

Posted 3 days ago

Apply

8.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a highly skilled and experienced Senior Power BI Developer to join our dynamic data and analytics team. The ideal candidate will have strong technical skills in Power BI and will be responsible for designing, developing, and implementing robust and insightful business intelligence solutions using Power BI. This role requires a deep understanding of BI concepts, strong SQL skills, and extensive experience with AWS services such as Redshift and RDS PostgreSQL. You will play a crucial role in translating complex business requirements into clear, interactive, and high-performance dashboards and reports, driving data-driven decision-making across the organization.

Responsibilities

**Power BI Development & Design:** Lead the design, development, and implementation of complex, interactive, and user-friendly dashboards and reports using Power BI. Translate diverse business requirements into technical specifications and impactful data visualizations. Develop and optimize Power BI datasets, analyses, and dashboards for performance, scalability, and maintainability. Implement advanced Power BI features such as RBAC, parameters, calculated fields, custom visuals, and dynamic filtering. Ensure data accuracy, consistency, and integrity within all Power BI reports and dashboards.

**Performance Optimization & Governance:** Identify and address performance bottlenecks in Power BI dashboards and underlying data sources. Implement best practices for Power BI security, user access, and data governance. Monitor Power BI usage and performance, recommending improvements as needed. Ensure compliance with data security policies and governance guidelines when handling sensitive data within Power BI.

**Continuous Improvement:** Stay up-to-date with the latest features, releases, and best practices in Power BI. Proactively identify opportunities for process improvement and automation in BI development workflows. Work in an Agile/Scrum environment, actively participating in sprint planning, reviews, and retrospectives.

Required Skills And Qualifications
Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field. 8-10 years of overall IT experience, with 4-5 years of hands-on experience designing and developing complex dashboards and reports using Power BI. Strong proficiency in SQL (writing complex queries, stored procedures, functions, DDL). In-depth understanding of BI concepts. Extensive experience with AWS services relevant to data analytics, including Redshift (data warehouse) and RDS (relational databases). Proven ability to translate business requirements into technical solutions and effective data visualizations. Excellent analytical, problem-solving, and critical thinking skills. Strong communication and interpersonal skills, with the ability to effectively collaborate with technical and non-technical stakeholders. Experience working in an Agile development methodology. Ability to work independently, manage multiple priorities, and meet tight deadlines.

Preferred Skills (Nice To Have)
Experience with other BI tools (e.g., Tableau), demonstrating a broad understanding of the BI landscape. Proficiency in Python or other scripting languages for data manipulation and automation.
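To illustrate the SQL-on-Redshift side of the role, here is a small sketch of a windowed query of the kind that often backs a Power BI dataset, executed through psycopg2 (Redshift and RDS PostgreSQL both speak the PostgreSQL wire protocol). The cluster endpoint, schema, and column names are hypothetical.

```python
import psycopg2

# Hypothetical connection details; in practice these would come from a
# secrets manager rather than being hard-coded.
conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="bi_reader",
    password="***",
)

# Monthly sales per region with a running total, computed over a grouped subquery.
SQL = """
    SELECT region,
           month,
           monthly_sales,
           SUM(monthly_sales) OVER (
               PARTITION BY region
               ORDER BY month
               ROWS UNBOUNDED PRECEDING
           ) AS running_total
    FROM (
        SELECT region,
               DATE_TRUNC('month', order_date) AS month,
               SUM(amount)                     AS monthly_sales
        FROM sales.orders
        GROUP BY region, DATE_TRUNC('month', order_date)
    ) monthly
    ORDER BY region, month;
"""

with conn, conn.cursor() as cur:
    cur.execute(SQL)
    for region, month, monthly_sales, running_total in cur.fetchall():
        print(region, month, monthly_sales, running_total)
```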

Posted 3 days ago

Apply

3.0 - 4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. 3-4 years of hands-on experience in data engineering, with a strong focus on AWS cloud services. Proficiency in Python for data manipulation, scripting, and automation. Strong command of SQL for data querying, transformation, and database management. Demonstrable experience with AWS data services, including: Amazon S3 (data lake storage and management), AWS Glue (ETL service for data preparation), Amazon Redshift (cloud data warehousing), AWS Lambda (serverless computing for data processing), Amazon EMR (managed Hadoop framework for big data processing; Spark/PySpark experience highly preferred), and AWS Kinesis (or Kafka) for real-time data streaming. Strong analytical, problem-solving, and debugging skills. Excellent communication and collaboration abilities, with the capacity to work effectively in an agile team environment.

Responsibilities
Troubleshoot and resolve data-related issues and performance bottlenecks in existing pipelines. Develop and maintain data quality checks, monitoring, and alerting mechanisms to ensure data pipeline reliability. Participate in code reviews, contribute to architectural discussions, and promote best practices in data engineering.
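As a small sketch of the Lambda-plus-Kinesis pattern mentioned above, the handler below decodes Kinesis records and lands them in S3. The bucket name and key layout are hypothetical; a real function would add error handling and schema validation.

```python
import base64
import json

import boto3

s3 = boto3.client("s3")
TARGET_BUCKET = "example-curated-bucket"  # hypothetical bucket name

def lambda_handler(event, context):
    """Decode Kinesis records and land them in S3 as newline-delimited JSON."""
    rows = []
    for record in event.get("Records", []):
        # Kinesis delivers the payload base64-encoded under record["kinesis"]["data"].
        payload = base64.b64decode(record["kinesis"]["data"])
        rows.append(json.loads(payload))

    if rows:
        key = f"events/{context.aws_request_id}.json"
        body = "\n".join(json.dumps(r) for r in rows)
        s3.put_object(Bucket=TARGET_BUCKET, Key=key, Body=body.encode("utf-8"))

    return {"processed": len(rows)}
```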

Posted 3 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Job Overview And Responsibilities
United Airlines is seeking talented people to join the Data Engineering Operations team. Key responsibilities include configuring and managing infrastructure, implementing continuous integration/continuous deployment (CI/CD) pipelines, and optimizing system performance. You will work to improve efficiency, enhance scalability, and ensure the reliability of systems through monitoring and proactive measures. We are seeking creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights. Individuals who have a natural curiosity and desire to solve problems are encouraged to apply. Collaboration, scripting, and proficiency in tools for version control and automation are critical skills for success in this role. Translate product strategy and requirements into suitable, maintainable and scalable solution designs according to existing architecture guardrails. Collaborate with development and operations teams to understand project requirements and design effective DevOps solutions. Implement and maintain CI/CD pipelines for automated software builds, testing, and deployment. Manage and optimize cloud-based infrastructure to ensure scalability, security, and performance. Implement and maintain monitoring and alerting systems for proactive issue resolution. Work closely with cross-functional teams to troubleshoot and resolve infrastructure-related issues. Automate repetitive tasks and processes to improve efficiency and reduce manual intervention.

Key Responsibilities
Design, deploy, and maintain cloud infrastructure on AWS. Set up and manage Kubernetes clusters for container orchestration. Design, implement, and manage scalable, secure, and highly available AWS infrastructure using Terraform. Develop and manage Infrastructure as Code (IaC) modules and reusable components. Collaborate with developers, architects, and other DevOps engineers to design cloud-native applications and deployment strategies. Manage and optimize CI/CD pipelines using tools like GitHub Actions, GitLab CI, Jenkins, or similar. Manage and optimize the Databricks platform. Monitor infrastructure health and performance using AWS CloudWatch, Prometheus, Grafana, etc. Ensure cloud security best practices, including IAM policies, VPC configurations, data encryption, and secrets management. Create and manage networking infrastructure such as VPCs, subnets, security groups, route tables, NAT gateways, etc. Handle deployment and configuration of services such as EC2, RDS, Glue, S3, ECS/EKS, Lambda, API Gateway, Kinesis, MWAA, DynamoDB, CloudFront, Route 53, SQS, SNS, Athena, ELB/ALB. Maintain logging, alerting, and monitoring systems to ensure reliability and performance.
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
What’s needed to succeed (Minimum Qualifications): Bachelor's degree in Computer Science, Engineering, or related field. 5+ years of IT experience as a DevOps Engineer or in a similar role. Experience with AWS infrastructure design, implementation, and support. Proficiency in scripting languages (e.g., Bash, Python) and configuration management tools. Experience with database systems like PostgreSQL, Redshift, MySQL. Must be legally authorized to work in India for any employer without sponsorship. Must be fluent in English (written and spoken). Successful completion of interview required to meet job qualification. Reliable, punctual attendance is an essential function of the position.

What will help you propel from the pack (Preferred Qualifications): Master’s in computer science or related STEM field. Strong experience with continuous integration & delivery using Agile methodologies. DevOps experience in the transportation/airline industry. Knowledge of security best practices in a DevOps environment. Experience with logging and monitoring tools (e.g., Dynatrace, Datadog). Strong problem-solving and communication skills. Experience with Harness tools. Experience with microservices architecture and serverless applications. Knowledge of database technologies (PostgreSQL, Redshift, MySQL). AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer). Databricks Platform certifications.
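For the monitoring and alerting side of such a role, a minimal boto3 sketch that creates a CloudWatch alarm on EC2 CPU usage and routes it to an SNS topic. The instance ID, topic ARN, and thresholds are hypothetical; in practice this would typically be defined in Terraform rather than ad hoc scripts.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

# Hypothetical instance ID and SNS topic; real values would come from
# Terraform outputs or a configuration store.
INSTANCE_ID = "i-0123456789abcdef0"
ALERT_TOPIC_ARN = "arn:aws:sns:ap-south-1:123456789012:ops-alerts"

cloudwatch.put_metric_alarm(
    AlarmName=f"high-cpu-{INSTANCE_ID}",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
    Statistic="Average",
    Period=300,                # 5-minute datapoints
    EvaluationPeriods=3,       # alarm after 15 minutes above threshold
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[ALERT_TOPIC_ARN],
    AlarmDescription="CPU above 80% for 15 minutes",
)
```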

Posted 3 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.

Job Overview And Responsibilities
This role will be responsible for collaborating with the Business and IT teams to identify the value, scope, features and delivery roadmap for data engineering products and solutions. You will be responsible for communicating with stakeholders across the board, including customers, business managers, and the development team, to make sure the goals are clear and the vision is aligned with business objectives. Perform data analysis using SQL, along with data quality analysis, data profiling and summary reports. Carry out trend analysis and dashboard creation based on visualization techniques. Execute the assigned projects/analyses as per the agreed timelines and with accuracy and quality. Complete analysis as required, document results and formally present findings to management. Perform ETL workflow analysis, create current/future state data flow diagrams and help the team assess the business impact of any changes or enhancements. Understand the existing Python code workbooks and write pseudo-code. Collaborate with key stakeholders to identify the business case/value and create documentation. Should have excellent communication and analytical skills.

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law.

Qualifications - External
Required: BE, BTech or equivalent in computer science or a related STEM field. 5+ years of total IT experience as either a Data Analyst/Business Data Analyst or as a Data Engineer. 2+ years of experience with Big Data technologies like PySpark, Hadoop, Redshift, etc. 3+ years of experience writing SQL queries on RDBMS or cloud-based databases. Experience with visualization tools such as Spotfire, Power BI, QuickSight, etc. Experience in data analysis and requirements gathering. Strong problem-solving skills. Creative, driven, detail-oriented focus, with a willingness to tackle tough problems with data and insights. Natural curiosity and desire to solve problems. Must be legally authorized to work in India for any employer without sponsorship. Must be fluent in English and Hindi (written and spoken). Successful completion of interview required to meet job qualification. Reliable, punctual attendance is an essential function of the position.

Qualifications Preferred
AWS Certification preferred. Strong experience with continuous integration & delivery using Agile methodologies. Data engineering experience in the transportation/airline industry.

Posted 3 days ago

Apply

0.0 - 2.0 years

3 - 10 Lacs

Niranjanpur, Indore, Madhya Pradesh

Remote

Job Title - Sr. Data Engineer
Experience - 2+ Years
Location - Indore (on-site)
Industry - IT
Job Type - Full-time

Roles and Responsibilities-
1. Design and develop scalable data pipelines and workflows for data ingestion, transformation, and integration.
2. Build and maintain data storage systems, including data warehouses, data lakes, and relational databases.
3. Ensure data accuracy, integrity, and consistency through validation and quality assurance processes.
4. Collaborate with data scientists, analysts, and business teams to understand data needs and deliver tailored solutions.
5. Optimize database performance and manage large-scale datasets for efficient processing.
6. Leverage cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Kafka) for building robust data solutions.
7. Automate and monitor data workflows using orchestration frameworks such as Apache Airflow (a minimal example follows this listing).
8. Implement and enforce data governance policies to ensure compliance and data security.
9. Troubleshoot and resolve data-related issues to maintain seamless operations.
10. Stay updated on emerging tools, technologies, and trends in data engineering.

Skills and Knowledge-
1. Core Skills: ● Proficient in Python (libraries: Pandas, NumPy) and SQL. ● Knowledge of data modeling techniques, including: ○ Entity-Relationship (ER) Diagrams ○ Dimensional Modeling ○ Data Normalization ● Familiarity with ETL processes and tools like: ○ Azure Data Factory (ADF) ○ SSIS (SQL Server Integration Services)
2. Cloud Expertise: ● AWS Services: Glue, Redshift, Lambda, EKS, RDS, Athena ● Azure Services: Databricks, Key Vault, ADLS Gen2, ADF, Azure SQL ● Snowflake
3. Big Data and Workflow Automation: ● Hands-on experience with big data technologies like Hadoop, Spark, and Kafka. ● Experience with workflow automation tools like Apache Airflow (or similar).

Qualifications and Requirements-
● Education: ○ Bachelor’s degree (or equivalent) in Computer Science, Information Technology, Engineering, or a related field.
● Experience: ○ Freshers with a strong understanding, internships and relevant academic projects are welcome. ○ 2+ years of experience working with Python, SQL, and data integration or visualization tools is preferred.
● Other Skills: ○ Strong communication skills, especially the ability to explain technical concepts to non-technical stakeholders. ○ Ability to work in a dynamic, research-oriented team with concurrent projects.

Job Types: Full-time, Permanent
Pay: ₹300,000.00 - ₹1,000,000.00 per year
Benefits: Paid sick time, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday, Weekend availability
Supplemental Pay: Performance bonus
Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Data Engineer: 2 years (Preferred)
Work Location: In person
Application Deadline: 31/08/2025
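As referenced in the responsibilities above, here is a minimal Apache Airflow (2.x style) DAG sketch for orchestrating an ETL workflow. The DAG name and task bodies are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from the source system (API, database, S3, ...).
    print("extracting")

def transform():
    # Placeholder: clean, validate, and reshape the extracted data.
    print("transforming")

def load():
    # Placeholder: load curated data into the warehouse (Redshift, Snowflake, ...).
    print("loading")

with DAG(
    dag_id="example_daily_etl",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps in sequence.
    t_extract >> t_transform >> t_load
```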

Posted 3 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

LotusFlare is a provider of cloud-native SaaS products based in the heart of Silicon Valley. Founded by the team that helped Facebook reach over one billion users, LotusFlare was founded to make affordable mobile communications available to everyone on Earth. Today, LotusFlare focuses on designing, building, and continuously evolving a digital commerce and monetization platform that delivers valuable outcomes for enterprises. Our platform, Digital Network Operator® (DNO™) Cloud, is licensed to telecommunications services providers and supports millions of customers globally. LotusFlare has also designed and built the leading eSIM travel product - Nomad. Nomad provides global travelers with high-speed, affordable data connectivity in over 190 countries. Nomad is available as an iOS or Android app or via getnomad.app.

Description: DevOps Support Engineers at LotusFlare guarantee the quality of the software solutions produced by LotusFlare through monitoring, responding to incidents, and testing and quality checking. With shared accountability and code ownership, DevOps Support Engineers take on-call responsibilities and incident management work. Through these activities, LotusFlare developers write code that better fits into their applications and infrastructure, helping them proactively deepen the reliability of services being deployed. DevOps Support Engineers will test the functionality of the code to bring out every flaw and to improve on the underperformance of every standalone feature. This role is always on the lookout for opportunities to improve any and every feature to bring customer satisfaction. Partnered alongside the best engineers in the industry on the coolest stuff around, the code and systems you work on will be in production and used by millions of users all around the world. Our team comprises engineers with varying levels of experience and backgrounds, from new grads to industry veterans. Relevant industry experience is important (Site Reliability Engineer (SRE), Systems Engineer, Software Engineer, DevOps Engineer, Network Engineer, Systems Administrator, Linux Administrator, Database Administrator, or similar role), but ultimately less so than your demonstrated abilities and attitude.
Responsibilities: Monitoring backend services (cloud-based infrastructure). Supporting, troubleshooting, and investigating issues and incidents (supporting developers and the infra team with system metrics analysis, logs, traffic, configuration, deployment changes, etc.). Supporting and improving monitoring/alerting systems (searching, testing, and deploying new functionality for existing tools). Creating new features for automating the troubleshooting and investigation process. Creating new tools to improve the support process. Drafting reports and summarizing information after investigations and incidents.

Requirements: At least 3 years of work experience with similar responsibilities. Strong knowledge and practical experience working with the Linux (Ubuntu) command line and administration. Understanding of network protocols and troubleshooting (TCP/IP, UDP). Strong scripting skills (Bash, Python). Critical thinking and problem solving. Understanding of containerization (Docker, containers). Experience troubleshooting API-driven services. Experience with Kubernetes. Experience with Git. Background in release management processes. English - professional written and verbal skills.

Good to have: Prometheus, Grafana, Kibana (query language). Experience with Nginx/OpenResty. Experience with telco protocols (CAMEL, MAP, Diameter) is an advantage. Software development/scripting skills. Basic knowledge of Cassandra, PostgreSQL. Experience using AWS cloud services (EC2, Redshift, S3, RDS, ELB/ALB, ElastiCache, Direct Connect, Route 53, Elastic IPs, etc.). CI/CD: Jenkins. Terraform.

Benefits: night shift allowance of 1K per day, company-sponsored lunch and dinner, certification and learning, working with a cross-functional team.

Recruitment Process: HR interview followed by 4-5 levels of technical interviews.
Work Model: Work from Office
Location: Baner, Pune

About: At LotusFlare, we attract and keep amazing people by offering two key things: Purposeful Work: Every team member sees how their efforts make a tangible, positive difference for our customers and partners. Growth Opportunities: We provide the chance to develop professionally while mastering cutting-edge practices in cloud-native enterprise software. From the beginning, our mission has been to simplify technology to create better experiences for customers. Using an “experience down” approach, which prioritizes the customer's journey at every stage of development, our Digital Network Operator™ Cloud empowers communication service providers to achieve valuable business outcomes. DNO Cloud enables communication service providers to innovate freely, reduce operational costs, monetize network assets, engage customers on all digital channels, drive customer acquisition, and increase retention. With headquarters in Santa Clara, California, and five major offices worldwide, LotusFlare serves Deutsche Telekom, T-Mobile, A1, Globe Telecom, Liberty Latin America, Singtel, and other leading enterprises around the world. Website: www.lotusflare.com LinkedIn: https://www.linkedin.com/company/lotusflare Instagram: https://www.instagram.com/lifeatlotusflare/ Twitter: https://twitter.com/lotus_flare
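To illustrate the scripting-for-monitoring side of the role, a minimal Python health-check sketch. The endpoint URLs are hypothetical; a real version would feed results into the team's alerting stack rather than printing them.

```python
import sys

import requests

# Hypothetical endpoints; the real list would come from service discovery
# or a monitoring configuration file.
ENDPOINTS = [
    "https://api.example.com/healthz",
    "https://payments.example.com/healthz",
]

def check(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with a 2xx status within the timeout."""
    try:
        resp = requests.get(url, timeout=timeout)
        return resp.ok
    except requests.RequestException as exc:
        print(f"{url} unreachable: {exc}")
        return False

if __name__ == "__main__":
    failures = [url for url in ENDPOINTS if not check(url)]
    for url in failures:
        print(f"ALERT: {url} failed health check")
    # A non-zero exit code lets a scheduler or alerting wrapper flag the failure.
    sys.exit(1 if failures else 0)
```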

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description As a Data Engineer on the Data and AI team, you will design and implement robust data pipelines and infrastructure that power our organization's data-driven decisions and AI capabilities. This role is critical in developing and maintaining our enterprise-scale data processing systems that handle high-volume transactions while ensuring data security, privacy compliance, and optimal performance. You'll be part of a dynamic team that designs and implements comprehensive data solutions, from real-time processing architectures to secure storage solutions and privacy-compliant data access layers. The role involves close collaboration with cross-functional teams, including software development engineers, product managers, and scientists, to create data products that power critical business capabilities. You'll have the opportunity to work with leading technologies in cloud computing, big data processing, and machine learning infrastructure, while contributing to the development of robust data governance frameworks. If you're passionate about solving complex technical challenges in high-scale environments, thrive in a collaborative team setting, and want to make a lasting impact on our organization's data infrastructure, this role offers an exciting opportunity to shape the future of our data and AI capabilities. Key job responsibilities Design and implement ETL/ELT frameworks that handle large-scale data operations, while building reusable components for data ingestion, transformation, and orchestration while ensuring data quality and reliability. Establish and maintain robust data governance standards by implementing comprehensive security controls, access management frameworks, and privacy-compliant architectures that safeguard sensitive information. Drive the implementation of data solutions, both real-time and batch, optimizing them for both analytical workloads and AI/ML applications. Lead technical design reviews and provide mentorship on data engineering best practices, identifying opportunities for architectural improvements and guiding the implementation of enhanced solutions. Build data quality frameworks with robust monitoring systems and validation processes to ensure data accuracy and reliability throughout the data lifecycle. Drive continuous improvement initiatives by evaluating and implementing new technologies and methodologies that enhance data infrastructure capabilities and operational efficiency. A day in the life The day often begins with a team stand-up to align priorities, followed by a review of data pipeline monitoring alarms to address any processing issues and ensure data quality standards are maintained across systems. Throughout the day, you'll find yourself immersed in various technical tasks, including developing and optimizing ETL/ELT processes, implementing data governance controls, and reviewing code for data processing systems. You'll work closely with software engineers, scientists, and product managers, participating in technical design discussions and sharing your expertise in data architecture and engineering best practices. Your responsibilities extend to communicating with non-technical stakeholders, explaining data-related projects and their business impact. You'll also mentor junior engineers and contribute to maintaining comprehensive technical documentation. You'll troubleshoot issues that arise in the data infrastructure, optimize the performance of data pipelines, and ensure data security and compliance with relevant regulations. 
Staying updated on the latest data engineering technologies and best practices is crucial, as you'll be expected to incorporate new learnings into your work. By the end of a typical day, you'll have advanced key data infrastructure initiatives, solved complex technical challenges, and improved the reliability, efficiency, and security of data systems. Whether it's implementing new data governance controls, optimizing data processing workflows, or enhancing data platforms to support new AI models, your work directly impacts the organization's ability to leverage data for critical business decisions and AI capabilities. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you’re passionate about this role and want to make an impact on a global scale, please apply! About The Team The Data and Artificial Intelligence (AI) team is a new function within Customer Engagement Technology. We own the end-to-end process of defining, building, implementing, and monitoring a comprehensive data strategy. We also develop and apply Generative Artificial Intelligence (GenAI), Machine Learning (ML), Ontology, and Natural Language Processing (NLP) to customer and associate experiences. Basic Qualifications 3+ years of data engineering experience Bachelor’s degree in Computer Science, Engineering, or a related technical discipline Preferred Qualifications Experience with AWS data services (Redshift, S3, Glue, EMR, Kinesis, Lambda, RDS) and understanding of IAM security frameworks Proficiency in designing and implementing logical data models that drive physical designs Hands-on experience working with large language models, including understanding of data infrastructure requirements for AI model training Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2996966
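For illustration of the data-quality validation work described above, a small PySpark sketch that runs a few checks and fails loudly so an orchestrator can halt downstream tasks. The dataset path, columns, and rules are hypothetical, not Amazon's actual framework.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical dataset and rules; real checks would be driven by the
# data-quality framework's configuration.
df = spark.read.parquet("s3://example-bucket/customer_events/")

total_rows = df.count()
null_ids = df.filter(F.col("customer_id").isNull()).count()
duplicate_ids = total_rows - df.dropDuplicates(["event_id"]).count()

failures = []
if total_rows == 0:
    failures.append("dataset is empty")
if null_ids > 0:
    failures.append(f"{null_ids} rows with null customer_id")
if duplicate_ids > 0:
    failures.append(f"{duplicate_ids} duplicate event_id rows")

if failures:
    # Raising stops the pipeline and surfaces the failed checks to monitoring.
    raise ValueError("data quality checks failed: " + "; ".join(failures))
```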

Posted 3 days ago

Apply

6.0 years

6 - 9 Lacs

Hyderābād

On-site

CACI International Inc is an American multinational professional services and information technology company headquartered in Northern Virginia. CACI provides expertise and technology to enterprise and mission customers in support of national security missions and government transformation for defense, intelligence, and civilian customers. CACI has approximately 23,000 employees worldwide. Headquartered in London, CACI Ltd is a wholly owned subsidiary of CACI International Inc., a publicly listed company on the NYSE with annual revenue in excess of US $6.2bn. Founded in 2022, CACI India is an exciting, growing and progressive business unit of CACI Ltd. CACI Ltd currently has over 2000 intelligent professionals and is now adding many more from our Hyderabad and Pune offices. Through a rigorous emphasis on quality, CACI India has grown considerably to become one of the UK's most well-respected technology centres.

About the Data Platform: The Data Platform will be built and managed “as a Product” to support a Data Mesh organization. The Data Platform focusses on enabling decentralized management, processing, analysis and delivery of data, while enforcing corporate-wide federated governance on data and project environments across business domains. The goal is to empower multiple teams to create and manage high-integrity data and data products that are analytics- and AI-ready, and consumed internally and externally.

What does a Data Infrastructure Engineer do? A Data Infrastructure Engineer will be responsible for developing, maintaining and monitoring the data platform infrastructure and operations. The infrastructure and pipelines you build will support data processing, data analytics, data science and data management across the CACI business. The data platform infrastructure will conform to a zero-trust, least-privilege architecture, with strict adherence to data and infrastructure governance and control in a multi-account, multi-region AWS environment. You will use Infrastructure as Code and CI/CD to continuously improve, evolve and repair the platform. You will be able to design architectures and create re-usable solutions that reflect the business needs.

Responsibilities will include: Collaborating across CACI departments to develop and maintain the data platform. Building infrastructure and data architectures in CloudFormation and SAM. Designing and implementing data processing environments and integrations using AWS PaaS such as Glue, EMR, SageMaker, Redshift, Aurora and Snowflake. Building data processing and analytics pipelines as code, using Python, SQL, PySpark/Spark, CloudFormation, Lambda, Step Functions and Apache Airflow (a small operational sketch follows this listing). Monitoring and reporting on the data platform performance, usage and security. Designing and applying security and access control architectures to secure sensitive data.

You will have: 6+ years of experience in a Data Engineering role. Strong experience and knowledge of data architectures implemented in AWS using native AWS services such as S3, DataZone, Glue, EMR, SageMaker, Aurora and Redshift.
Experience administering databases and data platforms. Good coding discipline in terms of style, structure, versioning, documentation and unit tests. Strong proficiency in CloudFormation, Python and SQL. Knowledge and experience of relational databases such as Postgres and Redshift. Experience using Git for code versioning and lifecycle management. Experience operating to Agile principles and ceremonies. Hands-on experience with CI/CD tools such as GitLab. Strong problem-solving skills and the ability to work independently or in a team environment. Excellent communication and collaboration skills. A keen eye for detail, and a passion for accuracy and correctness in numbers.

Whilst not essential, the following skills would also be useful: Experience using Jira or other agile project management and issue-tracking software. Experience with Snowflake. Experience with spatial data processing.
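As the small operational sketch referenced above: one way a pipeline-as-code component might trigger and watch an AWS Glue job with boto3. The job name and region are hypothetical; the Glue job itself would be defined separately in CloudFormation/SAM.

```python
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-2")

# Hypothetical job name; the job definition would live in CloudFormation/SAM.
JOB_NAME = "curate-orders-job"

run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue job {JOB_NAME} finished with state {state}")
```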

Posted 3 days ago

Apply

10.0 years

0 Lacs

Telangana

On-site

About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Reporting to the VP COG ECM Enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting the implementation of a new document solution for identified applications within the CCM landscape in APAC. OpenText xPression and Duck Creek have been the corporate document generation tools of choice within Chubb, but xPression is going end of life and will be unsupported from 2025. A new Customer Communications Management (CCM) platform – Quadient Inspire – has been selected by a global working group to replace xPression, and this role covers implementation of the new tool (including migration of existing forms/templates from xPression where applicable). Apart from migrating from xPression, there are multiple existing applications to be replaced with Quadient Inspire.

The role is based in Hyderabad, India, with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of Business Analysts and an onshore/offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies.

In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb Overseas General). Drive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for a data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management.
Be an engineering leader, coach data engineers, and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform. Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements. Collaborate with Data Modelers to create data models (conceptual, logical, and physical). Architect metadata management processes to ensure data lineage, data definitions, and ownership are well-documented and understood. Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals. Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders. Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability.

Qualifications
Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred. Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus on P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects. Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices. Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), change data capture and data streaming (Apache Kafka, Apache Flink) technologies. Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi). Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database). Expertise in ensuring data security patterns (e.g., tokenization, encryption, obfuscation). Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines. Familiarity with Agile methodologies and experience working in Agile project environments. Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture.

Why Chubb?
Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence. A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026. Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being.
We constantly seek new and innovative ways to excel at work and deliver outstanding results Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits. Application Process Our recruitment process is designed to be transparent, and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership. Join Us With you Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers

Posted 3 days ago

Apply

10.0 years

0 Lacs

Telangana

On-site

About Chubb Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. Reporting to the VP COG ECM enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting Implementation of a new Document solution for identified applications with the CCM landscape, in APAC. OpenText xPression and Duckcreek has been the corporate document generation tool of choice within Chubb. But xPression going end of life and be unsupported from 2025. A new Customer Communications Management (CCM) platform – Quadient Inspire - has been selected to replace xPression by a global working group and implementation of this new tool (including migration of existing forms/templates from xPression where applicable). Apart from migrating from xPression, there are multiple existing applications to be replaced with Quadient Inspire The role is based in Hyderabad/India with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of Business Analysts and an Onshore/Offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies. In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb overseas general) Derive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for the data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management. 
Be an engineering leader and coach data engineers and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform. Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements Collaborate with Data Modelers to create data models (conceptual, logical, and physical) Architect meta-data management processes to ensure data lineage, data definitions, and ownership are well-documented and understood Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability Qualifications Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus in P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices Hands on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), change data capture and data streaming (Apache Kafka, Apache Flink) technologies Proven Expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache Nifi) Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database) Expertise in ensuring data security patterns (e.g. tokenization, encryption, obfuscation) Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines Familiarity with Agile methodologies and experience working in Agile project environments Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence A Great Place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026 Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness where excellence is a mindset and a way of being. 
We constantly seek new and innovative ways to excel at work and deliver outstanding results Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits. Application Process Our recruitment process is designed to be transparent, and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership. Join Us With you Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers
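To illustrate the kind of real-time pipeline work this role describes, here is a minimal PySpark Structured Streaming sketch that reads hypothetical policy events from Kafka and lands them in a data lake. The topic name, schema, broker, and paths are illustrative assumptions, not part of the job description or Chubb's actual implementation.

```python
# Minimal sketch (not Chubb's implementation): stream hypothetical policy events
# from Kafka into a data-lake path with Spark Structured Streaming.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("policy-events-stream").getOrCreate()

# Hypothetical event schema, for illustration only.
event_schema = StructType([
    StructField("policy_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "policy-events")              # placeholder topic
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse the JSON body.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Write micro-batches as Parquet with checkpointing so the stream can recover after failure.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-lake/policy_events/")             # placeholder path
    .option("checkpointLocation", "s3a://example-lake/_chk/policy_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```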

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

- 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling At Amazon, our goal is to be earth’s most customer-centric company and to create a safe environment for both our customers and our associates. To achieve that, we need exceptionally talented, bright, dynamic, and driven people. If you'd like to help us build the place to find and buy anything online, this is your chance to make history. We are looking for a talented Business Intelligence Engineer to join the Trustworthy Shopping Experience Operations Analytics Team. Key job responsibilities - Design and implement scalable data architecture and analytics pipelines using AWS services, creating robust ETL processes and optimized data models to support enterprise reporting needs - Build automated, self-service reporting capabilities and dashboards using BI tools like QuickSight, enabling stakeholders to independently access insights and address business questions - Partner with cross-functional teams (BAs, Data Engineers, Product Managers) to gather requirements, translate business needs into technical solutions, and deliver high-quality data products - Develop and optimize complex SQL queries and stored procedures while implementing data quality frameworks to ensure accuracy and reliability of analytics solutions - Conduct advanced statistical analysis and create analytical models to uncover trends, develop insights, and support data-driven decision making - Identify and implement process improvements through automation, code optimization, and integration of generative AI capabilities to enhance BI processes and reporting efficiency - Lead technical discussions, provide mentorship to junior team members, and present solutions to stakeholders while staying current with emerging technologies and best practices A day in the life As a Business Intelligence Engineer in TSE Operations Analytics, you'll develop analytical solutions to provide insights into support operations and measure global technical support initiatives. You'll transform operational data into actionable insights for TSE leaders and engineers worldwide, managing critical datasets for support metrics, productivity, and customer satisfaction. Key responsibilities include building real-time dashboards, automated reporting systems, and data quality frameworks. You'll work with stakeholders to implement analytics solutions and drive data-driven decision-making. The role involves optimizing query performance and translating findings into recommendations to improve support efficiency and customer experience. About the team We are a dynamic analytics team within TSE (Trustworthy Shopping Experience) Operations, combining Business Intelligence Engineers, Data Engineers, and Business Analysts. Our mission is twofold: protecting customers from unsafe or non-compliant products while enabling sellers to grow their businesses confidently on Amazon. We build scalable BI solutions and data-driven insights that streamline compliance processes, improve operational efficiency, and enhance the seller experience. 
Our team focuses on delivering analytical solutions that drive operational excellence, uncover emerging risks, reduce investigation errors, and optimize the customer-seller trust ecosystem. Through advanced analytics and BI tools, we help shape the future of Amazon's trustworthy shopping experience.
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
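As a rough illustration of the reporting queries this kind of BIE role involves, the sketch below pulls a weekly resolution-time metric from Redshift using psycopg2. The table, columns, connection details, and thresholds are hypothetical, not Amazon's actual schema.

```python
# Illustrative only: query a hypothetical support-operations table in Redshift
# and compute the weekly average resolution time per queue.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="analytics",
    user="bi_user",
    password="***",
)

SQL = """
    SELECT date_trunc('week', resolved_at) AS week,
           queue_name,
           AVG(datediff(minute, created_at, resolved_at)) AS avg_resolution_min,
           COUNT(*) AS cases
    FROM support_cases                                   -- hypothetical table
    WHERE resolved_at >= dateadd(month, -3, current_date)
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

with conn, conn.cursor() as cur:
    cur.execute(SQL)
    for week, queue, avg_min, cases in cur.fetchall():
        print(week, queue, round(avg_min, 1), cases)
conn.close()
```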

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderābād

On-site

Growth & Purpose (G&P) – Technology CoE – Adobe Experience Platform (AEP) – Data Engineer

Do you thrive on developing creative and innovative insights to solve complex challenges? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? If yes, we need your expertise!

About Technology CoE
Technology CoE offers innovative solutions that enable Growth & Purpose to execute with agility, drive closed-loop marketing, and achieve operational excellence; it optimizes data, tools, people, and processes. It partners with various CMG channels to deliver data-driven insights in the form of dashboards/reports, which help make informed decisions, provides end-to-end support around database management, reporting, and maintenance activities, and proactively identifies opportunities for automation and reporting.

This role supports the Product Owner for Deloitte’s instance of Adobe Experience Platform Real-Time CDP and Customer Journey Analytics. It focuses primarily on data management, data transformation, process development, and proactive approaches to data quality. Working cross-functionally with other data and business teams, this role will own workflows and gain exposure to other platforms and tools.

Data Integration and Management
Work with the PO/architect to design, develop, and maintain scalable data pipelines to ingest, process, and store large volumes of customer data from various sources.
Implement ETL (Extract, Transform, Load) processes to ensure data accuracy, consistency, and reliability.
Identify opportunities for process improvements and automation to enhance data operations efficiency.

Data Platform Development
Consult on building robust data infrastructure to support the customer data platform, ensuring high availability and performance.
Recommend optimization of source/target data models and schemas to support analytics and reporting needs.
Identify data governance and security measures to protect sensitive customer information.

Data Quality and Monitoring
Help establish and enforce data quality standards and best practices.
Collaborate on development and maintenance of data validation and monitoring frameworks to detect and resolve data issues proactively.
Perform data audits and design corrective actions to maintain data integrity.

Collaboration and Support
Ability to work in an agile environment and manage user story lifecycles.
Participate in code reviews, knowledge sharing sessions, and continuous improvement initiatives.
Work closely with members of the team to support business initiatives and data onboarding.
Help generate and maintain documentation that is shared with other business and technical teams, so that they can understand available data, workflows, and processes.

Required experience
Experience with Customer Data Platforms (CDPs) like Adobe RT-CDP, Amplitude, Treasure Data, etc.
General understanding of marketing practices, technologies, and common platforms, e.g., email, social platforms, paid media, SEO techniques, etc.
Experience managing data at scale and working across teams to enable technical marketing solutions/processes.
Demonstrated knowledge of metadata management concepts and practical application.
Demonstrated competency in a variety of data transport and ETL processes and methodologies.
Basic understanding of SQL databases, including basic SQL query management and optimization skills.
Understanding of public cloud offerings and how to design solutions built around these technologies (AWS, Azure, Google Cloud Platform, and similar).

Preferred experience
Familiarity with the Adobe MarTech stack including AEP, CJA, Target, AEM, and WebSDK.
Adobe Certifications in AEP, RT-CDP, AJO, CJA, or AA.
Experience with data deep dives for quality, consistency, and operational readiness.
Ability to translate requests from non-technical stakeholders into operational steps and successful outcomes.

Educational Requirements:
B.Tech/B.E.
Professional qualification (reputed institutes preferred): MBA (a plus)
2+ years of experience in a technical role.
A degree in a technical field (e.g. Mathematics, Computer Science, Information Systems) or equivalent experience.
Hands-on experience with Python and SQL.
Technical expertise with a data warehouse or data lake.
Hands-on experience with high-dimensional, large datasets.
Experience performance tuning queries and data models to produce the best execution plan.
Experience building data pipelines & ETL.
Experience working on an Agile development team and delivering features incrementally.
Experience with Git repositories.
Working knowledge of setting up builds and deployments.
Experience demonstrating work to peers and stakeholders for acceptance.
Strong communication, interpersonal, analytical and problem-solving skills.
Ability to communicate effectively with non-technical stakeholders to define requirements.
Ability to quickly understand new client data environments and document the business logic that composes them.
Ability to integrate oneself into geographically dispersed teams and clients.
A passion for high-quality software.
Previous experience as a data engineer or in a similar role.
Eagerness to learn and seek new frameworks, technologies, and languages.
Ability to navigate shifting priorities and work with grace in a deadline-driven environment.
Attention to detail and exceptional follow-up skills.

Software: Microsoft Office 365, including Teams.

Preferred
Experience working with Azure DevOps, JIRA or similar project tracking software.
Experience working in a startup environment.
Experience with many other big data technologies at scale.
Experience with BigQuery or similar (Redshift, Snowflake, other MPP databases).
Cloud development and implementation experience (Amazon Web Services - AWS, Google Cloud Platform - GCP, Microsoft Azure).

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our diverse, equitable, and inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Learn more about our inclusive culture.

Our purpose
Deloitte’s purpose is to make an impact that matters for our clients, our people, and in our communities. We are creating trust and confidence in a more equitable society. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. We are focusing our collective efforts to advance sustainability, equity, and trust that come to life through our core commitments. Learn more about Deloitte's purpose, commitments, and impact.

Professional development
From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 307769
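For a sense of the proactive data-quality checks this role describes (validating extracts before they reach a Customer Data Platform), here is a small pandas sketch. The column names, file, and thresholds are illustrative assumptions, not Deloitte's or Adobe's standards.

```python
# Illustrative pre-load validation for a hypothetical customer extract, using pandas.
import pandas as pd

REQUIRED_COLUMNS = ["customer_id", "email", "consent_flag", "updated_at"]  # assumed schema

def validate_extract(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality issues; an empty list means the extract passes."""
    issues = []
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues.append(f"missing columns: {missing}")
        return issues  # remaining checks assume the required columns exist
    if df["customer_id"].duplicated().any():
        issues.append("duplicate customer_id values found")
    null_rate = df["email"].isna().mean()
    if null_rate > 0.01:  # assumed tolerance of 1% missing emails
        issues.append(f"email null rate {null_rate:.2%} exceeds the 1% threshold")
    if not df["consent_flag"].isin([True, False]).all():
        issues.append("consent_flag contains values other than True/False")
    return issues

if __name__ == "__main__":
    extract = pd.read_csv("customer_extract.csv", parse_dates=["updated_at"])  # placeholder file
    problems = validate_extract(extract)
    print("PASS" if not problems else "\n".join(problems))
```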

Posted 3 days ago

Apply

6.0 years

15 - 18 Lacs

Indore

On-site

Location: Indore Experience: 6+ Years Work Type : Hybrid Notice Period : 0-30 Days joiners We are hiring for a Digital Transformation Consulting firm that specializes in the Advisory and implementation of AI, Automation, and Analytics strategies for the Healthcare providers. The company is headquartered in NJ, USA and its India office is in Indore, MP. Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive the data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms. Key Responsibilities: Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions. Define best practices for data architecture, modeling, and governance. Oversee data integration, transformation, and migration strategies. Ensure high availability, performance tuning, and optimization of databases and ETL pipelines. Implement data security, compliance, and backup strategies. Required Skills & Qualifications: 6+ years of experience in database and data engineering roles. Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS). Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica). Experience with cloud data platforms (AWS, Azure, GCP). Proficiency in programming/scripting languages (Python, SQL, Shell scripting). Strong problem-solving, leadership, and communication skills. Preferred Skills (Good to Have): Experience with big data technologies (Hadoop, Spark, Kafka). Knowledge of real-time data processing. Exposure to AI/ML technologies and working with ML algorithms Job Types: Full-time, Permanent Pay: ₹1,500,000.00 - ₹1,800,000.00 per year Schedule: Day shift Application Question(s): We must fill this position urgently. Can you start immediately? Have you held a lead role in the past? Experience: Extract, Transform, Load (ETL): 6 years (Required) Python: 5 years (Required) big data technologies (Hadoop, Spark, Kafka): 6 years (Required) Snowflake: 6 years (Required) Data warehouse: 6 years (Required) Location: Indore, Madhya Pradesh (Required) Work Location: In person
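Since this role centres on ETL pipelines and orchestration with tools such as Apache Airflow, here is a minimal Airflow DAG sketch showing a daily extract, load, and validate sequence. The task logic, DAG name, and schedule are hypothetical placeholders rather than anything from the posting.

```python
# Minimal illustrative Airflow DAG: a daily extract -> load -> validate sequence.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull source data (placeholder)")

def load(**_):
    print("load into the warehouse, e.g. Snowflake/Redshift (placeholder)")

def validate(**_):
    print("run row-count and null checks (placeholder)")

with DAG(
    dag_id="daily_warehouse_refresh",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    t_extract >> t_load >> t_validate
```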

Posted 3 days ago

Apply

4.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Data Engineer (AWS QuickSight, Glue, PySpark)
Location: Noida

Job Summary:
We are seeking a skilled Data Engineer with 4-5 years of experience to design, build, and maintain scalable data pipelines and analytics solutions within the AWS cloud environment. The ideal candidate will leverage AWS Glue, PySpark, and QuickSight to deliver robust data integration, transformation, and visualization capabilities. This role is critical in supporting business intelligence, analytics, and reporting needs across the organization.

Key Responsibilities:
Design, develop, and maintain data pipelines using AWS Glue, PySpark, and related AWS services to extract, transform, and load (ETL) data from diverse sources.
Build and optimize data warehouse/data lake infrastructure on AWS, ensuring efficient data storage, processing, and retrieval.
Develop and manage ETL processes to source data from various systems, including databases, APIs, and file storage, and create unified data models for analytics and reporting.
Implement and maintain business intelligence dashboards using Amazon QuickSight, enabling stakeholders to derive actionable insights.
Collaborate with cross-functional teams (business analysts, data scientists, product managers) to understand requirements and deliver scalable data solutions.
Ensure data quality, integrity, and security throughout the data lifecycle, implementing best practices for governance and compliance.
Support self-service analytics by empowering internal users to access and analyze data through QuickSight and other reporting tools.
Troubleshoot and resolve data pipeline issues, optimizing performance and reliability as needed.

Required Skills & Qualifications:
Proficiency in AWS cloud services: AWS Glue, QuickSight, S3, Lambda, Athena, Redshift, EMR, and related technologies.
Strong experience with PySpark for large-scale data processing and transformation.
Expertise in SQL and data modeling for relational and non-relational databases.
Experience building and optimizing ETL pipelines and data integration workflows.
Familiarity with business intelligence and visualization tools, especially Amazon QuickSight.
Knowledge of data governance, security, and compliance best practices.
Strong programming skills in Python; experience with automation and scripting.
Ability to work collaboratively in agile environments and manage multiple priorities effectively.
Excellent problem-solving and communication skills.

Preferred Qualifications:
AWS certification (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Developer).
Good to have skills - understanding of machine learning, deep learning and Generative AI concepts, Regression, Classification, Predictive modeling, Clustering, Deep Learning.
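To make the Glue/PySpark part of this role concrete, below is a skeletal AWS Glue job that reads a table from the Glue Data Catalog, applies a simple transformation, and writes partitioned Parquet to S3 for downstream Athena/QuickSight use. The database, table, column, and bucket names are hypothetical.

```python
# Skeletal AWS Glue (PySpark) job for illustration; catalog, table, and bucket names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql.functions import col, to_date

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a catalogued source table (hypothetical names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
).toDF()

# Light transformation: derive a typed order date and keep completed orders only.
curated = (
    orders.withColumn("order_date", to_date(col("order_ts")))
    .filter(col("status") == "COMPLETED")
)

# Write partitioned Parquet for downstream Athena/QuickSight consumption.
(curated.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/"))

job.commit()
```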

Posted 3 days ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Title: Data Engineer (4+ Years Experience)
Location: Pan India
Job Type: Full-Time
Experience: 4+ Years
Notice Period: Immediate to 30 days preferred

Job Summary
We are looking for a skilled and motivated Data Engineer with 4+ years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization.

Key Responsibilities
Design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python.
Implement and manage data models and workflows in AWS Redshift.
Work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions.
Perform data validation, cleansing, and transformation to ensure high data quality.
Build and maintain automation scripts and jobs using Lambda and Glue (if applicable).
Ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3).
Participate in data architecture and platform design discussions.
Monitor pipeline performance, troubleshoot issues, and ensure data reliability.
Document data workflows, processes, and infrastructure components.

Required Skills
4+ years of hands-on experience as a Data Engineer.
Strong proficiency in AWS Redshift, including schema design, performance tuning, and SQL development.
Expertise in Python and PySpark for data manipulation and pipeline development.
Experience working with structured and semi-structured data (JSON, Parquet, etc.).
Deep knowledge of data warehouse design principles, including star/snowflake schemas and dimensional modeling.

Good To Have
Working knowledge of AWS Glue and building serverless ETL pipelines.
Experience with AWS Lambda for lightweight processing and orchestration.
Exposure to ETL tools like Informatica, Talend, or Apache Nifi.
Familiarity with workflow orchestrators (e.g., Airflow, Step Functions).
Knowledge of DevOps practices, version control (Git), and CI/CD pipelines.

Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
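As an example of the pipeline pattern this posting describes, semi-structured data landing in S3 and then loaded into Redshift, here is a small PySpark sketch plus the corresponding COPY statement. The bucket names, IAM role, and target table are assumptions for illustration only.

```python
# Illustrative PySpark step: cleanse raw JSON events and stage them as Parquet for a Redshift COPY.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("stage-events").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/events/2024/")   # placeholder source path
clean = (
    raw.dropDuplicates(["event_id"])
    .filter(col("user_id").isNotNull())
    .select("event_id", "user_id", "event_type", "event_ts")
)
clean.write.mode("overwrite").parquet("s3a://example-stage-bucket/events_clean/")

# The staged Parquet can then be loaded into Redshift with a COPY such as:
COPY_SQL = """
COPY analytics.events_clean
FROM 's3://example-stage-bucket/events_clean/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
FORMAT AS PARQUET;
"""
print(COPY_SQL)  # run via your SQL client or the Redshift Data API
```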

Posted 3 days ago

Apply

4.0 years

0 Lacs

Delhi, India

On-site

What do you need to know about us? M+C Saatchi Performance is an award-winning global digital media agency, connecting brands to people. We deliver business growth for our clients through effective, measurable, and evolving digital media strategies. Position Title : Analyst- Reporting & QA Department : Reporting & QA Location : New Delhi - Hybrid About the Role: We are looking for a highly skilled Analyst- Reporting & QA with a deep understanding of digital and mobile media to join our Reporting and QA team. This role will focus on enabling our clients to meet their media goals by ensuring data accuracy and delivering actionable insights into media performance through our reporting tools. The ideal candidate will have strong technical skills, be detail-oriented, and have experience in digital/mobile media attribution and reporting. Core Responsibilities: ETL & Data Automation: Use Matillion to streamline data processes, ensuring efficient and reliable data integration across all reporting systems. Data Quality Assurance: Verify and validate data accuracy within Power BI dashboards, proactively identifying and addressing discrepancies to maintain high data integrity. Dashboard Development: Build, maintain, and optimize Power BI dashboards to deliver real-time insights that help clients understand the performance of their digital and mobile media campaigns. Media Performance Insights: Collaborate closely with media teams to interpret data, uncover trends, and provide actionable insights that support clients in optimizing their media investments. Industry Expertise: Apply in-depth knowledge of digital and mobile media, attribution models, and reporting frameworks to deliver valuable perspectives on media performance. Tools & Platforms Expertise: Utilize tools such as GA4, platform reporting systems, first-party data analytics, and mobile measurement partners (MMPs) to support comprehensive media insights for clients. Qualifications and Experience: Education: Bachelor’s degree in Statistics, Data Science, Computer Science, Marketing, or a related field. Experience: 4-6 years in a similar role, with substantial exposure to data analysis, reporting, and the digital/mobile media landscape. Technical Skills: Proficiency in ETL tools (preferably Matillion), Power BI, and data quality control. Industry Knowledge: Strong understanding of digital and mobile media, with familiarity in attribution, reporting practices, and performance metrics. Analytical Skills: Skilled in interpreting complex data, generating actionable insights, and presenting findings effectively to non-technical stakeholders. Communication: Excellent communicator with a proven ability to collaborate effectively across cross-functional teams and with clients. Tools & Platforms: Proficiency in GA4, platform reporting, first-party data analysis, and mobile measurement partners (MMPs). Desired Skills: Background in a media agency environment. Experience with cloud-based data platforms (e.g., AWS, Redshift) preferred. Experience with Power BI is must. Strong collaboration skills and the ability to work independently. What Can You Look Forward To Being a part of the world’s largest independent advertising holding group. Family Health Insurance Coverage. Flexible Working Hours. Regular events including Reece Lunch & indoor games. Employee Training/Learning Programs About M+C Saatchi Performance M+C Saatchi Performance has pledged its commitment to create a company that values difference, with an inclusive culture. 
As part of this, M+C Saatchi Performance continues to be an Equal Opportunity Employer which does not and shall not discriminate, celebrates diversity and bases all hiring and promotion decisions solely on merit, without regard for any personal characteristics. All employee information is kept confidential according to General Data Protection Regulation (GDPR). M+C Saatchi Group was founded in 1995 and is now the biggest Independent creative agency group in the World. Founded on one core principle, Brutal Simplicity.

Posted 3 days ago

Apply

0.0 - 15.0 years

83 - 104 Lacs

Delhi, Delhi

On-site

Job Title: Data Architect (Leadership Role)
Company: Wingify
Location: Delhi (Outstation Candidates Allowed)
Experience Required: 10 – 15 years
Working Days: 5 days/week
Budget: 83 Lakh to 1.04 Cr

About Us
We are a fast-growing product-based tech company known for its flagship product VWO, a widely adopted A/B testing platform used by over 4,000 businesses globally, including Target, Disney, Sears, and Tinkoff Bank. The team is self-organizing, highly creative, and passionate about data, tech, and continuous innovation.
Company Size: Mid-Sized
Industry: Consumer Internet, Technology, Consulting

Role & Responsibilities
Lead and mentor a team of Data Engineers, ensuring performance and career development.
Architect scalable and reliable data infrastructure with high availability.
Define and implement data governance frameworks, compliance, and best practices.
Collaborate cross-functionally to execute the organization’s data roadmap.
Optimize data processing workflows for scalability and cost efficiency.
Ensure data quality, privacy, and security across platforms.
Drive innovation and technical excellence across the data engineering function.

Ideal Candidate
Must-Haves
Experience: 10+ years in software/data engineering roles. At least 2–3+ years in a leadership role managing teams of 5+ Data Engineers. Proven hands-on experience setting up data engineering systems from scratch (0 → 1 stage) in high-growth B2B product companies.
Technical Expertise:
Strong in Java (preferred), or Python, Node.js, GoLang.
Expertise in big data tools: Apache Spark, Kafka, Hadoop, Hive, Airflow, Presto, HDFS.
Strong design experience in High-Level Design (HLD) and Low-Level Design (LLD).
Backend frameworks like Spring Boot, Google Guice.
Cloud data platforms: AWS, GCP, Azure.
Familiarity with data warehousing: Snowflake, Redshift, BigQuery.
Databases: Redis, Cassandra, MongoDB, TiDB.
DevOps tools: Jenkins, Docker, Kubernetes, Ansible, Chef, Grafana, ELK.
Other Skills:
Strong understanding of data governance, security, and compliance (GDPR, SOC2, etc.).
Proven strategic thinking with the ability to align technical architecture to business objectives.
Excellent communication, leadership, and stakeholder management.

Preferred Qualifications
Exposure to Machine Learning infrastructure / MLOps.
Experience with real-time data analytics.
Strong foundation in algorithms, data structures, and scalable systems.
Previous work in SaaS or high-growth startups.

Screening Questions
Do you have team leadership experience? How many engineers have you led?
Have you built a data engineering platform from scratch? Describe the setup.
What’s the largest data scale you’ve worked with and where?
Are you open to continuing hands-on coding in this role?

Interested candidates can apply via deepak.visko@gmail.com or 9238142824.

Job Types: Full-time, Permanent
Pay: ₹8,300,000.00 - ₹10,400,000.00 per year
Work Location: In person

Posted 3 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Data Engineer - Senior Location: Noida Employment Type: Permanent Experience Required: Minimum 5 years Primary Skills: Cloud - AWS (AWS Lambda, AWS EventBridge, AWS Fargate) --- Job Description We are seeking a highly skilled Senior Data Engineer to design, implement, and maintain scalable data pipelines that support machine learning model training and inference. Responsibilities: Build and maintain large-scale data pipelines ensuring scalability, reliability, and efficiency. Collaborate with data scientists to streamline the deployment and management of machine learning models. Design and optimize ETL (Extract, Transform, Load) processes and integrate data from multiple sources into structured storage systems. Automate ML workflows using MLOps tools and frameworks (e.g., Kubeflow, MLflow, TensorFlow Extended - TFX). Monitor model performance, data lineage, and system health in production environments. Work cross-functionally to improve data architecture and enable seamless ML model integration. Manage and optimize cloud platforms and data storage solutions (AWS, GCP, Azure). Ensure data security, integrity, and compliance with governance policies. Troubleshoot and optimize pipelines to improve reliability and performance. --- Required Skills Languages: Python, SQL, PySpark Cloud: AWS Services (Lambda, EventBridge, Fargate), Cloud Platforms (AWS, GCP, Azure) DevOps: Docker, Kubernetes, Containerization ETL Tools: AWS Glue, SQL Server (SSIS, SQL Packages) Nice to Have: Redshift, SAS dataset knowledge --- Mandatory Competencies DevOps/Configuration Management: Docker DevOps/Configuration Management: Cloud Platforms - AWS DevOps/Configuration Management: Containerization (Docker, Kubernetes) ETL: AWS Glue Database: SQL Server - SQL Packages
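To ground the AWS Lambda/EventBridge portion of this role, here is a minimal Lambda handler sketch that an EventBridge schedule could invoke to kick off a Glue job as one step of a pipeline. The Glue job name and argument names are hypothetical placeholders, not part of the posting.

```python
# Minimal illustrative Lambda handler: an EventBridge rule invokes it on a schedule,
# and it starts a (hypothetical) Glue job that runs the day's pipeline step.
import json

import boto3

glue = boto3.client("glue")

def handler(event, context):
    # EventBridge delivers the triggering event as a dict; log it for traceability.
    print("received event:", json.dumps(event))

    response = glue.start_job_run(
        JobName="daily-feature-build",                      # placeholder Glue job name
        Arguments={"--run_date": event.get("time", "")},    # pass the event timestamp through
    )
    run_id = response["JobRunId"]
    print("started Glue job run:", run_id)
    return {"statusCode": 200, "body": run_id}
```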

Posted 3 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Design and build data pipelines and data lakes to automate ingestion of structured and unstructured data that provide fast, optimized, and robust end-to-end solutions.
Knowledge of data lake and data warehouse concepts.
Experience working with AWS big data technologies.
Improve the data quality and reliability of data pipelines through monitoring, validation and failure detection.
Deploy and configure components to production environments.
Technology: Redshift, S3, AWS Glue, Lambda, SQL, PySpark

Mandatory Skill Sets: AWS Data Engineer
Preferred Skill Sets: AWS Data Engineer
Years of Experience Required: 4-8
Education Qualification: B.Tech/MBA/MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: AWS Development, Data Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
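As a sketch of how the Redshift, S3, and Lambda pieces named above can fit together, the snippet below uses the Redshift Data API to run a COPY whenever a new file lands in S3. The cluster identifier, database, role ARN, and target table are illustrative assumptions, not PwC's environment.

```python
# Illustrative S3-triggered Lambda: load each newly arrived Parquet file into Redshift
# via the Redshift Data API (no persistent database connection needed).
import boto3

redshift_data = boto3.client("redshift-data")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        copy_sql = (
            f"COPY lake.raw_ingest FROM 's3://{bucket}/{key}' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/example-copy-role' "
            "FORMAT AS PARQUET;"
        )
        resp = redshift_data.execute_statement(
            ClusterIdentifier="example-cluster",   # placeholder cluster
            Database="analytics",
            DbUser="loader",
            Sql=copy_sql,
        )
        print("submitted statement:", resp["Id"])
```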

Posted 3 days ago

Apply

10.0 years

0 Lacs

India

On-site

We are seeking an experienced and hands-on Solution Architect to lead technical design and architecture for a next-generation, API-first modular banking platform . You will collaborate with client teams, SI partners, and product stakeholders to define scalable, secure, and composable system architectures built on cloud-native principles. Key Responsibilities Lead end-to-end architecture during the discovery and design phases of digital banking transformation projects. Define solution blueprints and integration maps across platforms including Mambu, Salesforce, and nCino . Align business requirements with AWS-native architecture (ECS, Lambda, S3, Glue, Redshift). Design secure, scalable microservices-based solutions using REST, GraphQL , and event-driven frameworks (Kafka). Produce high-level and low-level architecture artefacts , including data flows, API contracts, and deployment diagrams. Recommend and integrate third-party components (KYC/AML, fraud scoring, payment gateways). Collaborate closely with Integration Specialists, DevOps, and UX/Product teams. Required Skills & Experience 10+ years in solution architecture, with at least 4–5 in banking, fintech, or digital platform environments . Proven experience with core banking systems like Mambu, nCino, Salesforce , or equivalents. Hands-on expertise in AWS services (ECS, Lambda, IAM, Glue, Redshift, CloudWatch). Strong understanding of REST/GraphQL APIs , event-driven architecture (Kafka) , and microservices . Familiarity with banking compliance, data privacy , and regulatory frameworks (SOC2, GDPR, PSD2). Excellent communication and stakeholder management skills.

Posted 3 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Exciting Opportunity at Eloelo: Join the Future of Live Streaming and Social Gaming!
Are you ready to be a part of the dynamic world of live streaming and social gaming? Look no further! Eloelo, an innovative Indian platform founded in February 2020 by ex-Flipkart executives Akshay Dubey and Saurabh Pandey, is on the lookout for passionate individuals to join our growing team in Bangalore.

About Us:
Eloelo stands at the forefront of multi-host video and audio rooms, offering a unique blend of interactive experiences, including chatrooms, PK challenges, audio rooms, and captivating live games like Lucky 7, Tambola, Tol Mol Ke Bol, and Chidiya Udd. Our platform has successfully attracted audiences from all corners of India, providing a space for social connections and immersive gaming.

Recent Milestone:
In pursuit of excellence, Eloelo secured a significant milestone by raising $22Mn in October 2023 from a diverse group of investors, including Lumikai, Waterbridge Capital, Courtside Ventures, Griffin Gaming Partners, and other esteemed new and existing contributors.

Why Eloelo?
Be a part of a team that thrives on creativity and innovation in the live streaming and social gaming space.
Rub shoulders with the stars! Eloelo regularly hosts celebrities such as Akash Chopra, Kartik Aryan, Rahul Dua, Urfi Javed, and Kiku Sharda from the Kapil Sharma Show – that is the level of celebrity collaboration you can expect.
Work with a world-class, high-performance team that constantly pushes boundaries and limits and redefines what is possible.
Fun and work in the same place, with an amazing work culture, flexible timings, and a vibrant atmosphere.

We are looking to hire a data analyst to join our data team. You will take responsibility for managing our master data set, developing reports, and troubleshooting data issues. To do well in this role you need a very fine eye for detail, experience as a data analyst, and a deep understanding of the popular data analysis tools and databases.

We're looking for:
1 to 3 years of experience working on innovative consumer internet products.
SQL Mastery: 1+ years of experience optimizing SQL for complex queries over large tables in an analytical database like Redshift, BigQuery, etc. will put you at an advantage in this role. We care most about your ability to juggle quick, time-sensitive analysis and comprehensive tear-downs of user behavior.
Table Design: Should be able to build optimized database tables for complex queries.
Data manipulation and visualization in R/Python is desirable but not compulsory.
Experience in building or working with an early-stage tech product startup would be a plus.
Ability to break down complex problems, identify use cases and solutions that can be reused across multiple areas.
Good skills with a strong grasp of both technical and business perspectives.
Proven ability to work in a fast-paced environment, and to meet changing deadlines and priorities on multiple simultaneous projects.
Enjoy working in both individual and team settings.

You will:
Be the guiding light for product, design, and business teams, providing actionable insights which will drive feature prioritization and development.
Build dashboards and set up data pipelines to turbocharge our decision-making and pace of execution.
Ensure that product decisions are based on strong logical rigor and in-depth analysis, and value data over opinions.

Bonus Points
You have gone through rapid growth in your company (from startup to mid-size).
You have experience decomposing a large monolith into microservices.
Experience in Google Firebase / WebSockets.
Experience with live streaming platforms.

If you're ready to be a part of a groundbreaking journey and contribute to the success of Eloelo, apply now! Let's redefine the future of live streaming and social gaming together. Eloelo is an equal-opportunity employer. We encourage applicants from all backgrounds to apply.

Note: Immediate joiners only.
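To illustrate the table-design and SQL-optimization skills this posting asks for, here is a hypothetical Redshift DDL and engagement query, shown as Python strings for consistency with the other examples on this page. The event table, keys, and metric are assumptions, not Eloelo's actual schema.

```python
# Hypothetical Redshift table design for high-volume livestream events, plus a
# typical engagement query. Illustrative only; not Eloelo's schema.
DDL = """
CREATE TABLE app_events (
    event_id    BIGINT      NOT NULL,
    user_id     BIGINT      NOT NULL,
    room_id     BIGINT,
    event_type  VARCHAR(40),
    event_ts    TIMESTAMP   NOT NULL
)
DISTKEY (user_id)     -- co-locate a user's events for joins against a users table
SORTKEY (event_ts);   -- keep range-restricted scans over recent time windows cheap
"""

DAU_BY_EVENT = """
SELECT event_ts::date AS activity_date,
       event_type,
       COUNT(DISTINCT user_id) AS active_users
FROM app_events
WHERE event_ts >= DATEADD(day, -30, CURRENT_DATE)
GROUP BY 1, 2
ORDER BY 1, 2;
"""

if __name__ == "__main__":
    print(DDL)
    print(DAU_BY_EVENT)
```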

Posted 3 days ago

Apply