6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Country: India
Location: Building No 12D, Floor 5, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India

Job Description
Job Title – Senior Engineer (Node.js, React, TypeScript/JavaScript & AWS)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Summary
The Lead Engineer (Full Stack) is a crucial role in the product development team at Carrier. The role focuses on the design and development of backend and frontend modules, following Carrier software development standards.

Role Responsibilities
- Design and develop AWS IoT/cloud-based applications using TypeScript, Node.js, and React.
- Work closely with onsite, offshore, and cross-functional teams (Product Management, frontend developers, SQA) to use technology effectively and deliver high-quality software on time.
- Work closely with solutions architects on low-level design.
- Plan and delegate sprint work to the development team effectively while also contributing individually.
- Proactively identify risks and failure modes early in the development lifecycle and develop POCs to mitigate those risks early in the program. This individual is self-directed, highly motivated, and organized, with strong analytical thinking and problem-solving skills and the ability to work on multiple projects in a team environment.
- Help and direct junior developers in the right direction when needed.
- Participate in peer code reviews to ensure developers follow the highest standards in implementing the product.
- Participate in PI planning and identify any technology challenges in implementing a specific Epic/Story.
- Keep an eye on NFRs and ensure the product meets all required compliances per Carrier standards.

Minimum Requirements
- 6-10 years of overall experience in the software domain.
- At least 4 years of experience with cloud-native applications on AWS.
- Solid working knowledge of TypeScript, Node.js, and React.
- Experience executing CI/CD processes.
- Experience developing APIs (REST, GraphQL, WebSockets).
- Knowledge of AWS IoT Core and in-depth knowledge of AWS cloud-native services, including Kinesis, DynamoDB, Lambda, API Gateway, Timestream, SQS, SNS, and CloudWatch.
- Solid understanding of creating AWS infrastructure using the Serverless Framework or CDK (a minimal sketch follows this listing).
- Experience implementing alerts and monitoring to support smooth operations.
- Solid understanding of the Jest framework (unit testing) and integration tests.
- Experience in cloud cost optimization and securing AWS services.

Benefits
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and our leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our Commitment to You
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice.
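The posting's preferred stack is TypeScript, but AWS CDK also ships Python bindings; purely as an illustrative sketch of the infrastructure work described (all resource and construct names here are hypothetical, not Carrier's), a serverless API slice with a Lambda handler and a DynamoDB table might be declared like this:

```python
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_dynamodb as dynamodb, aws_apigateway as apigw
from constructs import Construct

class DeviceApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # On-demand table keyed by device id (names are hypothetical).
        table = dynamodb.Table(
            self, "Devices",
            partition_key=dynamodb.Attribute(name="deviceId", type=dynamodb.AttributeType.STRING),
            billing_mode=dynamodb.BillingMode.PAY_PER_REQUEST,
        )

        # Lambda handler whose code lives in a local "lambda" directory.
        handler = _lambda.Function(
            self, "DeviceHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda"),
            environment={"TABLE_NAME": table.table_name},
        )
        table.grant_read_write_data(handler)

        # REST API fronting the handler.
        apigw.LambdaRestApi(self, "DeviceApi", handler=handler)

app = App()
DeviceApiStack(app, "DeviceApiStack")
app.synth()
```

Deployed with `cdk deploy`, this wires together three of the services the posting names (Lambda, DynamoDB, API Gateway); the same structure translates directly to the CDK's TypeScript bindings.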
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Country: India
Location: Building No 12D, Floor 5, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India

Job Title – Senior Engineer
Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Summary
The Senior Engineer (Full Stack) is a crucial role in the product development team at Carrier. The role focuses on the design and development of backend and frontend modules, following Carrier software development standards.

Role Responsibilities
- Design and develop AWS IoT/cloud-based applications using TypeScript, Node.js, and React.
- Work closely with onsite, offshore, and cross-functional teams (Product Management, frontend developers, SQA) to use technology effectively and deliver high-quality software on time.
- Work closely with solutions architects on low-level design.
- Plan and delegate sprint work to the development team effectively while also contributing individually.
- Proactively identify risks and failure modes early in the development lifecycle and develop POCs to mitigate those risks early in the program. This individual is self-directed, highly motivated, and organized, with strong analytical thinking and problem-solving skills and the ability to work on multiple projects in a team environment.
- Help and direct junior developers in the right direction when needed.
- Participate in peer code reviews to ensure developers follow the highest standards in implementing the product.
- Participate in PI planning and identify any technology challenges in implementing a specific Epic/Story.
- Keep an eye on NFRs and ensure the product meets all required compliances per Carrier standards.

Minimum Requirements
- 3-7 years of overall experience in the software domain.
- At least 2 years of experience with cloud-native applications on AWS.
- Solid working knowledge of TypeScript, Node.js, and React.
- Experience executing CI/CD processes.
- Experience developing APIs (REST, GraphQL, WebSockets).
- Knowledge of AWS IoT Core and in-depth knowledge of AWS cloud-native services, including Kinesis, DynamoDB, Lambda, API Gateway, Timestream, SQS, SNS, and CloudWatch (a minimal Lambda-to-DynamoDB sketch follows this listing).
- Solid understanding of creating AWS infrastructure using the Serverless Framework or CDK.
- Solid understanding of the Jest framework (unit testing) and integration tests.
- Knowledge of cloud cost optimization and securing AWS services.

Benefits
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and our leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our Commitment to You
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice.
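To make the Lambda/API Gateway/DynamoDB pattern above concrete: a minimal request handler, sketched in Python for brevity (the role's production code would be TypeScript/Node.js, where the structure is the same). The table name, environment variable, and payload fields are hypothetical:

```python
import json
import os
import boto3

# Hypothetical table name injected at deploy time (e.g. via CDK environment variables).
table = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])

def handler(event, context):
    """Store a telemetry reading posted through API Gateway."""
    body = json.loads(event["body"])
    table.put_item(Item={
        "deviceId": body["deviceId"],
        "timestamp": body["timestamp"],
        "temperature": body["temperature"],
    })
    return {"statusCode": 201, "body": json.dumps({"status": "stored"})}
```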
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day. The India Data & Analytics Global Capability Centre is an integral part of ACT's Global Data & Analytics Team, and the Senior Data Scientist will be a key player on this team, helping grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.

About The Role
The incumbent will be responsible for delivering advanced analytics projects that drive business results, including interpreting business needs, selecting the appropriate methodology, data cleaning, exploratory data analysis, model building, and creation of polished deliverables.

Responsibilities

Analytics & Strategy
- Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business.
- Utilize data mining, statistical, and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data.
- Apply multiple algorithms or architectures and recommend the best model, with an in-depth description, to evangelize data-driven business decisions.
- Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data.

Operational Excellence
- Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project.
- Structure hypotheses, build thoughtful analyses, develop underlying data models, and bring clarity to previously undefined problems.
- Partner with Data Engineering to build, design, and maintain core data infrastructure, pipelines, and data workflows to automate dashboards and analyses.

Stakeholder Engagement
- Work collaboratively across multiple sets of stakeholders (business functions, data engineers, data visualization experts) to deliver on project deliverables.
- Articulate complex data science models to business teams and present the insights in easily understandable and innovative formats.

Job Requirements

Education
- Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.).
- Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.).

Relevant Experience
- 3-4 years of relevant working experience in a data science/advanced analytics role.

Behavioural Skills
- Delivery excellence
- Business disposition
- Social intelligence
- Innovation and agility

Knowledge
- Functional analytics (supply chain analytics, marketing analytics, customer analytics).
- Statistical modelling using analytical tools (R, Python, KNIME, etc.) and big data technologies.
- Knowledge of statistics and experimental design: A/B testing, hypothesis testing, causal inference (a minimal example follows this listing).
- Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference.
- Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.).
- Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.).
- Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.).
- Enterprise reporting systems; relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems; and data engineering tools.
- Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.).
- Microsoft Office applications (MS Excel, etc.).
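The A/B testing bullet above can be illustrated with a two-proportion z-test, the workhorse test for comparing conversion rates between two experiment arms. This is a generic sketch, not ACT's methodology; the counts are invented and statsmodels is assumed to be available:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment: conversions for a control vs. a new promotion layout.
conversions = [420, 468]   # successes observed in each arm
visitors = [10000, 10000]  # total observations in each arm

# Two-sided z-test for a difference in the two conversion rates.
z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# Reject the null hypothesis of equal rates at alpha = 0.05 if p_value < 0.05.
```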
Posted 2 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting – AWS Staff-Senior

The Opportunity
We are looking for a skilled AWS Data Engineer to join our growing data team. This role involves building and managing scalable data pipelines that ingest, process, and store data from various sources using modern AWS technologies. You will work with both batch and streaming data and contribute to a robust, scalable data architecture supporting analytics, BI, and data science use cases. As a problem-solver with a keen ability to diagnose a client’s unique needs, you should be able to see the gap between where clients currently are and where they need to be, and be capable of creating a blueprint to help clients achieve their end goal.

Key Responsibilities:
- Design and implement data ingestion pipelines from various sources, including on-premise Oracle databases, batch files, and Confluent Kafka.
- Develop Python producers and AWS Glue jobs for batch data processing (a minimal batch-transform sketch follows this listing).
- Build and manage Spark streaming applications on Amazon EMR.
- Architect and maintain Medallion Architecture-based data lakes on Amazon S3.
- Develop and maintain data sinks in Redshift and Oracle.
- Automate and orchestrate workflows using Apache Airflow.
- Monitor, debug, and optimize data pipelines for performance and reliability.
- Collaborate with cross-functional teams including data analysts, scientists, and DevOps.

Required Skills and Experience:
- Good programming skills in Python and Spark (PySpark).
- Hands-on experience with Amazon S3, Glue, and EMR.
- Good SQL knowledge of Amazon Redshift and Oracle.
- Proven experience handling streaming data with Kafka and building real-time pipelines.
- Good understanding of data modeling, ETL frameworks, and performance tuning.
- Experience with workflow orchestration tools like Airflow.

Nice-to-Have Skills:
- Infrastructure as Code using Terraform.
- Experience with AWS services like SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
- Familiarity with DataSync for file movement and medallion architecture for data lakes.
- Monitoring and alerting using CloudWatch, Datadog, or Splunk.

Qualifications: BTech / MTech / MCA / MBA

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
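As a hedged illustration of the Glue/EMR batch work described above: a plain PySpark job that cleans a bronze dataset and writes a partitioned silver copy, following the medallion layout the posting mentions. The bucket paths and column names are hypothetical; inside Glue, the same logic would run in a Glue job script.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Hypothetical S3 locations following a medallion (bronze/silver) layout.
bronze = "s3://example-lake/bronze/orders/"
silver = "s3://example-lake/silver/orders/"

(spark.read.json(bronze)
    .dropDuplicates(["order_id"])                       # basic cleansing
    .withColumn("order_ts", F.to_timestamp("order_ts")) # normalize timestamps
    .withColumn("order_date", F.to_date("order_ts"))    # derive partition column
    .filter(F.col("amount") > 0)                        # drop invalid rows
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet(silver))
```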
Posted 2 weeks ago
1.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Description

Python Developer with AWS

At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.

Job Title: Python Developer with AWS
CGI Position: Software Engineer
Experience: 1-4 years
Main Location: Bangalore (first preference), Gurgaon
Employment Type: Full Time

Your Future Duties and Responsibilities
The incumbent should have hands-on experience in Python for data engineering and in delivering data-related projects using AWS technologies such as S3, Lambda, DynamoDB, RDS PostgreSQL, Step Functions, Kinesis, Docker, and K8s, along with working knowledge of the AWS environment (S3, CLI, Lambda, RDS PostgreSQL). The incumbent will be responsible for understanding and developing data flows for contact centres based on Amazon Connect, using Python, Lambda, S3, Kinesis, and PostgreSQL, and will work towards creating a positive and innovation-friendly environment.
- Programming: Python, SQL (intermediate), boto3, AWS SDK.
- Design and develop data flows using AWS Lambda, S3, Kinesis, and PostgreSQL (a minimal sketch follows this listing).
- Experience with Amazon Connect to build call-centre contact flows will be an added advantage.

Accountability / Core Responsibility
- Strong experience delivering projects using Python.
- Exposure to working in a global environment, having delivered at least 1-2 projects in Python.
- Delivery collaboration and coordination with multiple business partners.
- Must have good experience in leading projects.

Good to Have
- Industry knowledge of the insurance industry, with proven experience across multiple clients.
- Experience implementing the developed methodology on the cloud, using AWS services like S3, Lambda, SQS, SNS, RDS PostgreSQL, and DynamoDB.
- Call-centre experience using Amazon Connect to build contact flows (IVR flows) will be an added advantage.

Required Qualifications to Be Successful in This Role
Bachelor of Technology (B.Tech) in Computer Engineering or Master of Computer Applications (MCA).

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team, one of the largest IT and business consulting services firms in the world.
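For the data flows described (Lambda consuming a Kinesis stream and landing data in S3), here is a minimal sketch of the standard Kinesis-to-Lambda event shape. The bucket name and payload fields are hypothetical, not CGI's:

```python
import base64
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-contact-centre-events"  # hypothetical archive bucket

def handler(event, context):
    """Decode contact-centre events arriving from a Kinesis stream and archive them to S3."""
    for record in event["Records"]:
        # Kinesis delivers each record's data base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        key = f"events/{payload['contactId']}/{record['kinesis']['sequenceNumber']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode("utf-8"))
    return {"processed": len(event["Records"])}
```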
Posted 2 weeks ago
5.0 - 8.0 years
22 - 25 Lacs
Bengaluru
Work from Office
Requirements
Years of Experience: 5+ years in GIS development, with a strong emphasis on web-based GIS applications and tools.

Job Description:
We are looking for an experienced and innovative Senior GIS Developer to join our team. The ideal candidate will have an in-depth understanding of Geographic Information Systems (GIS) principles, coupled with expertise in web development technologies. You will play a critical role in designing, implementing, and maintaining GIS applications, leveraging modern web frameworks and tools. This position offers the opportunity to work on cutting-edge GIS projects and contribute to the advancement of spatial data analysis and visualization.

Required Skills:

GIS Fundamentals
- Strong understanding of Geographic Information Systems (GIS) principles and concepts.
- Expertise in spatial data management, map projections, and coordinate systems.
- Proficiency with GIS tools and libraries, including Leaflet, ArcGIS, or Mapbox.
- Familiarity with GIS software (e.g., ArcGIS, QGIS) and geospatial data formats (e.g., shapefiles, GeoJSON); a minimal GeoJSON-parsing sketch follows this listing.

Web Development Skills
- Proficiency in JavaScript libraries and frameworks.
- Strong front-end development skills in HTML, CSS, and responsive design principles.
- Experience with web frameworks such as Angular (preferred) or similar frameworks.

Desired Skills:

Server-Side Development
- Familiarity with server-side languages like Node.js or Python.
- Experience working with RESTful APIs and web services.

Database Management
- Knowledge of spatial databases such as PostGIS, or experience with SQL/NoSQL databases like PostgreSQL, MongoDB, or DynamoDB.

DevOps and Cloud Technologies
- Familiarity with CI/CD pipelines for efficient deployment processes.
- Basic understanding of cloud platforms like AWS, Azure, or Google Cloud.

Testing and Quality Assurance
- Experience with automated testing tools and frameworks.

Soft Skills
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
- Ability to work effectively in an agile environment.

Responsibilities:
- Design, develop, and maintain GIS applications with a focus on performance, scalability, and usability.
- Implement innovative solutions for spatial data visualization and analysis.
- Collaborate with cross-functional teams, including product owners, designers, and developers, to deliver high-quality GIS solutions.
- Optimize application performance and ensure seamless integration with GIS tools and libraries.
- Stay updated on emerging GIS and web development technologies and trends.
- Provide technical guidance and mentorship to junior developers in the team.

This role is ideal for someone passionate about GIS and web technologies, looking to take on a leadership position in a challenging and rewarding environment.
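Since the posting lists GeoJSON among the required formats (and Python among the server-side options), here is a minimal standard-library sketch that walks a GeoJSON FeatureCollection. One gotcha it highlights: GeoJSON stores coordinates as [longitude, latitude], not [latitude, longitude]. The file name and feature properties are hypothetical:

```python
import json

# Minimal GeoJSON handling with only the standard library; the file is hypothetical.
with open("stores.geojson") as f:
    collection = json.load(f)

for feature in collection["features"]:
    geom = feature["geometry"]
    props = feature.get("properties", {})
    if geom["type"] == "Point":
        lon, lat = geom["coordinates"]  # GeoJSON coordinate order is [longitude, latitude]
        print(f"{props.get('name', 'unnamed')}: ({lat:.4f}, {lon:.4f})")
```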
Posted 2 weeks ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Title: Data Architect
Employment Type: Full time (hybrid)
Location: Hyderabad

Position Summary:
As a Data Architect, you will play a pivotal role in leading and mentoring data engineering teams, architecting and designing robust data solutions on AWS, and serving as the primary technical point of contact for clients. You will leverage your deep expertise in AWS cloud technologies, data engineering best practices, and leadership skills to drive successful project delivery and ensure client satisfaction.

Key Responsibilities:
- Technical Leadership: Provide technical guidance and mentorship to data engineering teams, fostering a culture of innovation and continuous improvement.
- Solution Architecture: Design and architect scalable, high-performance data solutions on AWS, leveraging services such as S3, Glue, Lambda, EMR, Redshift, DynamoDB, Kinesis, and Athena to meet client requirements (a minimal Athena query sketch follows this listing).
- Client Engagement: Serve as the primary technical point of contact for clients, understanding their business needs and translating them into effective data solutions.
- Project Management: Lead and manage end-to-end project delivery, ensuring projects are completed on time, within budget, and to the highest quality standards.
- Data Modeling: Develop and maintain comprehensive data models that support analytics, reporting, and machine learning use cases.
- Performance Optimization: Continuously monitor and optimize data pipelines and systems for performance and cost-efficiency, utilizing tools like CloudWatch and AWS Cost Explorer.
- Technology Evangelism: Stay abreast of the latest AWS technologies and industry trends, and advocate for their adoption within the organization.
- Team Collaboration: Foster strong collaboration with data scientists, analysts, and other stakeholders to ensure alignment and successful project outcomes.

Technical Skills and Expertise:
- AWS Services: Deep understanding of core AWS services for data engineering, including S3, Glue, Lambda, EMR, Redshift, DynamoDB, Kinesis, Athena, CloudWatch, and IAM.
- Programming Languages: Proficiency in Python, SQL, and PySpark for data processing, transformation, and analysis.
- Data Warehousing and ETL: Expertise in designing and implementing data warehousing solutions, ETL processes, and data modeling techniques.
- Infrastructure as Code (IaC): Experience with tools like CloudFormation or Terraform to automate and manage infrastructure provisioning.
- Big Data Technologies: Familiarity with big data frameworks like Apache Spark and Hadoop for handling large-scale data processing.

Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 10+ years of experience in data engineering, with at least 3+ years in a technical leadership role.
- Deep expertise in AWS cloud technologies and data services.
- Proven track record of architecting and delivering complex data solutions on AWS.
- Strong leadership, communication, and client-facing skills.
- Experience in data modeling, ETL processes, and data warehousing.
- Excellent problem-solving, analytical, and decision-making skills.
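A hedged sketch of the Athena-based analytics workflow the posting alludes to, using boto3. The database, table, and results bucket are hypothetical, and production code would add error handling and exponential backoff around the polling loop:

```python
import time
import boto3

athena = boto3.client("athena")

# Hypothetical database, table, and results location.
query = "SELECT order_date, SUM(amount) AS revenue FROM silver_orders GROUP BY order_date"
qid = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(f"{len(rows) - 1} result rows")  # the first row is the header
```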
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad, India - Hybrid

About The Role
As a Senior Software Engineer at Deliveroo, your individual work contributes to achieving goals in multiple teams. While you will work with your team and lead projects, some of your work will contribute outside of your direct remit. You will report to managers and group leads and together deliver the results.

Technical Execution: What You'll Be Doing
You will improve code structure and architecture and review code of any scope produced by your team. The role also includes maximising the efficiency of your team by leading team project planning, foreseeing dependencies and risks, and constructively partnering with other disciplines (e.g. PM, Experience). You'll aim to simplify the maintenance and operation of production systems, promoting visibility, operational readiness, and the health of your team's systems.

Collaboration & Leadership
As well as leading from the front on technical execution, you'll build relationships with other engineering teams and identify collaboration opportunities. You'll break down large pieces of work, guide design and technical/implementation choices, and influence the roadmap within your team. You will take an active role in the hiring process and conduct engineering interviews. This also extends to the current team, where you will support the personal growth of colleagues, encouraging efficiency in their roles.

Requirements
We want to emphasise that we don't expect you to meet all of the below, but would love you to have experience in some of these areas:
- Pride in readable, well-designed, well-tested software.
- Experience writing web-based applications in any language, and an interest in learning (Go, Ruby/Rails, Python, Scala, or Rust).
- Familiarity and practical experience with relational databases (PostgreSQL, MySQL).
- Familiarity and practical experience with web architecture at scale (20k rpm and above).
- Familiarity and practical experience with "NoSQL" data backends such as Redis, DynamoDB, Elasticsearch, and Memcache (a minimal cache-aside sketch follows this listing).

Why Deliveroo?
Our mission is to transform the way you shop and eat, bringing the neighbourhood to your door by connecting consumers, restaurants, shops and riders. We are transforming the way the world eats and shops by making access to food and products more convenient and enjoyable. We give people the opportunity to buy what they want, as they want it, when and where they want it. We are a technology-driven company at the forefront of the most rapidly expanding industry in the world. We are still a small team making a very large impact, looking to answer some of the most interesting questions out there. We move fast, value autonomy and ownership, and we are always looking for new ideas.

Workplace & Benefits
At Deliveroo we know that people are the heart of the business and we prioritise their welfare. Benefits differ by country, but we offer many benefits in areas including healthcare, well-being, parental leave, pensions, and generous annual leave allowances, including time off to support a charitable cause of your choice. Benefits are country-specific; please ask your recruiter for more information.

Diversity
At Deliveroo, we believe a great workplace is one that represents the world we live in and how beautifully diverse it can be. That means we have no judgement when it comes to any one of the things that make you who you are: your gender, race, sexuality, religion, or a secret aversion to coriander. All you need is a passion for (most) food and a desire to be part of one of the fastest-growing businesses in a rapidly growing industry. We are committed to diversity, equity and inclusion in all aspects of our hiring process. We recognise that some candidates may require adjustments to apply for a position or fairly participate in the interview process. If you require any adjustments, please don't hesitate to let us know. We will make every effort to provide the necessary adjustments to ensure you have an equitable opportunity to succeed.
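A hedged sketch of the cache-aside pattern behind the Redis/NoSQL bullet above, written in Python for brevity (Deliveroo's own services may use Go or Ruby; the primary-store lookup here is a stand-in):

```python
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_restaurant(restaurant_id: int) -> dict:
    """Cache-aside read: try Redis first, fall back to the primary store on a miss."""
    key = f"restaurant:{restaurant_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    record = load_from_primary_store(restaurant_id)  # hypothetical database lookup
    r.setex(key, 300, json.dumps(record))            # cache the result for 5 minutes
    return record

def load_from_primary_store(restaurant_id: int) -> dict:
    # Stand-in for a real PostgreSQL/DynamoDB query.
    return {"id": restaurant_id, "name": "Example Kitchen", "rating": 4.6}
```

The TTL bounds staleness while absorbing hot-key read traffic, which is the usual reason a high-throughput service puts Redis or Memcache in front of its relational store.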
Posted 2 weeks ago
0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting – AWS Staff-Senior

The Opportunity
We are looking for a skilled AWS Data Engineer to join our growing data team. This role involves building and managing scalable data pipelines that ingest, process, and store data from various sources using modern AWS technologies. You will work with both batch and streaming data and contribute to a robust, scalable data architecture supporting analytics, BI, and data science use cases. As a problem-solver with a keen ability to diagnose a client’s unique needs, you should be able to see the gap between where clients currently are and where they need to be, and be capable of creating a blueprint to help clients achieve their end goal.

Key Responsibilities:
- Design and implement data ingestion pipelines from various sources, including on-premise Oracle databases, batch files, and Confluent Kafka.
- Develop Python producers and AWS Glue jobs for batch data processing.
- Build and manage Spark streaming applications on Amazon EMR.
- Architect and maintain Medallion Architecture-based data lakes on Amazon S3.
- Develop and maintain data sinks in Redshift and Oracle.
- Automate and orchestrate workflows using Apache Airflow.
- Monitor, debug, and optimize data pipelines for performance and reliability.
- Collaborate with cross-functional teams including data analysts, scientists, and DevOps.

Required Skills and Experience:
- Good programming skills in Python and Spark (PySpark).
- Hands-on experience with Amazon S3, Glue, and EMR.
- Good SQL knowledge of Amazon Redshift and Oracle.
- Proven experience handling streaming data with Kafka and building real-time pipelines.
- Good understanding of data modeling, ETL frameworks, and performance tuning.
- Experience with workflow orchestration tools like Airflow.

Nice-to-Have Skills:
- Infrastructure as Code using Terraform.
- Experience with AWS services like SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
- Familiarity with DataSync for file movement and medallion architecture for data lakes.
- Monitoring and alerting using CloudWatch, Datadog, or Splunk.

Qualifications: BTech / MTech / MCA / MBA

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting – AWS Staff-Senior

The Opportunity
We are looking for a skilled AWS Data Engineer to join our growing data team. This role involves building and managing scalable data pipelines that ingest, process, and store data from various sources using modern AWS technologies. You will work with both batch and streaming data and contribute to a robust, scalable data architecture supporting analytics, BI, and data science use cases. As a problem-solver with a keen ability to diagnose a client’s unique needs, you should be able to see the gap between where clients currently are and where they need to be, and be capable of creating a blueprint to help clients achieve their end goal.

Key Responsibilities:
- Design and implement data ingestion pipelines from various sources, including on-premise Oracle databases, batch files, and Confluent Kafka.
- Develop Python producers and AWS Glue jobs for batch data processing.
- Build and manage Spark streaming applications on Amazon EMR.
- Architect and maintain Medallion Architecture-based data lakes on Amazon S3.
- Develop and maintain data sinks in Redshift and Oracle.
- Automate and orchestrate workflows using Apache Airflow.
- Monitor, debug, and optimize data pipelines for performance and reliability.
- Collaborate with cross-functional teams including data analysts, scientists, and DevOps.

Required Skills and Experience:
- Good programming skills in Python and Spark (PySpark).
- Hands-on experience with Amazon S3, Glue, and EMR.
- Good SQL knowledge of Amazon Redshift and Oracle.
- Proven experience handling streaming data with Kafka and building real-time pipelines.
- Good understanding of data modeling, ETL frameworks, and performance tuning.
- Experience with workflow orchestration tools like Airflow.

Nice-to-Have Skills:
- Infrastructure as Code using Terraform.
- Experience with AWS services like SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
- Familiarity with DataSync for file movement and medallion architecture for data lakes.
- Monitoring and alerting using CloudWatch, Datadog, or Splunk.

Qualifications: BTech / MTech / MCA / MBA

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions.
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision.
- Identifying and helping address potential risks in the data supply chain.
- Following and contributing to technical standards.
- Designing and developing analytical data models.

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course).
- 3 to 4 years' experience implementing data-intensive solutions using agile methodologies.
- Experience with relational databases and using SQL for data querying, transformation, and manipulation.
- Experience modelling data for analytical consumers.
- Ability to automate and streamline the build, test, and deployment of data pipelines.
- Experience with cloud-native technologies and patterns.
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training.
- Excellent communication and problem-solving skills.

Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica.
- Big Data: Exposure to 'big data' platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
- Data Warehousing & Database Management: Understanding of data warehousing concepts and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
- DevOps: Exposure to concepts and enablers: CI/CD platforms, version control, automated quality control management.

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs and the ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows.
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of underlying architectures and trade-offs.
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta (a minimal Parquet sketch follows this listing).
- Others: Basics of job schedulers like Autosys; basics of entitlement management.

Certification on any of the above topics would be an advantage.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
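To make the file-formats bullet concrete: a minimal pyarrow sketch that writes and then selectively reads a Parquet file. The columnar layout is what lets the read-back prune to just the columns a query needs, which is why analytical pipelines favour Parquet over row-oriented formats. The records below are invented for illustration:

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Hypothetical trade records; Parquet stores them column-wise with an embedded schema.
table = pa.table({
    "trade_id": pa.array([1001, 1002, 1003], type=pa.int64()),
    "symbol": pa.array(["AAPL", "MSFT", "AAPL"]),
    "qty": pa.array([100, 250, 75], type=pa.int32()),
})

pq.write_table(table, "trades.parquet", compression="snappy")

# Column pruning: read back only the columns a downstream query needs.
subset = pq.read_table("trades.parquet", columns=["symbol", "qty"])
print(subset.to_pydict())
```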
Posted 2 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting – AWS Staff-Senior

The Opportunity
We are looking for a skilled AWS Data Engineer to join our growing data team. This role involves building and managing scalable data pipelines that ingest, process, and store data from various sources using modern AWS technologies. You will work with both batch and streaming data and contribute to a robust, scalable data architecture supporting analytics, BI, and data science use cases. As a problem-solver with a keen ability to diagnose a client’s unique needs, you should be able to see the gap between where clients currently are and where they need to be, and be capable of creating a blueprint to help clients achieve their end goal.

Key Responsibilities:
- Design and implement data ingestion pipelines from various sources, including on-premise Oracle databases, batch files, and Confluent Kafka.
- Develop Python producers and AWS Glue jobs for batch data processing.
- Build and manage Spark streaming applications on Amazon EMR.
- Architect and maintain Medallion Architecture-based data lakes on Amazon S3.
- Develop and maintain data sinks in Redshift and Oracle.
- Automate and orchestrate workflows using Apache Airflow.
- Monitor, debug, and optimize data pipelines for performance and reliability.
- Collaborate with cross-functional teams including data analysts, scientists, and DevOps.

Required Skills and Experience:
- Good programming skills in Python and Spark (PySpark).
- Hands-on experience with Amazon S3, Glue, and EMR.
- Good SQL knowledge of Amazon Redshift and Oracle.
- Proven experience handling streaming data with Kafka and building real-time pipelines.
- Good understanding of data modeling, ETL frameworks, and performance tuning.
- Experience with workflow orchestration tools like Airflow.

Nice-to-Have Skills:
- Infrastructure as Code using Terraform.
- Experience with AWS services like SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
- Familiarity with DataSync for file movement and medallion architecture for data lakes.
- Monitoring and alerting using CloudWatch, Datadog, or Splunk.

Qualifications: BTech / MTech / MCA / MBA

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Role Grade Level (for internal use): 13 The Team: We are seeking a seasoned engineering leader to join us and lead our technology team. In this role, you will be leading by example and responsible for executing our strategy to modernize the existing platform and making it scalable and cost efficient. You’ll work closely with cross-functional teams to ensure seamless transitions and optimal performance. Responsibilities And Impact In this role, you will have the opportunity to lead a highly skilled and technical team currently working in Agile model, ensuring we meet our customer requirements and deliver impactful quality software. Moreover, you are required to exhibit the below responsibilities as well: Execute the engineering strategy, ensuring alignment with business objectives, technology roadmaps, and industry trends. Lead and oversee multiple engineering teams, fostering a high-performance culture focused on innovation, scalability, and delivery excellence. Architect and govern large-scale, distributed systems and enterprise-level solutions, ensuring technical best practices and design principles are followed. Shape the technology vision, evaluating emerging trends and recommending strategic investments in tools, frameworks, and infrastructure. Establish and enforce engineering excellence, including coding standards, hygiene, architectural guidelines, security practices, and automation. Lead technical governance and decision-making, balancing innovation with risk management, cost efficiency, and long-term maintainability. Collaborate with software architects and developers to assess existing applications. Design and implement modernization strategies, including refactoring, containerization, and microservices adoption. Develop and maintain scalable, secure, and efficient solutions on AWS. Optimize application performance, reliability, and scalability. Conduct code reviews and provide constructive feedback. Troubleshoot and resolve issues related to application modernization. Stay up to date with industry trends and best practices in development and AWS services. What We’re Looking For Bachelor's degree in computer science, Engineering, or related field. 15+ years of experience in software development with a strong focus on AWS and .NET technologies 8+ years of experience in leading the engineering teams Proven experience in technical leadership, mentoring engineers, and driving architectural decisions. Expert proficiency in C# and .NET Core. Advanced SQL programming with expertise in database performance tuning for large-scale datasets. Strong experience with relational (MS SQL, PostgreSQL) or NoSQL databases (MongoDB, DynamoDB, etc.). Knowledge of UI, Python is a plus. Hands on and design level experience in designing AWS cloud-native services. Strong knowledge about CI/CD for automated deployments. Hands-on experience with large-scale messaging systems or commercial equivalents. Proven ability to lead and mentor engineering teams, fostering a culture of technical excellence and innovation. Strong problem-solving skills and ability to work in a collaborative manner. Excellent communication and teamwork abilities. 
Basic Required Qualifications Education & Experience: Bachelor’s degree in computer science, Software Engineering, or a related field (or equivalent practical experience) Soft Skills Strong problem-solving skills and attention to detail Excellent communication skills and the ability to collaborate in a team environment Ability to handle multiple tasks and meet deadlines in a fast-paced environment About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. 
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 316302
Posted On: 2025-05-29
Location: Noida, Uttar Pradesh, India
Posted 2 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
We are looking for candidates who will work with the product development team to develop and maintain high-quality mobile applications for our software. You'll collaborate with internal teams to develop functional mobile applications while working in a fast-paced environment. You should be technically strong in programming, especially in popular application development frameworks and the technologies and skills mentioned below. You will work with the team/project in various phases of the SDLC, including design, system architecture, test case design, deployment, and technology research. Ultimately, you should be able to design and build the next generation of our mobile applications, and be a go-getter with good initiative and interpersonal skills.
Desired Profile:
Overall 5+ years in software development, with at least 3 years developing PHP and MySQL applications.
Must have PHP, MySQL, DynamoDB, and WordPress.
Experience with AI, e-business development, cloud computing and design, or distributed architecture will carry extra weightage for your profile.
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Attention all Python Full Stack Developers! Join a dynamic and growing company where you can utilize your leadership skills and technical expertise to drive innovation and make a real impact. We are seeking a talented and motivated Full Stack Developer to join our dynamic team. Our ideal candidate should have a passion for technology, a strong understanding of software development principles, and a desire to continuously learn and grow their skills.
As a Full Stack Developer, you will work on a variety of projects, developing server-side logic for a market-leading, highly accessed commercial digital content platform based on a high-performance consumer electronics device. You will be responsible for overseeing the development process from start to finish, working closely with our design and product teams to ensure that our applications meet the needs of our users and exceed their expectations. You will work with the latest technologies and frameworks and will have the opportunity to shape the direction of our development strategy as we grow and evolve. We offer a supportive and inclusive work environment and opportunities for professional growth and advancement. If you are ready to take your career to the next level and build the future, apply today!
Primary Skills
Strong understanding of database management systems (DBMS) such as MySQL, PostgreSQL, or Oracle.
Strong proficiency in front-end technologies (HTML, CSS, JavaScript, React, Angular, or Vue.js).
Experience with database design and data modeling.
Strong experience as a Full Stack Developer with expertise in Python and web frameworks, including FastAPI or other popular API frameworks in Python.
Experience with microservices-based system design and deployment.
Experience working with third-party collaboration (e.g. TCMS).
Experience building high-performance and scalable web services.
Solid understanding of MVC, stateless APIs, and building RESTful APIs (see the sketch after this posting).
Strong sense of ownership for the end product and passion for writing high-quality, well-architected code.
Experience with one or more NoSQL databases such as DynamoDB or Redis.
Experience with MySQL or other relational database systems.
Experience with version control concepts and Git.
Experience with AWS and AWS infrastructure, platform, and services is an advantage.
Experience and comfort building cloud-native applications.
Familiarity with frontend development.
Familiarity with code versioning tools such as Git.
Strong problem-solving and analytical skills.
Excellent communication, presentation, and collaboration skills.
Passion for staying up to date with the latest industry trends and technologies.
Bilingual (Japanese) will be a plus.
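For readers unfamiliar with the stateless RESTful style this posting asks about, here is a minimal FastAPI sketch. The `Item` model, routes, and in-memory store are hypothetical illustrations, not part of the actual platform.

```python
# Minimal sketch of a stateless RESTful endpoint with FastAPI.
# The Item model and route names are hypothetical, for illustration only.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

# In-memory store stands in for a real database (e.g. MySQL or DynamoDB).
items: dict[int, Item] = {}

@app.post("/items/{item_id}", status_code=201)
def create_item(item_id: int, item: Item) -> Item:
    if item_id in items:
        raise HTTPException(status_code=409, detail="Item already exists")
    items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    return items[item_id]
```

Run locally with `uvicorn app:app --reload`; each request carries all the state it needs, which is what makes the API stateless and horizontally scalable.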
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers, including the public sector. Are you a passionate Business Intelligence Engineer (BI Engineer) who wants to make a real impact on a big business? Read on!
We are looking for seasoned Business Intelligence Engineers to join a BI team in the AWS Support organization. AWS Support, despite the impression that its name ("support") might give you, is not a typical reactive customer service, but an independent AWS business that provides industry-leading offerings well beyond just "break-fix". The business has a self-standing P&L and has even yielded revenue larger than most AWS services. Managing an organization that provides support for the ever-expanding AWS product portfolio to millions of customers across the globe is not a simple task, and leaders in all functions inevitably need reliable data for their day-to-day operations and strategic decision-making. The BI team is an integrated core part of AWS Support operations, delivering robust and trustworthy data infrastructure to a truly data-driven organization.
As a BI Engineer, you will work closely with internal stakeholders to define key performance indicators (KPIs) through your deep understanding of the support business and operations, and implement those KPIs in dashboards and reports that drive the decisions made by senior leadership. You will have opportunities not just to exercise technical skills such as SQL to retrieve data and convert it into simple graphs and tables, but to develop true dashboards that are information-rich and flexible, yet intuitive and easy to use, helping the Support leadership quickly discover golden insights to serve AWS customers better. You will let the data answer questions such as "What does high or low quality mean at Support?", "Is this particular customer happy with their support experience?", and "How productive is this support agent versus other agents in the network?"
The data that the AWS Support BI team publishes is also used by other AWS teams outside the organization. Support data serves as a key leading indicator for new customer demands and for problems yet to be discovered. Such external usage might lead to the "next big thing" launched by AWS!
From a technology perspective, you will have opportunities (and will be asked) to work with modern, agile, cloud-based data technologies. The team is empowered to select the right technology, if necessary, based on customer needs, and you will have the full set of AWS services in your toolbox.
Key job responsibilities
Understand problems that are loosely defined or structured.
Provide BI solutions for difficult problems and work on delivering large BI solutions.
Provide solutions that drive the team's business decisions and highlight new opportunities.
Improve code quality and optimize BI processes.
Basic understanding of a scripting language.
Know how to model data and design a data pipeline.
Able to apply basic statistical methods (e.g. regression) to difficult business problems.
About The Team
Diverse Experiences
Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply.
If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
Why AWS
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.
Mentorship and Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
Basic Qualifications
3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience with statistical analysis packages such as R, SAS and Matlab
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling (see the sketch after this posting)
Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Company - AWS India - Karnataka
Job ID: A2825020
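To make the "SQL to pull data, Python to process it" qualification concrete, here is a minimal sketch. The table and column names (`support_cases`, `resolution_hours`), the cluster endpoint, and the credentials are hypothetical placeholders, not an actual AWS Support schema.

```python
# Minimal sketch: pull a KPI with SQL, then shape it in Python for reporting.
# Connection details, table name (support_cases), and columns are hypothetical.
import pandas as pd
import psycopg2  # commonly used against Redshift's PostgreSQL-compatible endpoint

QUERY = """
    SELECT date_trunc('week', created_at) AS week,
           avg(resolution_hours)          AS avg_resolution_hours
    FROM support_cases
    WHERE created_at >= dateadd(day, -90, current_date)
    GROUP BY 1
    ORDER BY 1;
"""

conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",  # placeholder endpoint
    dbname="analytics", user="bi_user", password="...", port=5439,
)
df = pd.read_sql(QUERY, conn)
conn.close()

# Simple post-processing: flag weeks where resolution time regressed week over week.
df["wow_change"] = df["avg_resolution_hours"].pct_change()
print(df[df["wow_change"] > 0.10])  # weeks that degraded by more than 10%
```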
Posted 2 weeks ago
7.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work and a strong culture of connection and global inclusion mean that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.
See yourself at Twilio
Join the team as Twilio's next Staff Machine Learning Engineer on our Identity Resolution team.
About The Job
Twilio is a leader in providing innovative solutions to complex data challenges, helping businesses better understand their customers. Twilio is at the forefront of transforming how businesses understand and engage with their customers. Leveraging advanced identity resolution technologies, we unify and enrich disparate data sources to create comprehensive customer profiles. As a Staff Engineer specializing in machine learning and feature engineering on the Identity Resolution team, you will drive the development of sophisticated ML models and advanced feature engineering techniques designed to enhance our identity resolution capabilities. Your work will play a critical role in improving our systems' ability to match and unify customer identities accurately, enabling more personalized customer experiences and strategic business insights. Your role will also involve building robust data infrastructure to support and scale our machine learning initiatives.
To thrive in this role, you must have a deep background in ML engineering and a consistent track record of solving data and machine-learning problems at scale. You are a self-starter, embody a growth attitude, and collaborate effectively across the entire Twilio organization.
Responsibilities
In this role, you'll:
Design, implement, and refine machine learning models that improve the precision and recall of identity resolution algorithms.
Develop and optimize feature engineering methodologies to extract meaningful patterns from large and complex datasets that enhance identity matching and unification (an illustrative sketch follows this posting).
Develop and maintain scalable data infrastructure to support the deployment and training of machine learning models, ensuring that they run efficiently under varying loads.
Build and maintain scalable machine learning solutions in production.
Train and validate both deep learning-based and statistical models, considering use case, complexity, performance, and robustness.
Demonstrate end-to-end understanding of applications and develop a deep understanding of the "why" behind our models and systems.
Partner with product managers, tech leads, and stakeholders to analyze business problems, clarify requirements and define the scope of the systems needed.
Ensure high standards of operational excellence by implementing efficient processes, monitoring system performance, and proactively addressing potential issues.
Drive engineering best practices around code reviews, automated testing and monitoring.
Qualifications
Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio.
We are always looking for people who will bring something new to the table!
Required:
7+ years of applied ML experience.
Proficiency in Python, Java or Golang is preferred.
Extensive experience in feature engineering and developing data-driven frameworks that enhance identity matching algorithms.
Strong background in the foundations of machine learning and the building blocks of modern deep learning.
Deep understanding of machine learning frameworks and libraries such as TensorFlow, PyTorch, or Scikit-learn.
Experience with big data technologies like Apache Spark or Hadoop, and familiarity with cloud platforms (AWS, Azure, Google Cloud) for scalable data processing.
Familiarity with MLOps concepts for testing and maintaining models in production, such as testing, retraining, and monitoring.
Experience with modern data storage, messaging, and processing tools (Kafka, Apache Spark, Hadoop, Presto, DynamoDB, etc.), and demonstrated experience designing and coding against big-data components such as DynamoDB or similar.
Experience working in an agile team environment with changing priorities.
Experience working on AWS.
Desired:
Exposure to the advertising technology and marketing technology domains.
Experience designing and implementing highly available, performant, and fault-tolerant distributed systems that provide durable and (eventually) consistent results.
Experience with Large Language Models.
Location
This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu, Telangana).
Travel
We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may be required to travel occasionally to participate in project or team in-person meetings.
What We Offer
Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.
Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
Twilio is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, please contact us at accommodations@twilio.com.
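As a rough illustration of the pairwise feature engineering this role involves, here is a minimal sketch. The record fields, features, and similarity measure are invented for demonstration; this is not Twilio's actual identity resolution method.

```python
# Illustrative sketch of pairwise feature engineering for identity matching.
# The features and record fields are invented for demonstration only.
from difflib import SequenceMatcher

def string_similarity(a: str, b: str) -> float:
    """Normalized similarity between two strings in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def pair_features(rec_a: dict, rec_b: dict) -> dict:
    """Turn a pair of customer records into features for a match classifier."""
    return {
        "name_sim": string_similarity(rec_a["name"], rec_b["name"]),
        "email_domain_match": float(
            rec_a["email"].split("@")[-1] == rec_b["email"].split("@")[-1]
        ),
        "phone_last4_match": float(rec_a["phone"][-4:] == rec_b["phone"][-4:]),
    }

a = {"name": "Jon Smith", "email": "jon@example.com", "phone": "+15555550123"}
b = {"name": "John Smith", "email": "j.smith@example.com", "phone": "+15555550123"}
print(pair_features(a, b))
# A downstream model (e.g. logistic regression or gradient boosting) would
# consume these features to predict whether the two records are the same person.
```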
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Data Engineer – Data Quality, Ingestion & API Development
Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality and validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.
Key Responsibilities
Data Ingestion Framework:
Design & Development: Architect, develop, and maintain an end-to-end data ingestion framework that efficiently extracts, transforms, and loads data from diverse sources.
Framework Optimization: Use AWS services such as AWS Glue, Lambda, EMR, ECS, EC2 and Step Functions to build highly scalable, resilient, and automated data pipelines.
Data Quality & Validation:
Validation Processes: Develop and implement automated data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of incoming data (see the sketch after this posting).
Monitoring & Reporting: Establish comprehensive monitoring, logging, and alerting systems to proactively identify and resolve data quality issues.
API Development:
Design & Implementation: Architect and develop secure, high-performance APIs to enable seamless integration of data services with external applications and internal systems.
Documentation & Best Practices: Create thorough API documentation and establish standards for API security, versioning, and performance optimization.
Collaboration & Agile Practices:
Cross-Functional Communication: Work closely with business stakeholders, data scientists, and operations teams to understand requirements and translate them into technical solutions.
Agile Development: Participate in sprint planning, code reviews, and agile ceremonies, while contributing to continuous improvement initiatives and CI/CD pipeline development (using tools like GitLab).
Required Qualifications
Experience & Technical Skills:
Professional Background: At least 5 years of relevant experience in data engineering with a strong emphasis on analytical platform development.
Programming Skills: Proficiency in Python and/or PySpark and SQL for developing ETL processes and handling large-scale data manipulation.
AWS Expertise: Extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or Redshift.
API Development: Proven experience in designing and implementing RESTful APIs and integrating them with external and internal systems.
CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with GitLab) and Agile development methodologies.
Soft Skills:
Strong problem-solving abilities and attention to detail.
Excellent communication and interpersonal skills with the ability to work independently and collaboratively.
Capacity to quickly learn and adapt to new technologies and evolving business requirements.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience with additional AWS services such as Kinesis, Firehose, and SQS.
Familiarity with data lakehouse architectures and modern data quality frameworks.
Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.
Skills: CI/CD, AWS, Python, Agile, PySpark, data engineering, data quality & validation, AWS services (Glue, Lambda, EMR, ECS, EC2, Step Functions), SQL, data ingestion framework, data, API development
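Here is a minimal sketch of the kind of automated data-quality check this posting describes, written as an AWS Lambda handler. The required fields, threshold, and record format are hypothetical examples of the validation routines the role would own, not a known production schema.

```python
# Minimal sketch of an automated data-quality check in an AWS Lambda handler.
# The required fields, threshold, and record format are hypothetical examples
# of the validation routines this role would own.
import json

REQUIRED_FIELDS = {"customer_id", "event_type", "timestamp"}
MAX_NULL_RATE = 0.05  # reject batches where >5% of records miss required fields

def validate_batch(records: list[dict]) -> dict:
    bad = [r for r in records if not REQUIRED_FIELDS.issubset(r.keys())]
    null_rate = len(bad) / len(records) if records else 0.0
    return {
        "total": len(records),
        "invalid": len(bad),
        "null_rate": null_rate,
        "passed": null_rate <= MAX_NULL_RATE,
    }

def handler(event, context):
    # Records arrive as JSON lines in the event body (e.g. from S3 or Kinesis).
    records = [json.loads(line) for line in event["body"].splitlines() if line]
    report = validate_batch(records)
    if not report["passed"]:
        # In a real pipeline this would route the batch to a dead-letter queue
        # and emit an alert (e.g. a CloudWatch metric plus SNS notification).
        raise ValueError(f"Data quality check failed: {report}")
    return report
```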
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
About Amazon.com: Amazon.com strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.
About Team
The RBS team is an integral part of the Amazon online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space with the best price, wide selection and good product information. The team's primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and the online user experience.
Overview Of The Role
The ideal candidate will be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. You will be detail-oriented and organized, capable of handling multiple projects at once, and comfortable dealing with ambiguity and rapidly changing priorities. You will have expertise in process optimization and systems thinking, and will be required to engage directly with multiple internal teams to drive business projects and automation for the RBS team. Candidates must be successful both as individual contributors and in a team environment, and must be customer-centric. Our environment is fast-paced and requires someone who is flexible, detail-oriented, and comfortable working in a deadline-driven environment.
Responsibilities Include
Work across teams and the Ops organization at country, regional and/or cross-regional level to drive improvements and implement solutions for customers, including cost savings in process workflow, systems configuration and performance metrics.
Basic Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field
Proficiency in automation using Python (see the sketch after this posting)
Excellent oral and written communication skills
Experience with SQL, ETL processes, or data transformation
Preferred Qualifications
Experience with scripting and automation tools
Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK
Knowledge of AWS services such as SQS, SNS, CloudWatch and DynamoDB
Understanding of DevOps practices, including CI/CD pipelines and monitoring solutions
Understanding of cloud services, serverless architecture, and systems integration
Key job responsibilities
As a Business Intelligence Engineer on the team, you will collaborate closely with business partners to architect, design, implement, and deliver BI projects and automations, building scalable data solutions and analytical insight. You will work with a team of highly motivated business intelligence engineers and business analysts, and contribute to the design, implementation, and delivery of complex BI solutions. Adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
Basic Qualifications
Develop innovative BI and data analytics solutions leveraging advanced machine learning models to solve complex problems, build intelligent knowledge bases, and create conversational interfaces for enhanced business insights and recommendations.
Drive projects on data automation, governance and standardisation.
5+ years of relevant professional experience in business intelligence, analytics, statistics, data engineering, data science or a related field.
Experience with programming languages and packages such as Python, TypeScript or R.
Experience with data modeling, SQL, ETL, data warehousing and data lakes.
Strong experience with engineering and operations best practices (version control, data quality/testing, monitoring, etc.).
Expert-level SQL.
Proficiency with one or more general-purpose programming languages (e.g. Python, Java, TypeScript, Scala, etc.).
Knowledge of AWS products such as Redshift, Quicksight, and Lambda.
Excellent verbal/written communication and data presentation skills, including the ability to succinctly summarize key findings and effectively communicate with both business and technical teams.
Preferred Qualifications
Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK.
Experience with AWS solutions such as Step Functions, EC2, DynamoDB, S3.
Knowledge of machine learning techniques and concepts.
Understanding of cloud services, serverless architecture, and systems integration.
Strong understanding of machine learning concepts, including the development and deployment of ML models to solve complex problems.
Basic Qualifications
3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools.
Experience with data modeling, warehousing and building ETL pipelines.
Experience with statistical analysis packages such as R, SAS and Matlab.
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling.
Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift.
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Company - ADCI MAA 15 SEZ - K20
Job ID: A2899180
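To illustrate the "automation using Python" qualification against the AWS services this posting names (SQS, CloudWatch), here is a minimal sketch. The queue URL, metric namespace, and metric name are hypothetical placeholders, not real RBS resources.

```python
# Minimal sketch of Python automation against AWS services named in this
# posting (SQS, CloudWatch). The queue URL and metric names are hypothetical.
import boto3

sqs = boto3.client("sqs")
cloudwatch = boto3.client("cloudwatch")

QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/catalog-updates"  # placeholder

def drain_and_report(max_messages: int = 10) -> int:
    """Poll a work queue, process messages, and publish a throughput metric."""
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=max_messages)
    messages = resp.get("Messages", [])
    for msg in messages:
        # Real processing (catalog update, validation, etc.) would go here.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
    cloudwatch.put_metric_data(
        Namespace="RBS/Automation",  # hypothetical namespace
        MetricData=[{"MetricName": "MessagesProcessed", "Value": len(messages), "Unit": "Count"}],
    )
    return len(messages)

if __name__ == "__main__":
    print(f"Processed {drain_and_report()} messages")
```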
Posted 2 weeks ago
4.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview: NationsBenefits is the leading provider of supplemental benefits, flex cards, and member engagement solutions that partner with managed care organizations to provide innovative healthcare solutions designed to drive growth, improve outcomes, reduce costs, and delight members. Our comprehensive suite of innovative supplemental benefits, payments platform, and member engagement solutions helps health plans deliver high-quality benefits to their members, addressing social determinants of health and improving member health outcomes and satisfaction. With a compliance-focused infrastructure, proprietary technology systems, and a premier service delivery model, we enable our health plan partners to deliver high-quality, value-based care to millions of members.
We offer a fulfilling work environment that attracts top talent and encourages all associates to do their part in delivering premier service to internal and external customers alike. It's how we're transforming the healthcare industry for the better. We provide career advancement opportunities within the organization, with multiple locations in Florida, California, Pennsylvania, Tennessee, Texas, Utah, and India. You might also like to know that NationsBenefits is recognized as one of the fastest growing companies in America. We're proud of how far we've come, and a career with us also gives you growth opportunities.
Job Summary: We are looking for a highly skilled .NET Backend Developer with 4-10 years of experience to join our dynamic team in Hyderabad. The ideal candidate should have strong expertise in C#, .NET Core, Web API, and SQL Server, along with experience in cloud technologies such as Azure or AWS. The role involves designing, developing, and optimizing scalable backend systems for enterprise applications.
Key Responsibilities:
Develop and maintain robust, scalable, and high-performance backend solutions using .NET Core / .NET Framework.
Design, implement, and manage RESTful APIs and web services.
Optimize application performance and ensure security best practices are followed.
Work with SQL Server / PostgreSQL / MySQL, writing complex queries and stored procedures and optimizing database performance.
Implement cloud-based solutions using Azure / AWS, including serverless computing, microservices, and containerization.
Collaborate with front-end developers, architects, and other stakeholders to define and implement new features.
Utilize unit testing frameworks (NUnit, xUnit, MSTest) for writing testable code.
Work with DevOps teams to implement CI/CD pipelines for smooth deployment.
Troubleshoot production issues and provide resolutions in a timely manner.
Required Skills & Experience:
Strong programming skills in C# and .NET Core / .NET Framework (at least 4 years of hands-on experience).
Expertise in building and consuming REST APIs using Web API.
Experience with SQL Server / PostgreSQL / MySQL, database design, indexing, and query optimization.
Proficiency in Entity Framework / Dapper for ORM.
Knowledge of microservices architecture and event-driven development (RabbitMQ, Kafka, etc.).
Experience with Azure / AWS services (Azure Functions, Lambda, CosmosDB, API Gateway, etc.).
Strong understanding of authentication and authorization (JWT, OAuth, OpenID Connect).
Hands-on experience with Docker and Kubernetes (preferred but not mandatory).
Exposure to DevOps and CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins, etc.).
Excellent debugging, problem-solving, and analytical skills.
Preferred Skills:
Experience with NoSQL databases like MongoDB, Redis, or DynamoDB.
Knowledge of gRPC, GraphQL, or WebSockets.
Understanding of clean architecture, DDD (Domain-Driven Design), and SOLID principles.
Familiarity with Agile development methodologies (Scrum, Kanban).
Benefits:
Competitive salary and performance-based incentives.
Flexible work hours and a hybrid work model.
Exposure to the latest cloud technologies and enterprise-level projects.
Learning and development opportunities.
Health insurance and other benefits.
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
About Zeller
We believe that businesses of all sizes deserve better financial services and payment products. Australian businesses are amazingly entrepreneurial, driven and passionate, yet when they seek important products from local financial services companies, they are let down by slow applications, protracted onboarding, opaque pricing and restrictive contracts, and forced to use outdated solutions that no longer meet the innovative requirements of a modern business. Our company, backed by leading VCs, is a collective of experienced payment and tech industry professionals who are aiming to redefine business banking and the way Australian businesses get paid by their customers. With an exciting roadmap of innovative new products under development, we are building a high-performing team to take on the incumbents. If you are passionate about innovation, thrive in fast-moving environments, love a challenge, hate bureaucracy and can't think of anything more exciting than disrupting the banks, we're putting together a team you might want to join.
Role description
As Zeller continues to scale rapidly, we're seeking a talented Software + Site Reliability Engineer to join our innovative team and be part of a 24/7 team responsible for a mission-critical system. As a Software + Site Reliability Engineer, you will be responsible for developing and deploying new software systems as well as enhancing and operating existing ones. This means you will wear two hats, as both a Senior Software Engineer and a Site Reliability Engineer. You will get satisfaction from the high-impact contribution of your work and fulfilment from mindsets that embrace teamwork, a high care factor and the motto "You build it, you own it". You will have excellent software engineering skills and knowledge of event-sourcing architecture and cloud-native architecture in AWS.
Skills And Qualifications
Minimum of a Bachelor's degree in software engineering (or related); 5-9 years' experience as both a Software Engineer and in DevOps (or the equivalent of Site Reliability)
Strong background in software engineering and design patterns
Mastery of cloud-native application development in AWS, including serverless (Lambda, DynamoDB) and container-based (ECS) solutions
Knowledge of architecture patterns such as CQRS and event-sourcing (see the sketch after this posting)
Proficient in Typescript, NodeJS, Java
Expertise in API design (RESTful, GraphQL, Webhooks) and database management (SQL, NoSQL)
Passion for clean code, automated testing (TDD, BDD), and maintaining zero technical debt
Track record of supporting rapid, agile deployments across multiple environments
Proven track record in developing and maintaining mission-critical, high-load production systems with a 99.999% SLA
Your attributes
Loves challenging the status quo
Ability to work autonomously yet collaboratively
Prepared to be bold yet consistent with your engineering principles
Logical, ethical, mature and responsible
Fast learner, humble and loves to share knowledge
Calm, maintaining a positive level of stress in exceptional circumstances such as production issues and timeline requirements
Bonus points
Experience working within a high-growth environment
Familiarity with other cloud platforms (Azure, Google)
Experience in other programming languages
Experience with PCI-compliant environments (PCI-DSS, etc.)
What's in it for you
Be part of something big from the outset
Watch your design work put up in lights
Enjoy a balanced, progressive, and supportive work culture
Opportunities for rapid growth and learning
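The event-sourcing pattern named above can be illustrated with a minimal, language-agnostic sketch; it is shown in Python for brevity, though this role's stack is TypeScript/NodeJS, and the account/event names are invented.

```python
# Minimal event-sourcing sketch: state is derived by replaying an append-only
# event log rather than stored directly. Shown in Python for brevity; the
# account/event names are invented and the role's stack is TypeScript/NodeJS.
from dataclasses import dataclass

@dataclass(frozen=True)
class Deposited:
    amount: int  # amounts in cents

@dataclass(frozen=True)
class Withdrawn:
    amount: int

class Account:
    def __init__(self) -> None:
        self.events: list = []   # append-only log (e.g. a DynamoDB stream in production)
        self.balance = 0         # derived state, never written directly

    def apply(self, event) -> None:
        if isinstance(event, Deposited):
            self.balance += event.amount
        elif isinstance(event, Withdrawn):
            self.balance -= event.amount

    def record(self, event) -> None:
        self.events.append(event)  # persist the event first...
        self.apply(event)          # ...then update the read model (the CQRS query side)

    @classmethod
    def replay(cls, events: list) -> "Account":
        """Rebuild current state from the full event history."""
        acct = cls()
        for e in events:
            acct.apply(e)
        return acct

acct = Account()
acct.record(Deposited(10_000))
acct.record(Withdrawn(2_500))
assert Account.replay(acct.events).balance == acct.balance == 7_500
```

The design payoff is that the log is the source of truth: any read model can be rebuilt, audited, or corrected by replaying events, which is what makes the pattern attractive for mission-critical payment systems.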
Posted 2 weeks ago
5.0 years
0 Lacs
Ernakulam, Kerala, India
On-site
Job Title: Senior Data Engineer – Data Quality, Ingestion & API Development
Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality and validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.
Key Responsibilities
Data Ingestion Framework:
Design & Development: Architect, develop, and maintain an end-to-end data ingestion framework that efficiently extracts, transforms, and loads data from diverse sources.
Framework Optimization: Use AWS services such as AWS Glue, Lambda, EMR, ECS, EC2 and Step Functions to build highly scalable, resilient, and automated data pipelines.
Data Quality & Validation:
Validation Processes: Develop and implement automated data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of incoming data.
Monitoring & Reporting: Establish comprehensive monitoring, logging, and alerting systems to proactively identify and resolve data quality issues.
API Development:
Design & Implementation: Architect and develop secure, high-performance APIs to enable seamless integration of data services with external applications and internal systems.
Documentation & Best Practices: Create thorough API documentation and establish standards for API security, versioning, and performance optimization.
Collaboration & Agile Practices:
Cross-Functional Communication: Work closely with business stakeholders, data scientists, and operations teams to understand requirements and translate them into technical solutions.
Agile Development: Participate in sprint planning, code reviews, and agile ceremonies, while contributing to continuous improvement initiatives and CI/CD pipeline development (using tools like GitLab).
Required Qualifications
Experience & Technical Skills:
Professional Background: At least 5 years of relevant experience in data engineering with a strong emphasis on analytical platform development.
Programming Skills: Proficiency in Python and/or PySpark and SQL for developing ETL processes and handling large-scale data manipulation.
AWS Expertise: Extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or Redshift.
API Development: Proven experience in designing and implementing RESTful APIs and integrating them with external and internal systems.
CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with GitLab) and Agile development methodologies.
Soft Skills:
Strong problem-solving abilities and attention to detail.
Excellent communication and interpersonal skills with the ability to work independently and collaboratively.
Capacity to quickly learn and adapt to new technologies and evolving business requirements.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience with additional AWS services such as Kinesis, Firehose, and SQS.
Familiarity with data lakehouse architectures and modern data quality frameworks.
Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.
Skills: CI/CD, AWS, Python, Agile, PySpark, data engineering, data quality & validation, AWS services (Glue, Lambda, EMR, ECS, EC2, Step Functions), SQL, data ingestion framework, data, API development
Posted 2 weeks ago
0 years
7 - 9 Lacs
Hyderābād
On-site
Description
Join DAZN Hyderabad and help shape the future of sports streaming! We're looking for passionate Backend Developers with strong Node.js experience to build scalable services that power our global OTT platform.
Key Responsibilities
Design and develop backend services using Node.js with TypeScript
Build scalable APIs and microservices aligned with high availability and performance
Work with cross-functional teams including frontend, DevOps, and QA
Write clean, maintainable code with solid unit test coverage
Ensure code quality through code reviews, CI/CD pipelines, and logging
Skills, Knowledge & Expertise
Backend: Node.js (ExpressJS, NestJS, or Fastify), TypeScript
Databases: DynamoDB, DocumentDB, PostgreSQL, MySQL
Messaging & Streaming: SNS, SQS, Kinesis
Cloud: AWS (Lambda, ECS, EC2, API Gateway, RDS)
Caching: Redis
CI/CD: GitHub Actions
Experience with alternatives like MongoDB, Kafka, RabbitMQ, or Azure/GCP is a plus
What We're Looking For
Strong proficiency in Node.js and TypeScript
Hands-on experience in designing RESTful APIs and microservices
Exposure to AWS cloud services
Familiarity with Agile development and DevOps practices
Excellent problem-solving skills and an ownership mindset
About DAZN
At DAZN, we bring ambition to life. We are innovators, game-changers and pioneers. So, if you want to push boundaries and make an impact, DAZN is the place to be. As part of our team, you'll have the opportunity to make your mark and the power to make change happen. We're doing things no-one has done before, giving fans and customers access to sport anytime, anywhere. We're using world-class technology to transform sports and revolutionise the industry, and we're not going to stop.
DAZN VALUES – THE 'HOW' IN WHAT WE DO:
Agility and creativity fuel growth and innovation, to Make It Happen.
Prioritising what matters drives progress and positive outcomes, Focusing On Impact.
Collective ambition builds optimism and success, in order to Win As One.
At DAZN, we are committed to fostering an inclusive environment that values equality and diversity, where everyone can contribute and have their voices heard. This means hiring and developing talent across all races, ethnicities, religions, age groups, sexual orientations, gender identities and abilities. Everyone has the opportunity to make change and impact our DEI journey by joining our ERGs: Proud@DAZN, Women@DAZN, Disability@DAZN and ParentZone.
If you'd like to include a cover letter with your application, please feel free to. Please do not feel you need to apply with a photo or disclose any other information that is not related to your professional experience.
Our aim is to make our hiring processes as accessible for everyone as possible, including providing adjustments for interviews where we can. We look forward to hearing from you.
Posted 2 weeks ago