4.0 years
18 - 30 Lacs
India
Remote
Job Title: Senior Golang Backend Developer Company Type: IT Services Company Employment Type: Full-Time Location: Ahmedabad / Rajkot (Preferred) or 100% Remote (Open) Experience Required: 4+ Years (Minimum 3.5 years of hands-on experience with Golang) About The Role We are hiring a Senior Golang Backend Developer for a leading service-based tech company based in Ahmedabad. If you're a passionate backend engineer who thrives on building scalable APIs, working on microservices architecture, and deploying applications using serverless frameworks on AWS, this role is for you! This is a full-time opportunity, and while we prefer candidates who can work from Ahmedabad or Rajkot, we're also open to 100% remote working for the right candidate. Key Responsibilities Design, build, and maintain RESTful APIs and backend services using Golang Develop scalable solutions using Microservices Architecture Optimize system performance, reliability, and maintainability Work with AWS Cloud Services (Lambda, SQS, SNS, S3, DynamoDB, etc.) and implement Serverless Architecture Ensure clean, maintainable code through best practices and code reviews Collaborate with cross-functional teams for smooth integration and architecture decisions Monitor, troubleshoot, and improve application performance using observability tools Implement CI/CD pipelines and participate in Agile development practices Required Skills & Experience 4+ years of total backend development experience 3.5+ years of strong, hands-on experience with Golang Proficient in designing and developing RESTful APIs Solid understanding and implementation experience of Microservices Architecture Proficient in AWS cloud services, especially: Lambda, SQS, SNS, S3, DynamoDB Experience with Serverless Architecture Familiarity with Docker, Kubernetes, GitHub Actions/GitLab CI Understanding of concurrent programming and performance optimization Experience with observability and monitoring tools (e.g., DataDog, Prometheus, New Relic, OpenTelemetry) Strong communication skills and ability to work in Agile teams Fluency in English communication is a must Nice to Have Experience with Domain-Driven Design (DDD) Familiarity with automated testing frameworks (TDD/BDD) Prior experience working in distributed remote teams Why You Should Apply Opportunity to work with modern tools and cloud-native technologies Flexibility to work remotely or from Ahmedabad/Rajkot Supportive, collaborative, and inclusive team culture Competitive salary with opportunities for growth and upskilling Skills: amazon sqs,cloud development,restful apis,observability tools,cloud services,gitlab ci,github actions,golang,agile development,behavior-driven development (bdd),s3,aws lambda,microservices,docker,go (golang),aws,microservices architecture,serverless architecture,amazon web services (aws),kubernetes,domain-driven design (ddd),backend development,cloud
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society. About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering—reflecting its strategic commitment to driving innovation and value for clients across industries. Job Title: Senior Software Engineer (AWS/Java) Location: Hyderabad, Pune Experience: 5+ Years Job Type: Contract to hire. Notice Period: Immediate joiners. Detailed JD: Senior Software Engineer (AWS/Java). We are looking for a talented, experienced Senior Software Engineer with expertise in AWS cloud services, TypeScript, and Java development for our engineering team. Responsibilities: • Implement cloud applications using AWS services, TypeScript, and Java. • Write clean, maintainable, and efficient code while adhering to best practices and coding standards. • Work closely with product managers and engineers to define and refine requirements. • Provide technical guidance and mentorship to junior engineers in the team. • Troubleshoot and resolve complex technical issues and performance bottlenecks. • Create and maintain technical documentation for code and processes. • Stay up-to-date with industry trends and emerging technologies to continuously improve our development practices. Mandatory Skills: • 5+ years of software development experience with a focus on AWS cloud development and distributed application development with Java & J2EE. • 1+ years of experience in AWS development using TypeScript; candidates who have not worked with TypeScript should be willing to learn it, as TypeScript is the preferred language for AWS development per Principal standards. • Hands-on experience deploying applications on AWS cloud infrastructure (e.g., EC2, Lambda, S3, DynamoDB, RDS, API Gateway, EventBridge, SQS, SNS, Fargate, etc.). • Strong hands-on experience in Java/J2EE, Spring, and Spring Boot development, and a good understanding of serverless computing. • Experience with REST APIs and Java shared libraries. Good to have: • AWS Cloud Practitioner, AWS Certified Developer, or AWS Certified Solutions Architect certification is a plus. Requirements: • Strong knowledge of Java development/versioning tools like IntelliJ, Git, and Maven. • Installation, configuration, and integration of tools for creating the required development environment. • Experience handling install failures and install updates and supporting local issues is a plus. • Understanding of application server technology. • Strong analytical and problem-solving skills with keen attention to detail. • Excellent verbal and written communication skills with the ability to articulate complex technical concepts to various audiences. • Experience working in agile development environments and familiarity with CI/CD pipelines. • Consistently raises the bar by going beyond day-to-day performance expectations. Qualifications: • Bachelor's degree in engineering or a related field Seniority Level Mid-Senior level Industry IT Services and IT Consulting Employment Type Contract Job Functions Business Development Skills Amazon Web Services (AWS) Git Java Attention to Detail TypeScript Written Communication Software Development Jakarta EE Object-Oriented Programming (OOP) Back-End Web Development
Posted 2 weeks ago
7.0 years
0 Lacs
India
On-site
We are seeking a highly skilled Full Stack Developer to join our dynamic team. The ideal candidate will have strong front-end and back-end development experience with a passion for building scalable, high-quality applications. This role requires hands-on expertise in React.js, JavaScript, Python, and various AWS services, including Amplify, Lambda, Node.js, AppSync, GraphQL, and ElastiCache. Key Responsibilities: Develop and maintain full stack web applications using React.js, JavaScript, and Python. Design and implement RESTful and GraphQL APIs using AWS services like AppSync and Lambda. Integrate and manage backend services through AWS Amplify, ensuring seamless deployment and CI/CD. Work with AWS ElastiCache to optimize application performance and scalability. Collaborate with DevOps for infrastructure automation and serverless architecture best practices. Design, query, and manage data across various databases (SQL and NoSQL). Troubleshoot and debug issues across the stack. Participate in code reviews and contribute to improving code quality and team practices. Required Skills & Qualifications: Proficiency in React.js and modern JavaScript (ES6+). Strong experience with Python for backend development. Hands-on experience with AWS services, including Amplify, Lambda (Node.js), AppSync, GraphQL, and ElastiCache. Working knowledge of Node.js as a runtime environment. Experience with databases – both relational (e.g., PostgreSQL, MySQL) and/or NoSQL (e.g., DynamoDB, MongoDB) preferred. Solid understanding of version control systems (e.g., Git). Strong problem-solving skills and ability to work independently and in a team. Preferred Qualifications: Experience with CI/CD pipelines and serverless application development. Familiarity with containerization technologies like Docker. Understanding of caching strategies and performance optimization. Experience: 7+ Years
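Purely as an illustrative sketch (not part of the posting): one way a Python Lambda function can serve as an AppSync direct resolver for the GraphQL APIs described above. The field names, table name, and environment variable are hypothetical.

```python
# Hypothetical AppSync direct Lambda resolver: AppSync passes the GraphQL field name
# and arguments in the event payload; the handler routes to the matching data-access call.
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("ORDERS_TABLE", "orders"))  # hypothetical table name


def handler(event, context):
    field = event["info"]["fieldName"]        # e.g. "getOrder" or "listOrders" (hypothetical fields)
    args = event.get("arguments", {})

    if field == "getOrder":
        resp = table.get_item(Key={"id": args["id"]})
        return resp.get("Item")               # AppSync maps this into the GraphQL response
    if field == "listOrders":
        resp = table.scan(Limit=args.get("limit", 20))
        return resp.get("Items", [])
    raise ValueError(f"Unhandled GraphQL field: {field}")
```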
Posted 2 weeks ago
0.0 - 1.0 years
0 - 0 Lacs
Andheri West, Mumbai, Maharashtra
On-site
Job Title: Backend Developer - MERN Stack (Only for Mumbai Candidates) Location: Andheri West, Mumbai (On-site) Employment Type: Full-Time Working Days & Time: 6 days a week, 10:00 AM to 7:00 PM. Experience: 1 to 2 years of strong backend MERN experience will be considered. Role Overview We are looking for a Backend MERN Stack Developer who specializes in building scalable, high-performance, and secure backend systems using Node.js, Express.js, and MongoDB. The ideal candidate should have deep backend knowledge and experience integrating APIs, managing databases, and deploying production-ready systems. What You'll Do Develop and maintain backend services using Node.js and Express.js Design and manage databases using MongoDB (and optionally MySQL/PostgreSQL) Build, integrate, and consume RESTful APIs (GraphQL is a bonus) Implement authentication and security (OAuth, JWT, RBAC) Optimize backend performance and troubleshoot system issues Work closely with frontend developers for seamless integration Deploy applications via CI/CD, Docker, and cloud platforms like AWS or Azure Write unit and integration tests using Jest, Mocha, etc. Required Technical Skills Languages & Frameworks: JavaScript (ES6+), Node.js, Express.js (or Nest.js) Deep understanding of async programming, event loop, and modular code Databases: MongoDB (primary), experience with PostgreSQL, MySQL, or Redis is a plus ORM tools like Mongoose or Sequelize Security: JWT, OAuth, session handling, RBAC Prevention of common vulnerabilities (XSS, CSRF, SQL injection) DevOps & Deployment: Experience with Docker, CI/CD pipelines Deployment to AWS, Azure, DigitalOcean, or Heroku Real-Time & Messaging: WebSockets / Socket.IO Message queues like Kafka, RabbitMQ, AWS SQS Bonus (Nice to Have) TypeScript Serverless experience (AWS Lambda, Google Cloud Functions) Swagger/Postman for API documentation Familiarity with NoSQL databases like DynamoDB or CouchDB Basic understanding of React.js (for integration purposes) Job Types: Full-time, Permanent Pay: ₹40,000.00 - ₹60,000.00 per year Benefits: Cell phone reimbursement Paid time off Education: Bachelor's (Required) Experience: AWS: 1 year (Required) Express.js: 1 year (Required) Node.js: 1 year (Required) MongoDB: 1 year (Required) Docker: 1 year (Required) JavaScript (ES6+), Node.js, Express.js (or Nest.js): 1 year (Required) Implement authentication and security (OAuth, JWT, RBAC): 1 year (Required) RESTful APIs: 1 year (Required) Serverless experience (AWS Lambda, Google Cloud Functions): 1 year (Required) Swagger/Postman for API documentation: 1 year (Required) Location: Andheri West, Mumbai, Maharashtra (Required) Work Location: In person Speak with the employer +91 9220904193
Posted 2 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- Bachelor's in Computer Science, or a related field. - 6+ years of non-internship professional software development experience contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems - Familiarity with SQL and NoSQL databases with deep expertise using Python. - Experience with distributed version control such as Git and basic knowledge of Linux environments AWS Infrastructure Services owns the design, planning, delivery, and operation of all AWS global infrastructure. In other words, we’re the people who keep the cloud running. We support all AWS data centers and all of the servers, storage, networking, power, and cooling equipment that ensure our customers have continual access to the innovation they rely on. We work on the most challenging problems, with thousands of variables impacting the supply chain — and we’re looking for talented people who want to help. You’ll join a diverse team of software, hardware, and network engineers, supply chain specialists, security experts, operations managers, and other vital roles. You’ll collaborate with people across AWS to help us deliver the highest standards for safety and security while providing seemingly infinite capacity at the lowest possible cost for our customers. And you’ll experience an inclusive culture that welcomes bold ideas and empowers you to own them to completion. AIS is seeking a highly motivated and passionate Back End Data Engineer who is responsible for designing, developing, testing, and deploying Supply Chain Application and Process Automation. In this role you will collaborate with business leaders, work backwards from customers, identify problems, propose innovative solutions, relentlessly raise standards, and have a positive impact on AWS Infrastructure Supply Chain & Procurement. In this, you will work closely with a team of Business Intelligence Engineers and Data Scientists to architect the application programming interface (API) and user Interface (UI) in context with the business outcomes. You will be using the best of available tools, including EC2, Lambda, DynamoDB, and Elastic Search. You will be responsible for the full software development life cycle to build scalable application and deploy in AWS Cloud. Key job responsibilities In this job, you will: • Work with business leaders, Business Intelligence Engineers, and Data Scientists to ideate business friendly software solutions. • Design client-side and server-side architecture. • Develop visually appealing front end website architecture, including translating designer mock-ups and wireframes into front-end code. • Develop functional databases, applications, and servers to support websites on the back end. • Write effective APIs. • Test software to ensure responsiveness and efficiency. • Troubleshoot, debug and upgrade applications. • Create security and data protection settings. • Build features and applications with a mobile responsive design. • Develop technical specifications and write technical documentation. About the team Diverse Experiences Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. 
We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship and Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Preferred qualifications: Master’s degree or higher in Computer Science or a related field. Experience working with cloud technologies. Experience with CI/CD pipelines for code deployment. Knowledge of AWS infrastructure. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
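For illustration only: a hedged sketch of the kind of Lambda-backed API this role describes, handling an API Gateway proxy request in Python and persisting a record to DynamoDB. The table name, environment variable, and fields are hypothetical.

```python
# Hypothetical API Gateway (Lambda proxy integration) handler that stores a record in DynamoDB.
import json
import os
import uuid

import boto3

TABLE_NAME = os.environ.get("ITEMS_TABLE", "supply-chain-items")  # hypothetical table
table = boto3.resource("dynamodb").Table(TABLE_NAME)


def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    item = {
        "id": str(uuid.uuid4()),
        "name": body.get("name", ""),
        "status": "NEW",
    }
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }
```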
Posted 2 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 5+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience with forecasting and statistical analysis - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Amazon’s ROW (Rest of World) Supply Chain Analytics team is looking for talented Business Intelligence Engineers who develop solutions to better manage/optimize speed and operations planning while providing the best experience to our customers at the lowest possible price. Our team members have an opportunity to be at the forefront of supply chain thought leadership by working on some of the most difficult problems with some of the best research scientists, product/program managers, software developers and business leaders in the industry, shaping our roadmap to drive real impact on Amazon's long-term profitability. We are an agile team, building new analysis from ground up, proposing new concepts and technology to meet business needs, and enjoy and excel at diving into data to analyze root causes and implement long-term solutions. As a BIE within the group, you will analyze massive data sets, identify areas to improve, define metrics to measure and monitor programs, build models to predict and optimize and most importantly work with different stakeholders to drive improvements over time. You will also work closely with internal business teams to extract or mine information from our existing systems to create new analysis, build analytical products and cause impact across wider teams in intuitive ways. This position provides opportunities to influence high visibility/high impact areas in the organization. They are right a lot, work very efficiently, and routinely deliver results on time. They have a global view of the analytical and/or science solutions that they build and consistently think in terms of automating, expanding, and scaling the results broadly. This position also requires you to work across a variety of teams, including transportation, operations, finance, delivery experience, people experience and platform (software) teams. Successful candidates must thrive in fast-paced environments which encourage collaborative and creative problem solving, be able to measure and estimate risks, constructively critique peer research, extract and manipulate data across various data marts, and align research focuses on Amazon’s strategic needs. We are looking for people with a flair for recognizing trends and patterns while correlating it to the business problem at hand. If you have an uncanny ability to decipher the exact policy/mechanism/solution to address the challenge and ability to influence folks using hard data (and some tact) then we are looking for you! 
Key job responsibilities • Analysis of historical data to identify trends and support decision making, including written and verbal presentation of results and recommendations • Collaborating with product and software development teams to implement analytics systems and data structures to support large-scale data analysis and delivery of analytical and machine learning models • Mining and manipulating data from database tables, simulation results, and log files • Identifying data needs and driving data quality improvement projects • Understanding the broad range of Amazon’s data resources, which to use, how, and when • Thought leadership on data mining and analysis • Modeling complex/abstract problems and discovering insights and developing solutions/products using statistics, data mining, science/machine-learning and visualization techniques • Helping to automate processes by developing deep-dive tools, metrics, and dashboards to communicate insights to the business teams • Collaborating effectively with internal end-users, cross-functional software development teams, and technical support/sustaining engineering teams to solve problems and implement new solutions. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
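As a hedged illustration of the SQL-plus-Python workflow this role calls for, the sketch below pulls aggregated data from a warehouse with SQL and flags outliers in pandas. The connection string, table, and column names are placeholders, not real Amazon systems.

```python
# Illustrative only: pull aggregated shipment data with SQL, then flag slow region-weeks in pandas.
# Connection string, table, and columns are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5439/analytics")

query = """
    SELECT ship_week, region, COUNT(*) AS shipments, AVG(transit_days) AS avg_transit_days
    FROM shipments
    GROUP BY ship_week, region
"""

df = pd.read_sql(query, engine)

# Compare each region-week against the region's overall average transit time.
df["region_avg"] = df.groupby("region")["avg_transit_days"].transform("mean")
slow = df[df["avg_transit_days"] > 1.2 * df["region_avg"]]
print(slow.sort_values("avg_transit_days", ascending=False).head(10))
```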
Posted 2 weeks ago
10.0 years
0 Lacs
Sydney, New South Wales, Australia
On-site
Red Prairie Warehouse Management Support Engineer Ideally, the applicant would have 10+ years of experience. 6 – 10 years’ experience as a Configuration Analyst and Developer in Red Prairie WMS/JDA Dispatcher • Gather business requirements and produce the technical design document. • Configure RP/JDA Dispatcher WMS based on the design and business needs. • Customize and develop complex PL/SQL queries, stored procedures, triggers, tables, and views based on the requirements. • Develop UNIX shell scripts to support WMS development and customization. • Implement the configuration and code for the site/customer in Dev, Testing, and Production environments. • Perform unit testing of the configuration and code in RP/JDA WMS. • Support integration testing, UAT, production go-live, warranty support, fixing bugs/defects, and produce corresponding deliverables. • Engaging with high-profile clients and challenging business requirements. • Excellent technical and functional communication skills. • Strong interpersonal skills and client communication. We are looking for a full stack engineer to augment our agile team of 4 engineers. The candidate would have experience in the languages Node.js, Angular (15), and Java. They’d have worked with Microservices and be familiar with AWS capabilities like Lambda, SNS, and SQS, as well as PostgreSQL/DynamoDB storage. They’d also be familiar with automated test frameworks like Cypress. We work in the office Tuesday to Thursday every week and from home on Monday and Friday. • Competencies Required Red Prairie WMS, JDA Dispatcher UNIX, LINUX PL/SQL, Oracle 11g • Must Have Should have performed a minimum of 2 warehouse/client implementations. Proficient in RP/JDA Dispatcher WMS architecture. Proficient knowledge in configuring Clustering, Putaway/Allocation Algorithms, RDT rules, Data Programs, Locations, SKU configuration, Merge rules, Purging, Function access, System profile, iReports, Label design, etc. Proficient in writing PL/SQL queries, complex stored procedures, and triggers. Good knowledge of UNIX/LINUX, shell scripting, OS commands, directory structure, files, permissions, file editors, etc. Good knowledge of middleware technology (e.g., webMethods). Good analytical and problem-solving skills. Strong communication skills and willingness to learn. • Nice to Have Experience in JDA Integrator configuration and development. Experience in tracking tools like JIRA and Confluence, and service management tools like ServiceNow, HP SM9, etc. React / React Native
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: SDE 2 Full Stack Developer Responsibilities Lead design and delivery of complex end-to-end features across frontend, backend, and data layers. Make strategic architectural decisions on frameworks, datastores, and performance patterns. Review and approve pull requests, enforcing clean-code guidelines, SOLID principles, and design patterns. Build and maintain shared UI component libraries and backend service frameworks for team reuse. Identify and eliminate performance bottlenecks in both browser rendering and server throughput. Instrument services with metrics and logging, driving SLIs, SLAs, and observability. Define and enforce comprehensive testing strategies: unit, integration, and end-to-end. Own CI/CD pipelines, automating builds, deployments, and rollback procedures. Ensure OWASP Top-10 mitigations, WCAG accessibility, and SEO best practices. Partner with Product, UX, and Ops to translate business objectives into technical roadmaps. Facilitate sprint planning, estimation, and retrospectives for predictable deliveries. Mentor and guide SDE-1s and interns; participate in hiring. Qualifications & Skills 3-5 years building production full stack applications end-to-end with measurable impact. Proven leadership in Agile/Scrum environments with a passion for continuous learning. Deep expertise in React (or Angular/Vue) with TypeScript and modern CSS methodologies. Proficient in Node.js (Express/NestJS) or Python (Django/Flask/FastAPI) or Java (Spring Boot). Expert in designing RESTful and GraphQL APIs and scalable database schemas. Knowledge of MySQL/PostgreSQL indexing, NoSQL (ElasticSearch/DynamoDB), and caching (Redis). Knowledge of containerization (Docker) and commonly used AWS services such as Lambda, EC2, S3, API Gateway, etc. Skilled in unit/integration (Jest, pytest) and E2E testing (Cypress, Playwright). Frontend profiling (Lighthouse) and backend tracing for performance tuning. Secure coding: OAuth2/JWT, XSS/CSRF protection, and familiarity with compliance regimes. Strong communicator able to convey technical trade-offs to non-technical stakeholders. Experience in reviewing pull requests and providing constructive feedback to the team. Qualities We'd Love To Find In You The attitude to always strive for the best outcomes and an enthusiasm to deliver high-quality software. Strong collaboration abilities and a flexible & friendly approach to working with teams. Strong determination with a constant eye on solutions. Creative ideas with a problem-solving mindset. Be open to receiving objective criticism and improving upon it. Eagerness to learn and zeal to grow. Strong communication skills are a huge plus. Work Location: Hyderabad (ref:hirist.tech)
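Illustrative sketch only: a read-through Redis cache of the kind implied by the caching requirement above, written in Python (one of the backend options the posting lists). The Redis host, TTL, and fetch_product_from_db helper are hypothetical.

```python
# Sketch of a read-through cache: try Redis first, fall back to the database, then populate the cache.
import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)  # hypothetical instance
CACHE_TTL_SECONDS = 300


def fetch_product_from_db(product_id: str) -> dict:
    # Placeholder for the real database/ORM query.
    return {"id": product_id, "name": "example product", "price": 100}


def get_product(product_id: str) -> dict:
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                                 # cache hit
    product = fetch_product_from_db(product_id)                   # cache miss
    cache.setex(key, CACHE_TTL_SECONDS, json.dumps(product))      # populate with a TTL
    return product
```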
Posted 2 weeks ago
5.0 years
0 Lacs
Andhra Pradesh, India
Remote
Experience 5 Years - 6 Years Duration 3-6 months Work Style Remote Requirements Strong expertise in Core Java and Java EE technologies. Hands-on experience with AWS services (EC2, S3, Lambda, CloudWatch, etc.) Good working knowledge of Jetty server (configuration, deployment, tuning). Experience in developing REST APIs and microservices architecture. Knowledge of Spring Boot, Spring MVC, and Hibernate. Proficient in SQL/NoSQL databases (MySQL, DynamoDB, etc.) Hands-on experience with Git, Maven, and CI/CD pipelines. Experience with Agile methodologies and CI/CD tools is a bonus What you’ll do: Develop and maintain scalable web applications and RESTful APIs using Java with Jetty and Spring Boot. Design, implement, and optimize database schemas and queries with JPA/Hibernate. Collaborate with cross-functional teams to define, design, and release new features. AWS cloud experience is mandatory. Write clean, maintainable, and efficient code and ensure code quality through unit testing and code reviews. Troubleshoot and resolve issues and perform performance tuning for applications in production.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a passionate and experienced Software Engineer, you have the exciting opportunity to join the API & Exports team at Meltwater. This team plays a crucial role in enabling programmatic access to data, handling thousands of exports daily, enhancing API usability, and managing API integrations and performance at scale. Your work will involve expanding and optimizing export functionalities, developing scalable APIs, and implementing robust monitoring and management practices. Join this high-impact team located at the core of the data delivery platform. Your responsibilities will include owning the design, development, and optimization of API and export features, collaborating with product managers and senior engineers to define functionality and scale, enhancing developer experience by simplifying API consumption and integration, participating in building export pipelines, streaming architectures, and webhook integrations, maintaining high observability and reliability standards using tools like Coralogix, CloudWatch, and Grafana, as well as contributing to on-call rotations and incident response for owned services. To excel in this role, you should bring at least 5 years of software engineering experience with a strong focus on Golang (preferred), Java, or C++, experience in designing and developing RESTful APIs, familiarity with cloud-native applications (preferably AWS), a good understanding of microservice architecture and backend design principles, solid knowledge of Postgres, Redis, and ideally DynamoDB. It would be advantageous if you also have experience with asynchronous or event-driven architectures using tools like SQS, SNS, or webhooks, exposure to DevOps workflows and tools such as Terraform, Docker, Kubernetes, etc., experience working with data exports, reporting systems, or data streaming, expertise in improving developer experience around APIs (e.g., OpenAPI, Swagger, static site generators), familiarity with JWT authentication, API gateways, and rate limiting strategies, experience in accessibility and compliance standards for APIs and data handling, and proficiency with observability tools and practices. Meltwater's tech stack includes languages like Golang and some JavaScript/TypeScript, infrastructure on AWS with services like S3, Lambda, SQS, SNS, CloudFront, and Kubernetes (Helm), databases such as Postgres, Redis, DynamoDB, monitoring tools like Coralogix, Grafana, OpenSearch, and CI/CD & IaC practices through GitHub Actions and Terraform. Joining Meltwater offers you flexible paid time off options, comprehensive health insurance, employee assistance programs, a CalmApp subscription, a hybrid work style, a family leave program, inclusive community, ongoing professional development opportunities, and a vibrant work environment in Hitec city, Hyderabad. Embrace the opportunity to make a difference, learn, grow, and succeed in a diverse and innovative environment at Meltwater.,
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities: - Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills An inclination to mentor; an ability to lead and deliver medium sized components independently Technical Skills (Must Have) ETL: Hands on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Expertise around Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstratable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure in working on Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Experience of using a Job scheduler e.g., Autosys. 
Exposure to Business Intelligence tools, e.g., Tableau, Power BI. Certification on any one or more of the above topics would be an advantage. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
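For illustration, a minimal PySpark ETL sketch of the pipeline-building work described above: read a raw extract, apply simple quality filters, and write a partitioned Parquet dataset. The S3 paths and column names are assumptions, not the bank's actual data.

```python
# Minimal PySpark ETL sketch: ingest a CSV extract, apply basic quality filters,
# and write a partitioned Parquet dataset. Paths and columns are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-trades-load").getOrCreate()

raw = (
    spark.read.option("header", "true").option("inferSchema", "true")
    .csv("s3://example-bucket/landing/trades/")
)

clean = (
    raw.dropDuplicates(["trade_id"])
    .filter(F.col("notional") > 0)                               # basic data-quality rule
    .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
)

clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/trades/"
)
```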
Posted 2 weeks ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Overview Cvent is a leading meetings, events and hospitality technology provider with more than 4,800 employees and nearly 22,000 customers worldwide. Founded in 1999, the company delivers a comprehensive event marketing and management platform for event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. The DNA of Cvent is our people, and our culture has an emphasis on fostering intrapreneurship --a system that encourages Cventers to think and act like individual entrepreneurs and empowers them to take action, embrace risk, and make decisions as if they had founded the company themselves. We foster an environment that promotes agility, which means we don’t have the luxury to wait for perfection. At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections. In This Role, You Will Work on Internet scale applications, where performance, reliability, scalability and security are critical design goals - not after-thoughts Create beautiful, interactive and easy-to-use web applications using rich client-side code and MVC based server-side code. Learn the nuts and bolts of Microservices Architecture, Service-Oriented Architecture (SOA) and Event-Driven Architecture (EDA) in real-life applications. Gain experience with different database technologies, ranging from traditional relational to the latest NoSQL products such as Couchbase, AWS DynamoDB. Collaborate with some of the best engineers in the industry to work on complex Software as a Service (SaaS) based applications Here's What You Need The prerequisites for joining our development team are simple. We care more about your attitude and aptitude than the specific tools and technologies you have used in the past. You need to have a strong passion for software development and must take pride in your designing, architecture, and coding. You should also have great analytical skills and ability to handle complex, modular software development in a collaborative team-based environment. 
In addition to this, you will have a leg up if you also meet the following criteria: Primary Skills 7 to 9 years of hands-on programming experience with Java including Object Oriented Design Experience with RESTful Architecture for Web Services and API development using Spring/Dropwizard or any other framework Experience in contributing to the architecture and design (design patterns, Non-Functional Requirements (NFRs) including Performance, Scalability, Reliability, Security) Experience with one or more of the following databases: SQL Server, MySQL, PostgreSQL, Oracle, Couchbase, Cassandra, AWS DynamoDB or other NoSQL technologies Experience of working with queuing technologies such as RabbitMQ/Kafka/ActiveMQ Bachelor’s degree (or higher) in Computer Science OR related technical discipline Strong understanding of Computer Science fundamentals, including problem solving Excellent verbal and written communication skills along with strong interpersonal skills Excellent troubleshooting skills Proven ability to work in a fast-paced, agile environment and results-oriented culture Strong influence in team technical discussions and in building the team’s technical vision Preferred Skills Experience in full stack development ranging from front-end user interfaces to backend systems Experience/knowledge in JavaScript + Angular/React.js/TypeScript, Graph Query Language (GQL) Experience of working with Elasticsearch/Solr Experience working with AWS / GCP / Azure Cloud
Posted 2 weeks ago
8.0 years
0 Lacs
Tamil Nadu, India
On-site
Job Title: Data Engineer About VXI VXI Global Solutions is a BPO leader in customer service, customer experience, and digital solutions. Founded in 1998, the company has 40,000 employees in more than 40 locations in North America, Asia, Europe, and the Caribbean. We deliver omnichannel and multilingual support, software development, quality assurance, CX advisory, and automation & process excellence to the world’s most respected brands. VXI is one of the fastest growing, privately held business services organizations in the United States and the Philippines, and one of the few US-based customer care organizations in China. VXI is also backed by private equity investor Bain Capital. Our initial partnership ran from 2012 to 2016 and was the beginning of prosperous times for the company. During this period, not only did VXI expand our footprint in the US and Philippines, but we also gained ground in the Chinese and Central American markets. We also acquired Symbio, expanding our global technology services offering and enhancing our competitive position. In 2022, Bain Capital re-invested in the organization after completing a buy-out from Carlyle. This is a rare occurrence in the private equity space and shows the level of performance VXI delivers for our clients, employees, and shareholders. With this recent investment, VXI has started on a transformation to radically improve the CX experience though an industry leading generative AI product portfolio that spans hiring, training, customer contact, and feedback. Job Description: We are seeking talented and motivated Data Engineers to join our dynamic team and contribute to our mission of harnessing the power of data to drive growth and success. As a Data Engineer at VXI Global Solutions, you will play a critical role in designing, implementing, and maintaining our data infrastructure to support our customer experience and management initiatives. You will collaborate with cross-functional teams to understand business requirements, architect scalable data solutions, and ensure data quality and integrity. This is an exciting opportunity to work with cutting-edge technologies and shape the future of data-driven decision-making at VXI Global Solutions. Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and store data from various sources. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Implement data models and schemas to support analytics, reporting, and machine learning initiatives. Optimize data processing and storage solutions for performance, scalability, and cost-effectiveness. Ensure data quality and integrity by implementing data validation, monitoring, and error handling mechanisms. Collaborate with data analysts and data scientists to provide them with clean, reliable, and accessible data for analysis and modeling. Stay current with emerging technologies and best practices in data engineering and recommend innovative solutions to enhance our data capabilities. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. Proven 8+ years' experience as a data engineer or similar role Proficiency in SQL, Python, and/or other programming languages for data processing and manipulation. 
Experience with relational and NoSQL databases (e.g., SQL Server, MySQL, Postgres, Cassandra, DynamoDB, MongoDB, Oracle), data warehousing (e.g., Vertica, Teradata, Oracle Exadata, SAP Hana), and data modeling concepts. Strong understanding of distributed computing frameworks (e.g., Apache Spark, Apache Flink, Apache Storm) and cloud-based data platforms (e.g., AWS Redshift, Azure, Google BigQuery, Snowflake) Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker, Apache Superset) and data pipeline tools (e.g. Airflow, Kafka, Data Flow, Cloud Data Fusion, Airbyte, Informatica, Talend) is a plus. Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques. Solid understanding of ETL/ELT processes, data validation, and data security best practices Experience in version control systems (Git) and CI/CD pipelines. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively with cross-functional teams. Join VXI Global Solutions and be part of a dynamic team dedicated to driving innovation and delivering exceptional customer experiences. Apply now to embark on a rewarding career in data engineering with us!
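Hedged example only: a small Airflow DAG (one of the pipeline tools the posting mentions) wiring a daily extract-transform-load sequence, assuming Airflow 2.4+. The DAG id and the three Python callables are hypothetical stand-ins for real pipeline steps.

```python
# Illustrative Airflow DAG (Airflow 2.4+ syntax) wiring a daily extract -> transform -> load run.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("pull raw records from the source system")


def transform(**context):
    print("clean, validate, and reshape the extracted data")


def load(**context):
    print("write curated data to the warehouse")


with DAG(
    dag_id="daily_customer_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```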
Posted 2 weeks ago
5.0 - 7.0 years
27 - 37 Lacs
Bengaluru
Hybrid
One of our prestigious clients is looking to hire candidates for the below position. Position Type: Sr. Data Scientist Years of Experience: 5 - 7 Years Number of positions: 2 Salary / CTC: up to 38 LPA (based on your current salary and on the interview evaluation) Job Summary Responsibilities: Help design and build the next iteration of process automation in Core Map processes employing a highly scalable Big Data infrastructure and machine learning as applied to global-scale digital map-making. Build and test analytic and statistical models to improve a wide variety of both internal data-driven processes for map-making data decisions and system control needs. Act as an expert and evangelist in the areas of data mining, machine learning, statistics, and predictive analysis and modeling. Requirements: MS or PhD in a discipline such as Statistics, Applied Mathematics, Computer Science, or Econometrics with an emphasis or thesis work on one or more of the following: computational statistics/science/engineering, data mining, machine learning, and optimization. Minimum of 5 years of related professional experience. Knowledge of data mining and analytic methods such as regression, classifiers, clustering, association rules, decision trees, Bayesian network analysis, etc. Candidates should have expert-level knowledge in one or more of these areas. Knowledge of Computer Vision, Deep Learning and Point Cloud Processing algorithms. Proficiency with a statistical analysis package and associated scripting language such as Python, R, Matlab, SAS, etc. Programming experience with SQL, shell script, Python, etc. Knowledge of and ideally some experience with MLOps will be preferred. Knowledge of and ideally some experience with tools such as Pig, Hive, etc., for working with big data in Hadoop and/or Spark for data extraction and data prep for analysis. Experience with and demonstrated capability to effectively interact with both internal and external customer executives, technical and non-technical, to explain uses and value of predictive systems and techniques. Demonstrated proficiency with understanding, specifying and explaining predictive modeling solutions and organizing teams of other data scientists and engineers to execute projects delivering those solutions. Preferred Qualifications: Development experience with Java and Scala Development experience with Docker Development experience with GIS data Development experience with NoSQL (i.e. DynamoDB) Knowledge of GPU programming (CUDA or OpenCL) on GPU accelerator architecture ================================================================= Please provide us the following information along with your updated resume: Your Total Experience: Your Relevant Development Experience in Java: Your Relevant Development Experience in Scala programming: Your Relevant Development Experience with Docker: Your Relevant Development Experience with GIS data: Your Relevant Development Experience with NoSQL (i.e. DynamoDB): Your Relevant Experience with GPU programming (CUDA or OpenCL) on GPU accelerator architecture: Your Latest Education with year of passing/percentage: Your Certification (any): If yes, provide the certification code or share the certification copy: Your Notice Period: Is a Buyout option available (yes/no): If yes, do mention the Buyout notice period amount: Your Current Location: Your Preferred Location: Work from Office / Hybrid: Your Current Salary: Your Expected Salary: Any active offer (yes/no): If yes, do mention your current offered salary details / offered company name: Your Preferred Interview Date / Time:
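For illustration of the classifier work the posting describes, a small self-contained scikit-learn example on synthetic data; no project data or client systems are assumed.

```python
# Self-contained classifier example on synthetic data (no project or client data assumed).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Precision, recall, and F1 per class on the held-out split.
print(classification_report(y_test, model.predict(X_test)))
```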
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are the only professional services organization that has a separate business dedicated exclusively to the financial services marketplace. Join the Digital Engineering Team and you will work with multi-disciplinary teams from around the world to deliver a global perspective. Aligned to key industry groups including Asset management, Banking and Capital Markets, Insurance and Private Equity, Health, Government, Power and Utilities, we provide integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning and individually tailored coaching you will experience ongoing professional development. That’s how we develop outstanding leaders who team to deliver on our promises to all of our stakeholders, and in so doing, play a critical role in building a better working world for our people, for our clients and for our communities. Sound interesting? Well, this is just the beginning. Because whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. We’re seeking a versatile Full Stack Developer with hands-on experience in Python (including multithreading and popular libraries), GenAI, and AWS cloud services. The ideal candidate should be proficient in backend development using NodeJS, ExpressJS, Python Flask/FastAPI, and RESTful API design. On the frontend, strong skills in Angular, ReactJS, TypeScript, etc., are expected. EY Digital Engineering is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional capability and product knowledge. The Digital Engineering (DE) practice works with clients to analyse, formulate, design, mobilize and drive digital transformation initiatives. We advise clients on their most pressing digital challenges and opportunities surrounding business strategy, customer, growth, profit optimization, innovation, technology strategy, and digital transformation. We also have a unique ability to help our clients translate strategy into actionable technical design, and transformation planning/mobilization. Through our unique combination of competencies and solutions, EY’s DE team helps our clients sustain competitive advantage and profitability by developing strategies to stay ahead of the rapid pace of change and disruption and supporting the execution of complex transformations. Your Key Responsibilities Application Development: Design and develop cloud-native applications and services using AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, Glue, Redshift, and EMR. Deployment and Automation: Implement CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy to automate application deployment and updates. Architecture Design: Collaborate with architects and other engineers to design scalable and secure application architectures on AWS. Performance Tuning: Monitor application performance and implement optimizations to enhance reliability, scalability, and efficiency. Security: Implement security best practices for AWS applications, including identity and access management (IAM), encryption, and secure coding practices.
Container Services Management: Design and deploy containerized applications using AWS services such as Amazon ECS (Elastic Container Service), Amazon EKS (Elastic Kubernetes Service), and AWS Fargate. Configure and manage container orchestration, scaling, and deployment strategies. Optimize container performance and resource utilization by tuning settings and configurations. Application Observability: Implement and manage application observability tools such as AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and ELK Stack (Elasticsearch, Logstash, Kibana). Develop and configure monitoring, logging, and alerting systems to provide insights into application performance and health. Create dashboards and reports to visualize application metrics and logs for proactive monitoring and troubleshooting. Integration: Integrate AWS services with application components and external systems, ensuring smooth and efficient data flow. Troubleshooting: Diagnose and resolve issues related to application performance, availability, and reliability. Documentation: Create and maintain comprehensive documentation for application design, deployment processes, and configuration. Skills And Attributes For Success Required Skills: AWS Services: Proficiency in AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, S3, and RDS, Glue, Redshift, EMR. Backend: Python (multithreading, Flask, FastAPI), NodeJS, ExpressJS, REST APIs Frontend: Angular, ReactJS, TypeScript Cloud Engineering : Development with AWS (Lambda, EC2, S3, API Gateway, DynamoDB), Docker, Git, etc. Proven experience in developing and deploying AI solutions with Python, JavaScript Strong background in machine learning, deep learning, and data modelling. Good to have: CI/CD pipelines, full-stack architecture, unit testing, API integration Security: Understanding of AWS security best practices, including IAM, KMS, and encryption. Observability Tools: Proficiency in using observability tools like AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and ELK Stack. Container Orchestration: Knowledge of container orchestration concepts and tools, including Kubernetes and Docker Swarm. Monitoring: Experience with monitoring and logging tools such as AWS CloudWatch, CloudTrail, or ELK Stack. Collaboration: Strong teamwork and communication skills with the ability to work effectively with cross-functional teams. Preferred Qualifications: Certifications: AWS Certified Solutions Architect – Associate or Professional, AWS Certified Developer – Associate, or similar certifications. Experience: At least 5 Years of experience in an application engineering role with a focus on AWS technologies. Agile Methodologies: Familiarity with Agile development practices and methodologies. Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve complex issues. Education: Degree: Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field, or equivalent practical experience What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. 
In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
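Illustrative sketch only: a minimal FastAPI service (one of the Python backend frameworks named above) exposing a simple REST resource. The route, model, and in-memory store are hypothetical and stand in for a real datastore.

```python
# Minimal FastAPI sketch: a hypothetical "task" resource with an in-memory store
# standing in for a real database or queue.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-service")


class TaskRequest(BaseModel):
    dataset: str
    file_format: str = "csv"


_tasks: dict = {}  # task_id -> TaskRequest; placeholder for a real datastore


@app.post("/tasks", status_code=201)
def create_task(req: TaskRequest):
    task_id = len(_tasks) + 1
    _tasks[task_id] = req
    return {"task_id": task_id, "status": "queued"}


@app.get("/tasks/{task_id}")
def get_task(task_id: int):
    if task_id not in _tasks:
        raise HTTPException(status_code=404, detail="task not found")
    return {"task_id": task_id, "status": "queued", "request": _tasks[task_id]}
```

If saved as app.py, this could be run locally with `uvicorn app:app --reload` (module name assumed).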
Posted 2 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Seller Flex team located in Bangalore is looking for an SDE to deliver strategic goals for Amazon eCommerce systems. This is an opportunity to join our mission to build tech solutions that empower sellers to delight the next billion customers. You will be responsible for building new system capabilities from the ground up for strategic business initiatives. If you feel excited by the challenge of setting the course for large company-wide initiatives and building and launching customer-facing products in international locales, this may be the next big career move for you.
We are building systems which can scale across multiple marketplaces and automate large-scale eCommerce business. We are looking for an SDE1 to design and build our tech stack as a coherent architecture and deliver capabilities across marketplaces. We operate in a high-performance, co-located agile ecosystem where SDEs, Product Managers and Principals frequently connect with end customers of our products. Our SDEs stay connected with customers through seller/FC/Delivery Station visits and customer anecdotes. This allows our engineers to significantly influence the product roadmap, contribute to PRFAQs and create disproportionate impact through the tech they deliver.
We offer technology leaders a once-in-a-lifetime opportunity to transform billions of lives across the planet through their tech innovations. In this role, you will have front-row seats to how we are disrupting eCommerce fulfilment and supply chain by offering creative solutions to yet-unsolved problems. We operate like a start-up within the Amazon ecosystem and have a proven track record of delivering inventions that work globally. You will be challenged to look at the world through the eyes of our seller customers and think outside the box to build new tech solutions to make our Sellers successful. You will often find yourself building products and services that are new to Amazon and will have an opportunity to pioneer not just the technology components but the idea itself across other Amazon teams and markets.
See below for a couple of anecdotes, should you want to hear an SDE's perspective on what it is like to work in this team:
"I have worked on other global tech platforms at Amazon prior to SellerFlex and what I find extremely different and satisfying here is that in addition to the scale and complexity of work that I do and the customer impact it has, I am part of a team that makes SDEs owners of critical aspects of team functioning – whether it be designing and running engineering excellence programs for design reviews, COE, CR, MCM and Service launch bar raisers or the operational programs for the team. This has allowed me to develop myself not just on the tech or domain as an SDE but also as a wholesome Amazon tech leader for future challenges."
"It is extremely empowering to be a part of this team where I am challenged to learn and innovate in every project that I work on. I get to work across the tech stack and have end-to-end ownership of solution and tech choices. I hadn't worked on as many services in my previous team at Amazon as I have built from scratch, launched and scaled in this team.
The team is in a great place where it is connected to customers closely, is building new stuff from scratch and has to deal with a very light Ops burden due to the great architecture and design choices that are being made by SDEs."
KEY RESPONSIBILITIES
Work closely with senior and principal engineers to architect and deliver high-quality technology solutions
Own development in multiple layers of the stack, including distributed workflows hosted in native AWS architecture
Operational rigor for a rapidly growing tech stack
Contribute to patents, tech talks and innovation drives
Assist in the continual hiring and development of technical talent
Measure success metrics and influence evolution of the tech product
Loop Competencies
Basic Qualifications
Bachelor's degree or higher in Computer Science and 1+ years of Software Development experience
Proven track record of building large-scale, highly available, low-latency, high-quality distributed systems and software products
Possess an extremely sound understanding of basic areas of Computer Science such as Algorithms, Data Structures, Object Oriented Design, and Databases
Good understanding of AWS services such as EC2, S3, DynamoDB, Elasticsearch, Lambda, API Gateway, ECR, ECS, Lex etc.
Excellent coding skills in an object-oriented language such as Java or Scala
Great problem-solving skills and propensity to learn and develop tech talent
Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience
3+ years of computer science fundamentals (object-oriented design, data structures, algorithm design, problem solving and complexity analysis) experience
Experience programming with at least one software programming language
Preferred Qualifications
3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience
Bachelor's degree in computer science or equivalent
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Company - ADCI - Karnataka
Job ID: A2980587
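As a rough, hypothetical illustration of the serverless building blocks this listing references (Lambda, API Gateway, DynamoDB), here is a minimal Python Lambda handler that persists an item to a DynamoDB table; the table and field names are placeholders and not part of the posting.

import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("SellerShipments")  # hypothetical table name

def lambda_handler(event, context):
    """Store a shipment record posted through API Gateway."""
    body = json.loads(event.get("body", "{}"))
    table.put_item(
        Item={
            "shipmentId": body["shipmentId"],
            "sellerId": body["sellerId"],
            "status": body.get("status", "CREATED"),
        }
    )
    return {"statusCode": 201, "body": json.dumps({"shipmentId": body["shipmentId"]})}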
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to create the next-generation tools for intuitive data access for transportation operations? We are looking for a Business Intelligence Engineer to help set up and deliver robust, structured reporting, analytics and models for the RBS Cost to Serve team. You will be a key contributor to shaping our strategic Defect Elimination program by equipping the program teams with the key analytics and insights. You will have an eye for detail, proficient/advanced SQL/DW/Python skills, and a knack for solving challenging data and reporting problems. The role requires you to feel comfortable working with and clearly communicating with other functional teams, regionally and globally. The position will be based in Bangalore/Chennai/HYD. You will be reporting to a Sr Program Manager: Cost to Serve Analytics & Insights, working intensely with her (larger) project team, including Finance. The ideal candidate will be comfortable in a fast-paced, dynamic environment; will be a creative and analytical problem solver with the opportunity to fulfill the Amazon motto to "Work Hard. Have Fun. Make History".
Key job responsibilities
Analysis of business requirements and translation into technical requirements; with support of senior colleagues, integration into a working, stable and scalable system
Independent realization of requirements for Business Intelligence and custom software development products
Creation of test cases and guidance of business stakeholders within the testing process
Presentation of solutions and implemented features within weekly sync-ups with business stakeholders
Ownership of maintenance and error handling of deployed solutions
Focus on project delivery
About The Team
RBS Cost to Serve team aims to identify and eliminate waste, negative experiences, and non-value activities across the end-to-end remit of supply chain and dependent work streams that slow down resolution for our stakeholders. The primary objective is to reduce Cost to Serve for Amazon and enable "Free Cash Flow" by optimizing the cost per shipped unit economics across the supply chain systems through Defect Elimination. Our program will support establishing the end-to-end supply chain checkpoints on how the inventory moves inside Amazon to identify gaps and broken processes/policies, and to eliminate root causes of systemic difficulties rather than merely addressing symptoms, on behalf of our customers. This team will partner with internal/external stakeholders to establish the Cost to Serve charter based on opportunity size and own specific unique initiatives that are beyond the existing team's program scope.
Basic Qualifications
3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience in Statistical Analysis packages such as R, SAS and Matlab
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling (a brief sketch of this workflow follows the listing)
Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - HYD 15 SEZ - E55 Job ID: A2999431
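A minimal sketch of the SQL-plus-Python workflow named in the qualifications above, assuming a Redshift/Postgres-compatible warehouse reachable through SQLAlchemy; the connection string, table, and column names are placeholders, not details from the posting.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; a real setup would use the team's warehouse credentials.
engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5439/analytics")

QUERY = """
    SELECT ship_week, region, SUM(cost_usd) AS total_cost
    FROM cost_to_serve_shipments
    GROUP BY ship_week, region
"""

def load_cost_summary() -> pd.DataFrame:
    """Pull an aggregated cost view and add a simple week-over-week delta."""
    df = pd.read_sql_query(QUERY, engine)
    df = df.sort_values(["region", "ship_week"])
    df["wow_change"] = df.groupby("region")["total_cost"].diff()
    return df

The resulting frame could then be exported to a visualization tool such as QuickSight or Tableau for the reporting work described.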
Posted 2 weeks ago
3.0 years
6 - 8 Lacs
Hyderābād
On-site
- 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience
- Experience with data visualization using Tableau, Quicksight, or similar tools
- Experience with data modeling, warehousing and building ETL pipelines
- Experience in Statistical Analysis packages such as R, SAS and Matlab
- Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling
Rest of World (ROW) Transportation Execution team in Hyderabad is looking for an innovative, hands-on and customer-obsessed BIE for its Analytics function. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills and excellent technical skills, and should be able to juggle multiple tasks at once. The candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help the operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams.
Preferred qualifications:
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 2 weeks ago
5.0 years
8 Lacs
Hyderābād
On-site
We are looking for a seasoned DevOps Engineer with 5+ years of experience in AWS, CI/CD tools, and infrastructure automation. The ideal candidate should have a solid background in Linux/Windows administration and scripting (Bash, Python, PowerShell). Strong troubleshooting skills and a passion for automation are essential.
Bachelor's Degree in Computer Science, Information Systems, or a related field required.
5+ years of proven experience with Linux Administration, Windows Administration, and IIS Management. Deep understanding of IIS concepts and settings.
5+ years of experience with Amazon Web Services (AWS) – EC2, S3, CloudFormation, CLI, Lambda, SQS, DynamoDB (an illustrative scripting sketch follows this listing)
Deep knowledge of multiple monitoring tools and how to mine them for advanced data
Proficient in: Bash, Chef, PowerShell, Python, XML, and web concepts such as REST APIs and SPAs, HTTP GET and POST
Demonstrated mastery of development tools and methodologies, such as firewall concepts, network connectivity, XML config files, REST API calls, HTTP headers and response codes
Deep experience and knowledge of Continuous Integration/Deployment technologies: Atlassian Bamboo, Octopus Deploy, MSBuild, NUnit, GIT, Maven, Docker
Must have a passion for technology and a focus on automation.
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster.
At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed.
For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we've been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World's Best Employer and Best Employer for Women, testaments to the value we place on workplace culture.
We're 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.
Verisk Businesses
Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events.
Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group
Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, businesses, and societies become stronger
Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk's minimum hiring age is 18 except in countries with a higher age limit subject to applicable law.
https://www.verisk.com/company/careers/
Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume.
Verisk Employee Privacy Notice
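One illustrative automation snippet in the spirit of the AWS scripting skills listed above: polling and draining an SQS queue with Python and boto3. The queue URL and message handling are placeholders, not details from the posting.

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/deployment-events"  # placeholder

def drain_queue(max_batches: int = 5) -> None:
    """Read and acknowledge messages from the queue, batch by batch."""
    for _ in range(max_batches):
        response = sqs.receive_message(
            QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5
        )
        for message in response.get("Messages", []):
            print("processing:", message["Body"])
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])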
Posted 2 weeks ago
3.0 years
6 - 8 Lacs
Hyderābād
On-site
DESCRIPTION
Rest of World (ROW) Transportation Execution team in Hyderabad is looking for an innovative, hands-on and customer-obsessed BIE for its Analytics function. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills and excellent technical skills, and should be able to juggle multiple tasks at once. The candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help the operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams.
BASIC QUALIFICATIONS
3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience in Statistical Analysis packages such as R, SAS and Matlab
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling
PREFERRED QUALIFICATIONS
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Job details
IND, TS, Hyderabad
Supply Chain/Transportation Management
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderābād
On-site
Job Title: Backend Developer
Location: India (Hybrid)
Travel Requirement: Frequent travel to Abu Dhabi for 2–3 months (as per project needs)
Experience: 5 to 8 Years
Employment Type: Full-Time
About the Role: We are seeking a skilled and experienced Backend Developer with 5 to 8 years of hands-on experience in designing and implementing robust backend systems. The ideal candidate should be based in Egypt, available for a hybrid work model, and flexible to travel to Abu Dhabi for on-site client engagements lasting 2 to 3 months periodically. You will play a key role in building scalable APIs, managing cloud-based infrastructure, and working with various database systems in a microservices-driven environment.
Key Responsibilities:
Design, develop, and maintain robust and scalable RESTful APIs for enterprise-level applications.
Build and deploy microservices using modern backend frameworks with a strong emphasis on performance and maintainability.
Implement containerized applications using Docker and Kubernetes.
Write unit and integration tests using relevant testing frameworks to ensure code quality and system reliability.
Collaborate with cross-functional teams including front-end developers, DevOps, and data engineers to deliver high-quality software solutions.
Optimize backend performance and ensure security compliance throughout the development lifecycle.
Maintain and work with various database technologies including SQL, NoSQL, and Vector databases.
Deploy, monitor, and troubleshoot applications in cloud environments (compute, serverless, networking, storage).
Mandatory Qualifications & Skills:
Software Engineering: Proven experience in backend development with Python (preferred) or Node.js. Strong understanding of microservices architecture and RESTful API design. Experience with containerization tools like Docker and orchestration with Kubernetes. Knowledge of testing frameworks (e.g., PyTest, Mocha, Jest) and test-driven development practices.
Databases: Hands-on experience with SQL, NoSQL (e.g., MongoDB, DynamoDB), and Vector databases (e.g., Pinecone, FAISS).
Cloud: Experience deploying and managing applications on cloud platforms (AWS, Azure, or GCP), including compute services, serverless functions, cloud databases, and networking components.
Preferred Skills (Nice to Have):
Experience integrating machine learning models into production-grade applications.
Understanding of secure coding practices and application-level security standards.
Familiarity with event-driven architectures and message brokers (e.g., Kafka, RabbitMQ).
Other Requirements:
Travel Flexibility: Willingness and ability to travel to Abu Dhabi for on-site work for durations of 2 to 3 months when required.
Strong problem-solving skills, communication abilities, and a collaborative mindset.
Self-motivated and able to work independently in a distributed team environment.
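A minimal, hypothetical sketch of the kind of RESTful microservice endpoint this role describes, written in Python with FastAPI (the posting's preferred language); the service, route, and model names are illustrative, and an in-memory dictionary stands in for a real SQL/NoSQL store.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")  # illustrative service name

class Order(BaseModel):
    order_id: str
    amount: float

_ORDERS: dict[str, Order] = {}  # in-memory stand-in for a database

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    """Create an order; in a real service this would write to SQL/NoSQL storage."""
    _ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: str) -> Order:
    """Fetch a single order or return 404 if it does not exist."""
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return _ORDERS[order_id]

Such a service would typically be packaged in a Docker image and deployed behind an API gateway or on Kubernetes, matching the containerization responsibilities listed above.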
Posted 2 weeks ago
1.0 - 2.0 years
1 - 5 Lacs
Gurgaon
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space with over 16,700 stores. It has a footprint across 31 countries and territories. Circle K India Data & Analytics team is an integral part of ACT's Global Data & Analytics Team, and the Associate ML Ops Analyst will be a key player on this team that will help grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.
About the role
The incumbent will be responsible for implementing Azure data services to deliver scalable and sustainable solutions, and for building model deployment and monitoring pipelines to meet business needs.
Roles & Responsibilities
Development and Integration
Collaborate with data scientists to deploy ML models into production environments
Implement and maintain CI/CD pipelines for machine learning workflows
Use version control tools (e.g., Git) and ML lifecycle management tools (e.g., MLflow) for model tracking, versioning, and management (a brief tracking sketch follows this listing)
Design, build and optimize application containerization and orchestration with Docker and Kubernetes on cloud platforms like AWS or Azure
Automation & Monitoring
Automate pipelines using Apache Spark and ETL tools such as Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow
Implement model monitoring and alerting systems to track model performance, accuracy, and data drift in production environments
Collaboration and Communication
Work closely with data scientists to ensure that models are production-ready
Collaborate with Data Engineering and Tech teams to ensure infrastructure is optimized for scaling ML applications
Optimization and Scaling
Optimize ML pipelines for performance and cost-effectiveness
Operational Excellence
Help the Data teams leverage best practices to implement enterprise-level solutions
Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
Help define common coding standards and model monitoring performance best practices
Continuously evaluate the latest packages and frameworks in the ML ecosystem
Build automated model deployment and data engineering pipelines from plain Python/PySpark code
Stakeholder Engagement
Collaborate with Data Scientists, Data Engineers, cloud platform and application engineers to create and implement cloud policies and governance for the ML model life cycle
Job Requirements
Education & Relevant Experience
Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
1-2 years of relevant working experience in MLOps
Behavioural Skills
Delivery Excellence
Business disposition
Social intelligence
Innovation and agility
Knowledge
Knowledge of core computer science concepts such as common data structures and algorithms, OOPs
Programming languages (R, Python, PySpark, etc.)
Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
Enterprise reporting systems, relational (MySQL, Microsoft SQL Server etc.), non-relational (MongoDB, DynamoDB) database management systems and Data Engineering tools
Exposure to ETL tools and version control
Experience in building and maintaining CI/CD pipelines for ML models
Understanding of machine learning, information retrieval or recommendation systems
Familiarity with DevOps tools (Docker, Kubernetes, Jenkins, GitLab)
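A minimal sketch of the MLflow-style experiment tracking mentioned above, assuming scikit-learn and a local MLflow tracking store; the experiment name, parameters, and synthetic data are illustrative only.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for real features.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-demo")  # illustrative experiment name
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for later deployment

Runs logged this way give the versioned, comparable model artifacts that a CI/CD deployment pipeline can promote to production.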
Posted 2 weeks ago
3.0 - 4.0 years
2 - 6 Lacs
Gurgaon
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with a footprint across 31 countries and territories. Circle K India Data & Analytics team is an integral part of ACT's Global Data & Analytics Team, and the Data Scientist/Senior Data Scientist will be a key player on this team that will help grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.
Department: Data & Analytics
Location: Cyber Hub, Gurugram, Haryana (5 days in office)
Job Type: Permanent, Full-Time (40 Hours)
Reports To: Senior Manager Data Science & Analytics
About the role
The incumbent will be responsible for delivering advanced analytics projects that drive business results, including interpreting business problems, selecting the appropriate methodology, data cleaning, exploratory data analysis, model building, and creation of polished deliverables.
Roles & Responsibilities
Analytics & Strategy
Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
Utilize data mining, statistical and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
Apply multiple algorithms or architectures and recommend the best model with an in-depth description to evangelize data-driven business decisions
Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data
Operational Excellence
Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
Structure hypotheses, build thoughtful analyses, develop underlying data models and bring clarity to previously undefined problems
Partner with Data Engineering to build, design and maintain core data infrastructure, pipelines and data workflows to automate dashboards and analyses
Stakeholder Engagement
Work collaboratively across multiple sets of stakeholders – business functions, Data Engineers, Data Visualization experts – to deliver on project deliverables
Articulate complex data science models to business teams and present the insights in easily understandable and innovative formats
Job Requirements
Education
Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
Relevant Experience
3 - 4 years for Data Scientist
Relevant working experience in a data science/advanced analytics role
Behavioural Skills
Delivery Excellence
Business disposition
Social intelligence
Innovation and agility
Knowledge
Functional Analytics (Supply chain analytics, Marketing Analytics, Customer Analytics, etc.)
Statistical modelling using analytical tools (R, Python, KNIME, etc.)
Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
Enterprise reporting systems, relational (MySQL, Microsoft SQL Server etc.), non-relational (MongoDB, DynamoDB) database management systems and Data Engineering tools
Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
Microsoft Office applications (MS Excel, etc.)
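A small illustrative sketch of the model-building and evaluation workflow described above, using scikit-learn; the synthetic dataset stands in for engineered store/product/customer features and is not taken from the posting.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for engineered transactional features.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=7)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", GradientBoostingClassifier(random_state=7)),
])

# Cross-validated ROC AUC as one of several candidate evaluation metrics.
scores = cross_val_score(pipeline, X, y, cv=5, scoring="roc_auc")
print(f"mean ROC AUC: {scores.mean():.3f} (+/- {scores.std():.3f})")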
Posted 2 weeks ago
2.0 years
0 Lacs
Mohali
On-site
Hi, Greetings from CS Soft Solutions (India) Pvt Ltd.
We are looking for highly skilled Node JS Developers to join our dynamic team.
Minimum Experience Required: 2 years
Required Skills:
Strong proficiency in Node.js
Hands-on experience with AWS Lambda and Serverless architecture
Familiarity with API Gateway, DynamoDB, S3, and other core AWS services
Proficient in designing and consuming RESTful APIs
Knowledge of CI/CD pipelines, preferably with AWS CodePipeline or similar tools
Ability to write clean, modular, and scalable code
Company Details:
Company: CS Soft Solutions (I) Pvt Ltd (ISO 9001:2015, ISO/IEC 27001:2013 & NASSCOM Certified)
Address: CS Soft Solutions (I) Pvt Ltd, i-18, Sector 101-A, IT City, SAS Nagar, Mohali
Industry: Software Services (Mobile, Web Designing & Development)
If you're passionate and ready to take your career to the next level, we'd love to hear from you!
Job Type: Full-time
Benefits:
Health insurance
Leave encashment
Paid sick time
Provident Fund
Work Location: In person
Posted 2 weeks ago
10.0 years
6 - 8 Lacs
Noida
On-site
Company Description
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
Total experience 10+ years
Extensive experience in back-end development utilizing Java 8 or higher, Spring Framework (Core/Boot/MVC), Hibernate/JPA, and Microservices Architecture
Strong experience in AWS (API Gateway, Fargate, S3, DynamoDB, SNS)
Strong experience in SOAP and PostgreSQL
Hands-on experience with REST APIs, caching systems (e.g., Redis) and messaging systems like Kafka
Proficiency in Service-Oriented Architecture (SOA) and Web Services (Apache CXF, JAX-WS, JAX-RS, SOAP, REST)
Hands-on experience with multithreading and cloud development
Strong working experience in Data Structures and Algorithms, Unit Testing, and Object-Oriented Programming (OOP) principles
Hands-on experience with relational databases such as SQL Server, Oracle, MySQL, and PostgreSQL
Experience with DevOps tools and technologies such as Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef
Proficiency in build automation tools like Maven, Ant, and Gradle
Hands-on experience with cloud technologies such as AWS/Azure
Strong understanding of UML and design patterns
Ability to simplify solutions, optimize processes, and efficiently resolve escalated issues
Strong problem-solving skills and a passion for continuous improvement
Excellent communication skills and the ability to collaborate effectively with cross-functional teams
Enthusiasm for learning new technologies and staying updated on industry trends
RESPONSIBILITIES:
Writing and reviewing great quality code
Understanding functional requirements thoroughly and analyzing the client's needs in the context of the project
Envisioning the overall solution for defined functional and non-functional requirements, and being able to define technologies, patterns and frameworks to realize it
Determining and implementing design methodologies and tool sets
Enabling application development by coordinating requirements, schedules, and activities
Being able to lead/support UAT and production rollouts
Creating, understanding and validating WBS and estimated effort for a given module/task, and being able to justify it
Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement
Giving constructive feedback to the team members and setting clear expectations
Helping the team in troubleshooting and resolving complex bugs
Coming up with solutions to any issue raised during code/design review and being able to justify the decision taken
Carrying out POCs to make sure that suggested design/technologies meet the requirements
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Posted 2 weeks ago