4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As a candidate applying for this position, you should be prepared to attend face-to-face interviews and be open to the opportunity. The role offers a 5-day work week based in the office.

Your responsibilities will include architecting and implementing REST APIs, designing and implementing high-performance applications with low latency, integrating GenAI-powered capabilities into backend systems, testing software for responsiveness and efficiency, collaborating with front-end developers, and troubleshooting application-related issues. You will also be expected to take the initiative in building efficient solutions for scalability and work closely with the infrastructure team in triaging major incidents. Additionally, you should have experience in analyzing and researching solutions, as well as developing and implementing recommendations accordingly.

To qualify for this position, you should hold a Bachelor's Degree in Computer Science or a related field, along with a minimum of 4 years of professional experience. Expertise in JavaScript (ES6) and Node.js is essential, as well as a strong understanding of algorithms and data structures. Proficiency in frameworks such as Express.js and Restify is required, along with experience in building backend services that handle large-scale data. You should be able to architect high-availability applications and servers on the cloud following best practices, with a preference for microservices architecture. Experience with MySQL, NoSQL databases such as DynamoDB and MongoDB, and GenAI agentic frameworks is advantageous. Additionally, you should have experience with complex SQL queries, core Node.js concepts, technical deep-dives into code, and building and deploying GenAI-powered backend services or tools. An understanding of communication over WebSockets, clean architecture, SOLID principles, AWS, Docker containerization, Python, AI-assisted development tools, and Git is beneficial for this role.

If you meet these requirements and are ready to take on these responsibilities, we welcome your application for this exciting opportunity.
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job description

Schneider Electric is looking for an AWS data cloud engineer with a minimum of 5 years of experience in AWS data lake implementation. You will be responsible for creating and managing data ingestion and transformation, making data ready for consumption in the analytical layer of the data lake. You will also be responsible for managing and monitoring the data quality of the data lake using Informatica PowerCenter, and for creating dashboards from the analytical layer of the data lake using Tableau or Power BI.

Your Role

We are looking for strong AWS Data Engineers who are passionate about Cloud technology. Your responsibilities are:

Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.

Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lake Houses using open-source and AWS tools.

Data Quality: Implement data quality rules, perform data profiling to assess source data quality, identify data anomalies, and create data quality scorecards using Informatica PowerCenter.

Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making. Interact with product owners to understand data ingestion needs and data quality rules.

Adopt DevOps Practices (optional skill): Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes.

Qualifications - Your Skills and Experience

A minimum of 3 to 5 years of experience in AWS Data Lake implementation. A minimum of 2 to 3 years of knowledge of Informatica PowerCenter. Proficiency with AWS tools: demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS, and AWS Step Functions. Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow. Understanding of relational databases like Oracle, SQL Server, and MySQL. Programming skills: strong experience with modern programming languages such as Python and Java. Expertise in data storage technologies: in-depth knowledge of Data Warehouse, database technologies, and Big Data ecosystem technologies such as AWS Redshift, AWS RDS, and Hadoop. Experience with AWS data lakes: proven experience working with AWS data lakes on AWS S3 to store and process both structured and unstructured data sets. Expertise in developing Business Intelligence dashboards in Tableau or Power BI is a plus. Good knowledge of project and portfolio management suites is a plus. Should be well versed in Agile implementation principles; familiarity with SAFe (Scaled Agile) principles is a plus.

About Us

Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Help us deliver solutions that ensure Life Is On everywhere, for everyone and at every moment: https://youtu.be/NlLJMv1Y7Hk. Great people make Schneider Electric a great company.
We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success — on the job and beyond. See what our people have to say about working for Schneider Electric: https://youtu.be/6D2Av1uUrzY Our EEO statement : Schneider Electric aspires to be the most inclusive and caring company in the world, by providing equitable opportunities to everyone, everywhere, and ensuring all employees feel uniquely valued and safe to contribute their best. We mirror the diversity of the communities in which we operate and we ‘embrace different’ as one of our core values. We believe our differences make us stronger as a company and as individuals and we are committed to championing inclusivity in everything we do. This extends to our Candidates and is embedded in our Hiring Practices. You can find out more about our commitment to Diversity, Equity and Inclusion here and our DEI Policy here Schneider Electric is an Equal Opportunity Employer. It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct Primary Location : IN-Karnataka-Bangalore Schedule : Full-time Unposting Date : Ongoing
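Purely for illustration of the Glue-based pipeline work described in the listing above, here is a minimal PySpark sketch of a Glue ETL job that reads a catalogued raw table and writes curated, partitioned Parquet to the analytical layer. The database, table, and bucket names are hypothetical, and the script assumes it runs inside the AWS Glue job runtime rather than on a plain Python interpreter.

```python
# Minimal, illustrative AWS Glue (PySpark) ETL job. Database, table, and bucket
# names are hypothetical; the awsglue modules are only available inside the Glue runtime.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a raw table registered in the Glue Data Catalog
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="sales_orders"
)

# Transform: keep only the columns the analytical layer needs
curated = raw.select_fields(["order_id", "customer_id", "order_date", "amount"])

# Load: write partitioned Parquet to the curated zone of the data lake
glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={
        "path": "s3://example-datalake/curated/sales_orders/",
        "partitionKeys": ["order_date"],
    },
    format="parquet",
)
job.commit()
```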
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Basic qualifications: 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.; experience with data visualization using Tableau, QuickSight, or similar tools; experience with data modeling, warehousing, and building ETL pipelines; experience with statistical analysis packages such as R, SAS, and MATLAB; experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling.

The SCOT AIM team is seeking an exceptional Business Intelligence Engineer to join our innovative Inventory Automation analytics team. This pioneering role will be instrumental in building and scaling analytics solutions that drive critical business decisions across inventory management, supply chain optimization, and channel performance. You will work closely with Scientists, Product Managers, other Business Intelligence Engineers, and Supply Chain Managers to build scalable, high-insight, high-impact products and own improvements to business outcomes within your area, enabling worldwide (WW) and local solutions for retail.

Key job responsibilities: Work with Product Managers to understand customer behaviors, spot system defects, and benchmark our ability to serve our customers, improving a wide range of internal products that impact selection decisions both nationally and regionally. Design and develop end-to-end analytics solutions to monitor and optimize supply chain metrics, including but not limited to availability, placement, inventory efficiency, and capacity planning and management at various business hierarchies. Create interactive dashboards and automated reporting systems to enable deep-dive analysis of inventory performance across multiple dimensions (ASIN/GL/Sub-category/LOB/Brand level). Build predictive models for seasonal demand forecasting and inventory planning, supporting critical business events and promotions. Create scalable solutions for tracking deal inventory readiness for small events and channel share management. Partner with category and business stakeholders to identify opportunities for process automation and innovation.

A day in the life: Pioneering new analytical approaches and establishing best practices. Building solutions from the ground up with significant autonomy. Driving innovation in supply chain analytics through automation and advanced analytics. Making a direct impact on business performance through data-driven decision making.

About the team: Have you ever ordered a product on Amazon and, when that box with the smile arrived, wondered how it got to you so fast? Wondered where it came from and how much it cost Amazon? If so, Amazon's Supply Chain Optimization Technology (SCOT) organization is for you. At SCOT, we solve deep technical problems and build innovative solutions in a fast-paced environment working with smart and passionate team members. (Learn more about SCOT: http://bit.ly/amazon-scot)

Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. Experience working directly with business stakeholders to translate between data and business needs.

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
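As a loose illustration of the final basic qualification in the listing above (SQL to pull warehouse data plus Python to prepare it for modeling), here is a minimal sketch; the cluster endpoint, credentials, schema, table, and column names are all hypothetical.

```python
# Illustrative only: pull inventory metrics from a warehouse with SQL and
# prepare simple features in pandas. Connection string, table, and columns are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Redshift speaks the PostgreSQL wire protocol, so a psycopg2-style URL works here.
engine = create_engine("postgresql+psycopg2://user:password@example-cluster:5439/analytics")

# DATEADD / CURRENT_DATE are Redshift SQL; adjust for other warehouses.
query = """
    SELECT asin, snapshot_date, units_on_hand
    FROM inventory.daily_positions
    WHERE snapshot_date >= DATEADD(day, -90, CURRENT_DATE)
"""
df = pd.read_sql(query, engine)

# Simple feature prep for a forecasting model: weekly average units per ASIN
df["snapshot_date"] = pd.to_datetime(df["snapshot_date"])
weekly = (
    df.set_index("snapshot_date")
      .groupby("asin")
      .resample("W")["units_on_hand"]
      .mean()
      .reset_index(name="avg_weekly_units")
)
print(weekly.head())
```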
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
DESCRIPTION

SCOT AIM team is seeking an exceptional Business Intelligence Engineer to join our innovative Inventory automation analytics team. This pioneering role will be instrumental in building and scaling analytics solutions that drive critical business decisions across inventory management, supply chain optimization and channel performance. You will work closely with Scientists, Product Managers, other Business Intelligence Engineers, and Supply Chain Managers to build scalable, high insight - high impact products and own improvements to business outcomes within your area, enabling WW and local solutions for retail.

Key job responsibilities

Work with Product Managers to understand customer behaviors, spot system defects, and benchmark our ability to serve our customers, improving a wide range of internal products that impact selection decisions both nationally and regionally. Design and develop end-to-end analytics solutions to monitor and optimize supply chain metrics, including and not limited to availability, placement, inventory efficiency and capacity planning & management at various business hierarchies. Create interactive dashboards and automated reporting systems to enable deep-dive analysis of inventory performance across multiple dimensions (ASIN/GL/Sub-category/LOB/Brand level). Build predictive models for seasonal demand forecasting and inventory planning, supporting critical business events and promotions. Create scalable solutions for tracking deal inventory readiness for small events and channel share management. Partner with category & business stakeholders to identify opportunities for process automation and innovation.

A day in the life

Pioneering new analytical approaches and establishing best practices. Building solutions from the ground up with significant autonomy. Driving innovation in supply chain analytics through automation and advanced analytics. Making a direct impact on business performance through data-driven decision making.

About the team

Have you ever ordered a product on Amazon and when that box with the smile arrived, wondered how it got to you so fast? Wondered where it came from and how much it cost Amazon? If so, Amazon’s Supply Chain Optimization Technology (SCOT) organization is for you. At SCOT, we solve deep technical problems and build innovative solutions in a fast-paced environment working with smart & passionate team members. (Learn more about SCOT: http://bit.ly/amazon-scot)

BASIC QUALIFICATIONS

3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience. Experience with data visualization using Tableau, Quicksight, or similar tools. Experience with data modeling, warehousing and building ETL pipelines. Experience in Statistical Analysis packages such as R, SAS and Matlab. Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling.

PREFERRED QUALIFICATIONS

Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets. Experience working directly with business stakeholders to translate between data and business needs.

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a member of Zendesk's engineering team in Australia, your main objective is to improve the customer experience by developing products that cater to over 170,000 global brands. These brands, including Discord, Calm, and Skyscanner, rely on Zendesk's solutions to ensure customer satisfaction on a daily basis. Working within a highly innovative and fast-paced environment, you will have the opportunity to collaborate with a diverse group of individuals from around the world, contributing to the success of some of Zendesk's most beloved products.

This position is a hybrid role that combines remote work with on-site requirements, necessitating three days in the office and relocation to Pune. You will be part of a dynamic team focused on creating distributed, high-scale, and data-intensive integrations that enhance Zendesk's core SaaS product. Collaborating with other SaaS providers and cloud vendors such as Slack, Atlassian, and AWS, you will be involved in incorporating cutting-edge technologies and features to deliver top-notch solutions to customers.

Your daily responsibilities will include designing, leading, and implementing customer-facing software projects, emphasizing the importance of good software practices and timely project delivery. Excellent communication skills, attention to detail, and the ability to influence others diplomatically are essential qualities for this role. Additionally, you will be expected to demonstrate leadership qualities, mentor team members, and consistently apply best practices throughout the development cycle.

To excel in this role, you should have a solid background in Golang for high-volume applications, at least 2 years of experience in frontend development using JavaScript and React, and a strong focus on long-term solution viability. Experience in identifying and resolving reliability issues at scale, effective time management, and building integrations with popular SaaS products are also key requirements. The tech stack you will be working with includes Golang, JavaScript/TypeScript, React, Redux, React Testing Library, Cypress, Jest, AWS, Spinnaker, Kubernetes, Aurora/MySQL, DynamoDB, and S3.

Please note that candidates must be physically located in, and willing to work from, Karnataka or Maharashtra. In this hybrid role, you will have the opportunity to work both remotely and onsite, fostering connections, collaboration, and learning while maintaining a healthy work-life balance. Zendesk is committed to providing an inclusive and fulfilling environment for its employees, enabling them to thrive in a diverse and supportive workplace.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
As a highly skilled Backend Developer, you will utilize your expertise in Kotlin and Java to design, develop, and deploy scalable backend services and microservices for modern cloud-native applications. Your key responsibilities will include building RESTful APIs, deploying applications on AWS, containerizing services using Docker and Kubernetes, implementing monitoring solutions, and optimizing performance and reliability.

You will be expected to work closely with frontend developers, DevOps engineers, and product managers to ensure seamless integration and functionality. Your strong programming experience in Kotlin and Java, along with knowledge of RESTful APIs, AWS services, Kubernetes, Docker, and CI/CD pipelines, will be essential in this role. Additionally, familiarity with databases, software engineering best practices, and design patterns is required.

Preferred skills such as experience with reactive programming, Infrastructure as Code using Terraform or CloudFormation, event-driven architectures, and knowledge of secure coding practices and application monitoring tools are a plus. With 6-8 years of experience in Java development, including Core Java, Hibernate, J2EE, JSP, and Kotlin, you are well-equipped to excel in this position.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Position Senior Engineer - Data Engineer

Job Description

Position - Data Engineer

What You Will Be Doing: Design and develop real-time software and cloud/web/mobile-based software applications. Analyze domain-specific technical, high-level, or low-level requirements and modify them as per end-customer or system requirements. Perform software testing, including unit, functional, and system-level testing, both manual and automated. Perform code reviews following coding guidelines and static code analysis, and troubleshoot software problems of limited difficulty. Document technical deliverables such as software specifications, design documents, code comments, test cases, test reports, and release notes throughout the project life cycle. Develop software solutions using established programming languages or by learning new languages required for specific projects.

What Are We Looking For: Experience: 4 to 8 years in software/data engineering. Data Technologies: Proficiency in SQL, NoSQL databases (e.g., DynamoDB, MongoDB), ETL tools, and data warehousing solutions. Programming Languages: Proficiency in Python is a must. Cloud Platforms: Azure, AWS (e.g., EC2, S3, RDS), or GCP. Visualization Tools: Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Data Governance: Knowledge of data governance and security practices. CI/CD: Experience with DevOps practices, including CI/CD pipelines and containerization (Docker, Kubernetes). Communication Skills: Excellent verbal and written communication skills in English. Agile Methodologies: Experience working in Agile development environments. AI/ML Awareness: Understanding of AI and ML concepts, frameworks (e.g., TensorFlow, PyTorch), and practical applications. Generative AI Awareness: Familiarity with Generative AI technologies and their potential use cases.

Location - Indore/Ahmedabad

Location: IN-GJ-Ahmedabad, India-Ognaj (eInfochips) Time Type Full time Job Category Engineering Services
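As a small, hedged example of the SQL-to-NoSQL pipeline work this listing describes, the sketch below extracts rows from a relational staging table and loads them into a DynamoDB table with boto3. The table names, columns, and the use of SQLite as a stand-in source are illustrative assumptions only.

```python
# Minimal, illustrative ETL step in Python: extract rows from a relational source,
# apply a light transformation, and load them into DynamoDB.
# Table names, columns, and credentials are hypothetical.
import sqlite3                    # stands in for any SQL source (MySQL/PostgreSQL in practice)
from decimal import Decimal
import boto3

def extract(conn):
    cur = conn.execute("SELECT device_id, reading_ts, temperature_c FROM sensor_readings")
    for device_id, reading_ts, temperature_c in cur:
        # DynamoDB expects Decimal rather than float for numeric attributes
        yield {"device_id": device_id, "reading_ts": reading_ts,
               "temperature_c": Decimal(str(temperature_c))}

def load(rows, table_name="SensorReadings"):
    table = boto3.resource("dynamodb").Table(table_name)
    with table.batch_writer() as batch:   # batch_writer handles batching and retries
        for row in rows:
            batch.put_item(Item=row)

if __name__ == "__main__":
    conn = sqlite3.connect("local_staging.db")   # hypothetical staging database
    load(extract(conn))
```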
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Cloud Data Engineer, you should have experience with: AWS Cloud technology for data processing and a good understanding of AWS architecture; compute services such as EC2, Lambda, Auto Scaling, and VPC; storage and container services such as ECS, S3, DynamoDB, and RDS; Management & Governance services such as KMS, IAM, CloudFormation, CloudWatch, and CloudTrail; analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift; and solution delivery for data processing components in larger end-to-end projects.

Desirable Skillsets / Good To Have: AWS Certified professional. Experience with data processing on Databricks and Unity Catalog. Ability to drive projects technically with right-first-time deliveries within schedule and budget. Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well. Understands different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs. Experienced with planning, estimating, organizing, and working on multiple projects.

You may be assessed on the key critical skills relevant for success in the role, such as risk and control, change and transformation, business acumen, strategic thinking, and digital technology, as well as job-specific skillsets. This role will be based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex / sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
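As a loose illustration of the analytics-service experience the listing above calls for (Glue, Athena, Lake Formation), here is a minimal boto3 sketch that submits an Athena query against a data-lake table and polls for completion; the region, database, table, and S3 output location are hypothetical.

```python
# Illustrative only: run an Athena query against a data-lake table and wait for the
# result. Region, database, table, and output location below are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

def run_query(sql, database="curated_zone", output="s3://example-athena-results/"):
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    while True:                                   # simple polling loop
        state = athena.get_query_execution(
            QueryExecutionId=qid
        )["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)

rows = run_query("SELECT trade_date, COUNT(*) AS trades FROM trades GROUP BY trade_date")
print(rows["ResultSet"]["Rows"][:5])
```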
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as risk and control, change and transformation, business acumen, strategic thinking, and digital technology, as well as job-specific skillsets.

To be successful as a Cloud Data Engineer, you should have experience with: AWS Cloud technology for data processing and a good understanding of AWS architecture; compute services such as EC2, Lambda, Auto Scaling, and VPC; storage and container services such as ECS, S3, DynamoDB, and RDS; Management & Governance services such as KMS, IAM, CloudFormation, CloudWatch, and CloudTrail; analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift; and solution delivery for data processing components in larger end-to-end projects.

Desirable Skillsets / Good To Have: AWS Certified professional. Experience with data processing on Databricks and Unity Catalog. Ability to drive projects technically with right-first-time deliveries within schedule and budget. Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well. Understands different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs. Experienced with planning, estimating, organising, and working on multiple projects.

This role will be based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex / sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Position Senior Engineer - Data Engineer

Job Description

Position - Data Engineer

What You Will Be Doing: Design and develop real-time software and cloud/web/mobile-based software applications. Analyze domain-specific technical, high-level, or low-level requirements and modify them as per end-customer or system requirements. Perform software testing, including unit, functional, and system-level testing, both manual and automated. Perform code reviews following coding guidelines and static code analysis, and troubleshoot software problems of limited difficulty. Document technical deliverables such as software specifications, design documents, code comments, test cases, test reports, and release notes throughout the project life cycle. Develop software solutions using established programming languages or by learning new languages required for specific projects.

What Are We Looking For: Experience: 4 to 8 years in software/data engineering. Data Technologies: Proficiency in SQL, NoSQL databases (e.g., DynamoDB, MongoDB), ETL tools, and data warehousing solutions. Programming Languages: Proficiency in Python is a must. Cloud Platforms: Azure, AWS (e.g., EC2, S3, RDS), or GCP. Visualization Tools: Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Data Governance: Knowledge of data governance and security practices. CI/CD: Experience with DevOps practices, including CI/CD pipelines and containerization (Docker, Kubernetes). Communication Skills: Excellent verbal and written communication skills in English. Agile Methodologies: Experience working in Agile development environments. AI/ML Awareness: Understanding of AI and ML concepts, frameworks (e.g., TensorFlow, PyTorch), and practical applications. Generative AI Awareness: Familiarity with Generative AI technologies and their potential use cases.

Location - Indore/Ahmedabad

Location: IN-GJ-Ahmedabad, India-Ognaj (eInfochips) Time Type Full time Job Category Engineering Services
Posted 1 week ago
7.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Position Senior/Lead Dotnet Engineer

Job Description

Strong knowledge of C#, .NET, .NET Core 8.0, ASP.NET Core, MVC, WebAPI, RESTful services, and Entity Framework. Strong in OOP concepts, exception handling, collections, data structures, and algorithms. Familiar with various design and architectural patterns, with an understanding of the fundamental design principles behind a scalable application [MUST for 7+ years]. Good knowledge of Microsoft SQL Server or MySQL. Skilled at writing reusable C# libraries. Excellent analytical, problem-solving, and troubleshooting skills. Working knowledge of source control systems like Git and Bitbucket.

Nice to have: Experience following CI/CD and DevOps methodologies. Good to have knowledge of UI frameworks/libraries like AngularJS, ReactJS, or NextJS, plus JavaScript and TypeScript. Experience in Test Driven Development (TDD). Good to have knowledge of NoSQL databases such as DynamoDB.

Location: IN-GJ-Ahmedabad, India-Ognaj (eInfochips) Time Type Full time Job Category Engineering Services
Posted 1 week ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Position Senior Engineer-AWS DevOps

Job Description

Key Responsibilities: Manage and optimize cloud infrastructure operations in AWS. Implement CI/CD pipelines and maintain infrastructure automation for seamless deployments; proficiency in CI/CD tools such as Jenkins, GitHub Actions, or Bitbucket Pipelines. Collaborate with DevOps, Application Development, and Security teams to resolve issues and drive platform enhancements. Support multi-account governance using AWS Organizations and ensure compliance with cloud security best practices. Utilize Infrastructure as Code (IaC) with Terraform and configuration management with Ansible. Strong expertise in AWS core services (EC2, S3, RDS, Lambda, CloudWatch, Config, Control Tower, DynamoDB, EKS). Knowledge of networking and security architectures (VNets, firewalls, NATs, ACLs, security groups, routing). Proficiency in programming languages such as Python and Go, and scripting languages (Bash, PowerShell). Experience with containerization technologies like Docker and orchestration tools like Kubernetes. Assist in troubleshooting production environments, ensuring high availability and reliability. Develop dashboards using Grafana for cloud monitoring and performance visualization. Implement security measures throughout the development and deployment lifecycle. Develop and enforce best practices for infrastructure provisioning and configuration management.

Required Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a DevOps Engineer or in a similar role in the IT industry. 6+ years of experience in cloud infrastructure engineering, with a strong focus on automation. 8+ years of hands-on experience with AWS.

Location: IN-GJ-Ahmedabad, India-Ognaj (eInfochips) Time Type Full time Job Category Engineering Services
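To make the "automation with Python" aspect of this listing concrete, here is a small, illustrative housekeeping script that prunes EBS snapshots older than a retention window using boto3. The retention value and the dry-run default are assumptions for the sketch, not a recommendation for any real environment.

```python
# Illustrative EBS snapshot housekeeping with boto3. The 30-day retention and
# the dry-run default are assumptions for the sketch, not operational guidance.
from datetime import datetime, timedelta, timezone
import boto3

RETENTION_DAYS = 30                      # hypothetical retention policy
ec2 = boto3.client("ec2")

def prune_old_snapshots(dry_run=True):
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    paginator = ec2.get_paginator("describe_snapshots")
    for page in paginator.paginate(OwnerIds=["self"]):
        for snap in page["Snapshots"]:
            if snap["StartTime"] < cutoff:
                print(f"Deleting {snap['SnapshotId']} from {snap['StartTime']:%Y-%m-%d}")
                if not dry_run:
                    ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])

if __name__ == "__main__":
    prune_old_snapshots(dry_run=True)    # review the output before flipping to False
```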
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a Senior Python Data Application Developer with strong expertise in core Python and data-focused libraries. Your primary responsibility is to design, develop, and maintain data-driven applications optimized for performance and scalability. You will build robust data pipelines, ETL processes, and APIs for integrating various data sources efficiently within the cloud environment.

In this role, you will work on AWS using serverless and microservices architectures, utilizing services such as AWS Lambda, API Gateway, S3, DynamoDB, Kinesis, and other AWS tools as required. Collaboration with cross-functional teams is essential to deliver feature-rich applications that meet business requirements. You will apply software design principles and best practices to ensure applications are maintainable, modular, and highly testable. Your tasks will also involve setting up monitoring solutions to proactively monitor application performance, detect anomalies, and resolve issues. Optimizing data applications for cost, performance, and reliability on AWS is a crucial aspect of your role.

To excel in this position, you should have at least 5 years of professional experience in data-focused application development using Python. Proficiency in core Python and data libraries such as Pandas, NumPy, and PySpark is required. You must possess a strong understanding of AWS services like ECS, Lambda, API Gateway, S3, DynamoDB, Kinesis, etc. Experience building highly distributed and scalable solutions via serverless, microservice, and service-oriented architectures is essential.

Furthermore, you should be familiar with unit test frameworks, code quality tools, and CI/CD practices. Knowledge of database management and ORM concepts, along with experience with both relational (PostgreSQL, MySQL) and NoSQL (DynamoDB) databases, is desired. An understanding of the end-to-end software development lifecycle and Agile methodology, as well as AWS certification, would be advantageous. Strong problem-solving abilities, attention to detail, critical thinking, and excellent communication skills are necessary for effective collaboration with technical and non-technical teams. Mentoring junior developers and contributing to a collaborative team environment are also part of your responsibilities.

This is a full-time position located in Bangalore with a hybrid work schedule. If you have proficiency in Pandas, NumPy, and PySpark, along with 5 years of experience in Python, we encourage you to apply and join our team dedicated to developing, optimizing, and deploying scalable data applications supporting company growth and innovation.
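For illustration of the serverless pattern this role describes (API Gateway in front of Lambda, backed by DynamoDB), here is a minimal handler sketch; the table name, key schema, and route are hypothetical.

```python
# Minimal sketch of a serverless endpoint: API Gateway (proxy integration) -> Lambda
# -> DynamoDB. Table and key names are hypothetical.
import json
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("CustomerOrders")   # hypothetical table

def handler(event, context):
    # Route shape assumed: GET /orders/{customer_id} via API Gateway proxy integration
    customer_id = event["pathParameters"]["customer_id"]
    result = table.query(KeyConditionExpression=Key("customer_id").eq(customer_id))

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        # default=str handles the Decimal values DynamoDB returns for numbers
        "body": json.dumps(result["Items"], default=str),
    }
```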
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Apply Before: 05-08-2025
Job Title: DevOps Engineer
Job Location: Pune
Employment Type: Full-Time
Experience Required: 5 Years
Salary: 20 LPA to 25 LPA (max)
Notice Period: Immediate joiners

We are looking for a DevOps Engineer with strong Kubernetes and cloud infrastructure experience to join our Pune-based team. The ideal candidate will play a key role in managing CI/CD pipelines, infrastructure automation, and cloud resource optimization, and in ensuring high availability and reliability of production systems.

Required Skills
Certified Kubernetes Administrator (CKA) - mandatory
Very good knowledge and operational experience with containerization and cluster management: infrastructure setup and production environment maintenance (Kubernetes, vCluster, Docker, Helm)
Very good knowledge and experience with high-availability requirements (RTO and RPO) on cloud (AWS preferred, with VPC, Subnets, ELB, Secrets Manager, EBS snapshots, EC2 security groups, ECS, CloudWatch and SQS)
Very good knowledge and experience in administrating Linux clients and servers
Experience working with data storage, backup and disaster recovery using DynamoDB, RDS PostgreSQL and S3
Good experience and confidence with code versioning (GitLab preferred)
Experience in automation with programming and IaC scripts (Python / Terraform)
Experience with SSO setup and user management with Keycloak and/or Okta SSO
Experience in service mesh monitoring setup with Istio, Kiali, Grafana, Loki and Prometheus
Experience with GitOps setup and management for ArgoCD / FluxCD
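As a small, hedged example of the Python automation this role mentions alongside Kubernetes administration, the sketch below uses the official Kubernetes Python client to report pods that are not in a healthy phase. It assumes a local kubeconfig is available; no cluster names here are real.

```python
# Illustrative operational check using the official Kubernetes Python client.
# Assumes a kubeconfig is available locally; nothing here refers to a real cluster.
from kubernetes import client, config

def report_unhealthy_pods():
    config.load_kube_config()        # use config.load_incluster_config() when running inside a pod
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")

if __name__ == "__main__":
    report_unhealthy_pods()
```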
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description

Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description

As a Solution Architect, you will play a critical role in designing and governing modern, cloud-native, and AI-enabled solutions. You will work closely with global technology teams to review solution designs, raise architectural risks, and ensure alignment with enterprise standards. Your influence will help shape the future of our technology landscape.

Key Responsibilities: Lead architectural reviews and provide feedback on solution designs, requirements, and non-functional aspects. Collaborate with global teams to ensure architectural alignment, drive reuse, and champion high-quality process and practice. Contribute technical content and vision to the Mainframe and Data Centre exit strategy for the UKI BI (Business Information) business, and support it through governance. Collaborate closely with colleagues in the other market verticals (Consumer, Marketing, etc.) to ensure technical alignment of the strategic cloud platform. Contribute architecture guidance and input to the shaping and delivery of the set of initiatives which drive the Mainframe and Data Centre exit for the BI business. Design scalable, secure, and resilient solutions using AWS services and, where applicable, interim and hybrid architectures on the mainframe or in the Experian Data Centre. Guide teams in applying microservices, serverless, and Domain-Driven Design (DDD) principles. Provide technical leadership in the integration of AI and Large Language Models (LLMs) to assist, in particular, in code conversion and testing efforts. Develop and maintain architectural documentation including blueprints, patterns, and technical specifications. Stay current with emerging technologies and industry best practices. Mentor engineering teams and promote architectural excellence across the organisation.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Understanding of cloud-native architectures and how to refine and tune these to drive desired outcomes. Proven experience as a Solution Architect, with at least 5 years in enterprise solution design. Strong knowledge of, or familiarity with, mainframe technologies such as CICS, COBOL, VSAM, DB2, and M204. Strong hands-on experience with AWS (e.g., Lambda, Glue, EKS, ECS, API Gateway, S3, DynamoDB, DocumentDB, RDS). Solid understanding of microservices, containers, serverless, and cloud architecture patterns. Experience working with LLMs and AI/ML frameworks. Strong influencing and communication skills to engage with global stakeholders. Knowledge of architectural frameworks (e.g., TOGAF, Zachman) is a plus.
Experience with Enterprise Messaging Frameworks and message backbones would be advantageous. Experience with Enterprise Search Engines would be advantageous. Experience with Agile methodologies and CI/CD practices is advantageous.

Solution Architect - (BI/Mainframe/AWS Focus)

Additional Information

Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning; World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits: Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavour, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are looking for a skilled and motivated Software Engineer who excels in Java, data structures and algorithms, REST APIs, and Spring Boot to join our growing team. This role is key to building scalable, enterprise-grade applications and contributing to our innovative software development processes. This is a fantastic opportunity to solve complex challenges and contribute to building and growing cutting-edge technologies within a dynamic and collaborative environment.

Responsibilities: Collaborate with product owners and architects on design decisions. Develop end-user applications with high scalability and high throughput. Provide technical and design guidance and create standards. Improve, optimize, and identify opportunities for improved software development processes. Contribute to designing and maintaining enterprise applications.

Requirements: 4+ years of experience with Java and open-source Java frameworks like Spring Boot. At least 3 years' experience working with microservices and distributed-computing-based architectures at large scale. Hands-on experience with Java 8+ and REST APIs. Strong knowledge of data structures and algorithms: stacks, queues, linked lists, trees, searching, sorting, string manipulation, greedy algorithms. Background in core and enterprise design patterns, object-oriented programming, and distributed computing. Knowledge of creating and integrating APIs using REST and SOAP protocols; familiarity with gRPC/Thrift frameworks. Capability to work with AWS tools such as S3, Lambda, DynamoDB, and API Gateway. Proficiency in resiliency patterns (throttling, circuit breakers, bulkheading), error handling, and monitoring tools like Grafana, Kibana, and Prometheus. Familiarity with Git or similar version control tools and monorepositories. Experience in automated testing, including TDD and unit/functional/integration testing. Understanding of security mechanisms like OAuth 2.0, TLS, and OWASP best practices. Ability to communicate effectively in both written and spoken English with external and internal teams.

Technologies: Java 8 and above, data structures and algorithms, microservices, Spring Boot, REST APIs, design patterns, system design, AWS cloud.
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description & Requirements Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen. The Server Engineer will report to the Senior Engineering Manager. Responsibilities Design, develop, and run a fast, scalable, highly available game service all the way from conception to delivery to live service operations Work with designers, client engineering, and production teams to achieve gameplay goals Implement security best practices and original techniques to keep user data secure and prevent cheating Create and run automated testing, readiness testing, and deployment plans Monitor the performance and costs of the server infrastructure to improve our game Design and implement data transformation layers using Java/Spring/AWS/Protobuf Collaborate with game server and web frontend teams to define API contracts Manage Release Ops / Live Ops of web services Qualifications We encourage you to apply if you can meet most of the requirements and are comfortable opening a dialogue to be considered. 2+ years development of scalable back-end services BS degree in Computer Science or equivalent work experience Proficiency in PHP, Java Experience with Cloud services like Amazon Web Services or Google Cloud Platform Experience with Redis Experience with Database Design and usage of large datasets in both relational (MySQL, Postgres) and NoSQL (Couchbase, DynamoDB) environments Experience defining API contracts and collaborating with cross-functional teams Bonus 3+ years of experience developing games using cloud services like AWS, Azure, Google Cloud Platform, or similar Proficient in technical planning, solution research, proposal, and implementation Background using metrics and analytics to determine the quality or priority Comfortable working across client and server codebases Familiar with profiling, optimising, and debugging scalable data systems Passion for making and playing games About Electronic Arts We’re proud to have an extensive portfolio of games and experiences, locations around the world, and opportunities across EA. We value adaptability, resilience, creativity, and curiosity. From leadership that brings out your potential, to creating space for learning and experimenting, we empower you to do great work and pursue opportunities for growth. We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more. We nurture environments where our teams can always bring their best to what they do. Electronic Arts is an equal opportunity employer. All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. We will also consider employment qualified applicants with criminal records in accordance with applicable law. 
EA also makes workplace accommodations for qualified individuals with disabilities as required by applicable law.
Posted 1 week ago
0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
We’re looking for a Full Stack Developer to join our R&D team and play a pivotal role in building our next-generation enterprise analytics product. You'll collaborate closely with a world-class team of business consultants and engineers to tackle complex challenges using advanced data and analytics technologies. If you're passionate about delivering high-impact software in an agile environment and thrive on solving real-world problems, we want to hear from you.

Your Responsibilities: Design, develop, and maintain scalable web applications using modern full stack technologies. Ensure the quality and performance of applications through automated testing and CI/CD pipelines. Participate actively in agile ceremonies (scrum, sprint planning, retrospectives). Write clean, maintainable, and efficient code while following best practices and standards. Collaborate with cross-functional teams, mentoring junior developers and contributing to architectural decisions. Drive solutions from idea to production, balancing speed with code quality and scalability. Continuously explore and integrate new technologies and development practices.

Technical Skills & Experience Required: 3+ years of experience working as a full stack developer.

Backend: Proficiency in at least one backend language: Node.js, Python, Go, Java, C#, or others. Frameworks such as Express, Spring Boot, .NET Core, FastAPI, etc. API-first development using REST or GraphQL. Experience with both SQL (PostgreSQL, SQL Server, MySQL) and NoSQL (MongoDB, DynamoDB) databases. Exposure to event-driven architectures, message brokers (Kafka, RabbitMQ), or serverless backends.

Frontend: Hands-on experience with JavaScript-based SPAs using Vue.js, React, or Angular. Solid understanding of HTML, CSS, SCSS, and responsive design principles. Ability to translate

About Us: We're an international team who specialize in building technology products and then helping brands grow with multi-channel demand generation marketing. We have in-house experience working for Fortune companies, e-commerce brands, and technology SaaS companies. We have assisted over a dozen billion-dollar companies with consulting, technology, operations, and digital agency capabilities in managing their unique brand online. We have a fun and friendly work culture that also encourages employees personally and professionally. EbizON has many values that are important to our success as a company: integrity, creativity, innovation, mindfulness and teamwork. We thrive on the idea of making life better for people by providing them with peace of mind. The people here love what they do because everyone, from management all the way down, understands what it means to live up to one's ideals, which makes every day feel less stressful, knowing each person has somebody cheering them on.

Equal Opportunity Employer: EbizON is committed to providing equal opportunity for all employees, and we will consider any qualified applicant without regard to race or other prohibited characteristics.

Flexible Timings: Flexible working hours are the new normal. We at EbizON believe in giving employees the freedom to choose when to work and how to work. It helps them thrive and also balance their lives better.

Global Clients Exposure: Our goal is to provide excellent customer service, and we want our employees to work closely with clients from around the world. That's why you'll find us working closely with clients from around the world through Microsoft Teams, Zoom and other video conferencing tools.
Retreats & Celebrations: With annual retreats, quarterly town halls and festive celebrations, we have a lot of opportunities to get together.
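As a minimal illustration of the API-first backend work listed under Technical Skills above (FastAPI is one of the frameworks named), here is a small sketch with an in-memory store standing in for a real database; the resource model and routes are hypothetical.

```python
# Minimal, illustrative API-first service with FastAPI. The resource model, routes,
# and in-memory store are hypothetical stand-ins, not a real product design.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Analytics Metrics API")

class Metric(BaseModel):
    name: str
    value: float

_store: dict[str, Metric] = {}        # in-memory stand-in for a real database

@app.post("/metrics", status_code=201)
def create_metric(metric: Metric) -> Metric:
    _store[metric.name] = metric
    return metric

@app.get("/metrics/{name}")
def read_metric(name: str) -> Metric:
    if name not in _store:
        raise HTTPException(status_code=404, detail="metric not found")
    return _store[name]

# Run locally with: uvicorn main:app --reload   (assuming this file is saved as main.py)
```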
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description

Roanuz is a dynamic company specializing in Sports Data and Ecommerce Consulting, with over 50 employees across three offices. Our flagship product, CricketAPI.com, is the world's leading cricket data provider, used by more than 75% of the top 30 fantasy apps. We have collaborated with industry giants like Amazon, Opera, Reliance Jio, and Zomato. We are dedicated to providing unparalleled expertise and excellent customer service to our clients.

Role Description

This is a full-time on-site role for a Full Stack Engineer, located in Chennai. The Full Stack Engineer will be responsible for both front-end and back-end web development tasks, including designing user interactions and developing the servers and databases behind website functionality. Day-to-day tasks include writing clean and functional code on the front end and back end, testing and fixing bugs or other coding issues, optimizing existing code, and collaborating with other team members to improve the development process.

Qualifications: 3+ years of relevant work experience in Python. Bachelor's degree in Computer Science (or a related field). Strong in OOP concepts. Knowledge of Docker and containers is highly desirable. Knowledge of at least one of the databases MongoDB, Cassandra, DynamoDB, or MySQL. Hands-on experience with AWS services like EC2, Lambda, and ECS. Experience working with microservices is an added advantage. Excellent communication skills. Self-motivated with demonstrated team player skills.

Why should you join Roanuz? Working at Roanuz is about joining a team of innovative minds who are passionate about solving complex problems every day. As a Ruman, you will get to work in a fun, fast-paced, and supportive culture. Our striking success comes from our people, and investing in our team is our top priority! We're proud of the culture we've built over the last nine years. We're a team that fails fast but learns faster and finds joy in progress over perfection. Being a Ruman is about joining a team of smart people in an intensely fun environment! Flexible working hours.

Why Roanuz? At Roanuz, you'll join a passionate team of problem-solvers driven by innovation and real-world impact. Visit our website www.roanuz.com to learn more about us!
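Purely as an illustration of the database experience this listing mentions, here is a minimal PyMongo sketch; the connection string, collection, and document fields are hypothetical.

```python
# Illustrative document-store access with PyMongo, one of the databases listed above.
# Connection string, database, collection, and fields are hypothetical.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")     # hypothetical connection string
matches = client["sports"]["matches"]

# Index the lookups an API endpoint would serve most often
matches.create_index([("tournament", ASCENDING), ("start_time", ASCENDING)])

matches.insert_one({
    "tournament": "Example Cup",
    "teams": ["Team A", "Team B"],
    "start_time": "2025-08-01T14:00:00Z",
    "status": "scheduled",
})

upcoming = matches.find({"status": "scheduled"}).sort("start_time", ASCENDING).limit(10)
for match in upcoming:
    print(match["tournament"], "-", " vs ".join(match["teams"]))
```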
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Dear Aspirants,

Greetings from ValueLabs! We are hiring for a Java developer.

Skill Set: Java, AWS Lambda, Microservices. Experience: 5+ Years. NP: Immediate. Location: Hyderabad - Hybrid.

Roles and Responsibilities / Job Description:
• Minimum of 5+ years of total experience in Java development, with strong experience working with AWS cloud services.
• Proficient in Java, Spring Boot, Hibernate, and other popular Java frameworks.
• Experience with AWS services such as Lambda, ECS, EKS, RDS, DynamoDB, and S3.
• Knowledge of containerization technologies (Docker, Kubernetes).
• Familiarity with CI/CD tools (Jenkins, GitLab CI).
• Strong problem-solving and collaboration skills.
• Excellent communication and collaboration skills.
• Desired but not required: AWS Certified Solutions Architect, AWS Certified Developer Associate, and/or ISO 27001 certification.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
What We Are Looking For:
We're looking for a passionate and experienced Software Engineer to join our growing API & Exports team at Meltwater. This team is responsible for enabling programmatic access to data across the app: handling thousands of exports daily, improving API usability, and managing API integrations and performance at scale. You'll work on expanding and optimizing our export functionalities, building scalable APIs, and integrating robust monitoring and management. This is a high-impact team working at the core of our data delivery platform.
What You'll Do:
Own the design, development, and optimization of API and export features.
Collaborate closely with product managers and senior engineers to define functionality and scale.
Enhance developer experience by making APIs easier to consume and integrate.
Participate in building robust export pipelines, streaming architectures, and webhook integrations.
Maintain high observability and reliability standards using tools like Coralogix, CloudWatch, and Grafana.
Participate in on-call rotations and incident response for owned services.
What You'll Bring:
3+ years of software engineering experience with a strong focus on Golang (preferred), Java, or C++.
Experience designing and developing RESTful APIs.
Experience working with cloud-native applications (preferably AWS).
Good understanding of microservice architecture and backend design principles.
Solid knowledge of Postgres, Redis, and ideally DynamoDB.
Nice to Have:
Familiarity with asynchronous or event-driven architectures using tools like SQS, SNS, or webhooks.
Exposure to DevOps workflows and tools (Terraform, Docker, Kubernetes, etc.).
Experience working with data exports, reporting systems, or data streaming.
Experience improving developer experience around APIs (e.g., OpenAPI, Swagger, static site generators).
Familiarity with JWT authentication, API gateways, and rate-limiting strategies.
Experience with accessibility and compliance standards for APIs and data handling.
Experience with observability tools and practices.
Our Tech Stack:
Languages: Golang, some JavaScript/TypeScript
Infrastructure: AWS, S3, Lambda, SQS, SNS, CloudFront, Kubernetes (Helm), Kong
Databases: Postgres, Redis, DynamoDB
Monitoring: Coralogix, Grafana, OpenSearch
CI/CD & IaC: GitHub Actions, Terraform
What We Offer:
Flexible paid time off options for enhanced work-life balance.
Comprehensive health insurance tailored for you.
Employee assistance programs covering mental health, legal, financial, wellness, and behavioral areas to ensure your overall well-being.
Complimentary CalmApp subscription for you and your loved ones, because mental wellness matters.
An energetic work environment with a hybrid work style, providing the balance you need.
A family leave program which grows with your tenure at Meltwater.
An inclusive community and ongoing professional development opportunities to elevate your career.
Where You'll Work: Hitec City, Hyderabad.
When You'll Join: As per the offer letter.
Our Story
At Meltwater, we believe that when you have the right people in the right environment, great things happen. Our best-in-class technology empowers our 27,000 customers around the world to make better business decisions through data. But we can't do that without our global team of developers, innovators, problem-solvers, and high-performers who embrace challenges and find new solutions for our customers.
Our award-winning global culture drives everything we do and creates an environment where our employees can make an impact, learn every day, feel a sense of belonging, and celebrate each other's successes along the way. We are innovators at the core who see the potential in people, ideas and technologies. Together, we challenge ourselves to go big, be bold, and build best-in-class solutions for our customers. We're proud of our diverse team of 2,200+ employees in 50 locations across 25 countries around the world. No matter where you are, you'll work with people who care about your success and get the support you need to unlock new heights in your career. We are Meltwater. Inspired by innovation, powered by people.
Equal Employment Opportunity Statement
Meltwater is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind. At Meltwater, we are dedicated to fostering an inclusive and diverse workplace where every employee feels valued, respected, and empowered. We are committed to the principle of equal employment opportunity and strive to provide a work environment that is free from discrimination and harassment. All employment decisions at Meltwater are made based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, veteran status, or any other status protected by the applicable laws and regulations. Meltwater does not tolerate discrimination or harassment of any kind, and we actively promote a culture of respect, fairness, and inclusivity. We encourage applicants of all backgrounds, experiences, and abilities to apply and join us in our mission to drive innovation and make a positive impact in the world.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Extensive experience in analytics and large-scale data processing across diverse data platforms and tools.
Manage data storage and transformation across AWS S3, DynamoDB, Postgres, and Delta tables, with efficient schema design and partitioning.
Develop scalable analytics solutions using Athena and automate workflows with proper monitoring and error handling (see the short pipeline sketch below).
Ensure data quality, access control, and compliance through robust validation, logging, and governance practices.
Design and maintain data pipelines using Python, Spark, the Delta Lake framework, AWS Step Functions, EventBridge, AppFlow, and OAuth.
Tech Stack: S3, Postgres, DynamoDB, Tableau, Python, Spark
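As a rough illustration of the pipeline work listed above, here is a minimal PySpark sketch that cleans raw data from S3 and writes a date-partitioned dataset back to S3 for downstream querying (for example via Athena). Bucket paths and column names are assumptions, and plain Parquet output is used for brevity; a Delta Lake setup would swap in the Delta output format.

```python
# Illustrative sketch only: read raw JSON from S3, apply a basic quality step,
# and write a date-partitioned dataset back to S3. Bucket names and columns
# are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

orders = (
    spark.read.json("s3://example-raw-bucket/orders/")       # hypothetical source
    .withColumn("order_date", F.to_date("created_at"))       # derive partition column
    .dropDuplicates(["order_id"])                             # simple data-quality step
)

(
    orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")           # hypothetical target
)
```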
Posted 1 week ago
5.0 years
4 - 9 Lacs
Hyderābād
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com
Job Description
As a Solution Architect, you will play a critical role in designing and governing modern, cloud-native, and AI-enabled solutions. You will work closely with global technology teams to review solution designs, raise architectural risks, and ensure alignment with enterprise standards. Your influence will help shape the future of our technology landscape.
Key Responsibilities
Lead architectural reviews and provide feedback on solution designs, requirements, and non-functional aspects.
Collaborate with global teams to ensure architectural alignment, drive reuse, and champion high-quality process and practice.
Contribute technical content and vision to the Mainframe and Data Centre exit strategy for the UKI BI (Business Information) business, and support it through governance.
Collaborate closely with colleagues in the other market verticals (Consumer, Marketing, etc.) to ensure technical alignment of the strategic cloud platform.
Contribute architecture guidance and input to the shaping and delivery of the set of initiatives which drive the Mainframe and Data Centre exit for the BI business.
Design scalable, secure, and resilient solutions using AWS services and, where applicable, interim and hybrid architectures on the mainframe or in the Experian Data Centre.
Guide teams in applying microservices, serverless, and Domain-Driven Design (DDD) principles.
Provide technical leadership in the integration of AI and Large Language Models (LLMs) to assist, in particular, in code conversion and testing efforts.
Develop and maintain architectural documentation, including blueprints, patterns, and technical specifications.
Stay current with emerging technologies and industry best practices.
Mentor engineering teams and promote architectural excellence across the organisation.
Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.
Understanding of cloud-native architectures and how to refine and tune them to drive desired outcomes.
Proven experience as a Solution Architect, with at least 5 years in enterprise solution design.
Strong knowledge of, or familiarity with, mainframe technologies such as CICS, COBOL, VSAM, DB2, and M204.
Strong hands-on experience with AWS (e.g., Lambda, Glue, EKS, ECS, API Gateway, S3, DynamoDB, DocumentDB, RDS).
Solid understanding of microservices, containers, serverless, and cloud architecture patterns.
Experience working with LLMs and AI/ML frameworks.
Strong influencing and communication skills to engage with global stakeholders.
Knowledge of architectural frameworks (e.g., TOGAF, Zachman) is a plus.
Experience with enterprise messaging frameworks and message backbones would be advantageous.
Experience with enterprise search engines would be advantageous.
Experience with Agile methodologies and CI/CD practices is advantageous.
Role: Solution Architect (BI/Mainframe/AWS Focus)
Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.
Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here.
Posted 1 week ago
6.0 years
15 - 20 Lacs
Hyderābād
Remote
Job Title: Node.js Developer
Location: Hyderabad
Job Type: Full-Time
Experience: 6+ years (relevant: 5+ years)
About the Role
We're looking for a passionate and experienced Node.js Developer to join our growing team. The ideal candidate will have strong backend development expertise, experience working with serverless architectures, and a background in eCommerce platforms. If you're someone who thrives in a fast-paced, cloud-first environment and loves building scalable backend services, we'd love to hear from you.
Key Responsibilities
● Develop, maintain, and enhance backend services using Node.js and TypeScript
● Design and implement serverless architectures on AWS (Lambda, API Gateway, DynamoDB, etc.); a short illustrative sketch of this pattern follows this posting
● Work with MySQL databases, including schema design and query optimization
● Collaborate with cross-functional teams, including frontend developers, designers, and product managers
● Implement and maintain CI/CD pipelines for seamless deployment and delivery
● Ensure scalability, performance, and security of backend services
● Troubleshoot and debug production issues quickly and efficiently
● Contribute to architecture decisions and code reviews
Required Skills & Experience
● 3+ years of hands-on experience with Node.js
● Strong knowledge of TypeScript
● Experience building and deploying serverless applications on AWS
● Proficiency with MySQL or other relational databases
● Prior experience in the eCommerce domain or platforms
● Familiarity with AWS services: Lambda, S3, API Gateway, CloudWatch, etc.
● Experience setting up and managing CI/CD pipelines (e.g., GitHub Actions, GitLab CI, CodePipeline)
● Understanding of secure coding practices and scalable backend patterns
Nice to Have
● Knowledge of front-end frameworks (e.g., React, Next.js) for full-stack flexibility
● Experience with NoSQL databases (e.g., DynamoDB)
● Familiarity with containerization (e.g., Docker)
Why Join Us?
● Work with a talented, collaborative team in a high-impact role
● Opportunity to work on modern, cloud-native tech stacks
● Flexible working hours and a remote-friendly culture
Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,000,000.00 per year
Schedule: Evening shift, Monday to Friday
Work Location: In person
Application Deadline: 01/08/2025
Expected Start Date: 01/08/2025
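Purely as an illustration of the Lambda-behind-API-Gateway pattern referenced in the responsibilities, and to keep this page's examples in a single language, here is a minimal Python handler sketch; the role itself is Node.js/TypeScript. The table, key, and route names are assumptions.

```python
# Illustrative sketch of a Lambda handler behind an API Gateway proxy integration
# that reads an item from DynamoDB. Table and key names are hypothetical; the
# route is assumed to be GET /products/{product_id}.
import json
import boto3

table = boto3.resource("dynamodb").Table("products")  # hypothetical table


def handler(event, context):
    """Return one product by its path parameter, or 404 if it does not exist."""
    product_id = event["pathParameters"]["product_id"]
    resp = table.get_item(Key={"product_id": product_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    # default=str handles DynamoDB Decimal values during JSON serialization
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```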
Posted 1 week ago
4.0 years
4 - 9 Lacs
Gurgaon
On-site
About the Team: Join a highly skilled and collaborative team dedicated to ensuring data reliability, performance, and security across our organization's critical systems. We work closely with developers, architects, and DevOps professionals to deliver seamless and scalable database solutions in a cloud-first environment, leveraging the latest in AWS and open-source technologies. Our team values continuous learning, innovation, and the proactive resolution of database challenges.
About the Role: As a Database Administrator specializing in MySQL and Postgres within AWS environments, you will play a key role in architecting, deploying, and supporting the backbone of our data infrastructure. You'll leverage your expertise to optimize database instances, manage large-scale deployments, and ensure our databases are secure, highly available, and resilient. This is an opportunity to collaborate across teams, stay ahead with emerging technologies, and contribute directly to our business success.
Responsibilities:
Design, implement, and maintain MySQL and Postgres database instances on AWS, including managing clustering and replication (MongoDB and Postgres solutions).
Write, review, and optimize stored procedures, triggers, functions, and scripts for automated database management.
Continuously tune, index, and scale database systems to maximize performance and handle rapid growth.
Monitor database operations to ensure high availability, robust security, and optimal performance.
Develop, execute, and test backup and disaster recovery strategies in line with company policies (a short automation sketch follows this posting).
Collaborate with development teams to design efficient and effective database schemas aligned with application needs.
Troubleshoot and resolve database issues, implementing corrective actions to restore service and prevent recurrence.
Enforce and evolve database security best practices, including access controls and compliance measures.
Stay updated on new database technologies, AWS advancements, and industry best practices.
Plan and perform database migrations across AWS regions or instances.
Manage clustering, replication, installation, and sharding for MongoDB, Postgres, and related technologies.
Requirements:
4-7 years of experience with database management systems as a Database Engineer.
Proven experience as a MySQL/Postgres Database Administrator in high-availability, production environments.
Expertise in AWS cloud services, especially EC2, RDS, Aurora, DynamoDB, S3, and Redshift.
In-depth knowledge of disaster recovery (DR) setups, including active-active and active-passive master configurations.
Hands-on experience with MySQL partitioning and AWS Redshift.
Strong understanding of database architectures, replication, clustering, and backup strategies (including Postgres replication and backup).
Advanced proficiency in optimizing and troubleshooting SQL queries; adept with performance tuning and monitoring tools.
Familiarity with scripting languages such as Bash or Python for automation and maintenance.
Experience with MongoDB, Postgres clustering, Cassandra, and related NoSQL or distributed database solutions.
Ability to provide 24/7 support and participate in on-call rotation schedules.
Excellent problem-solving, communication, and collaboration skills.
What we offer?
A positive, get-things-done workplace.
A dynamic, constantly evolving space (change is par for the course, so it is important that you are comfortable with this).
An inclusive environment that ensures we listen to a diverse range of voices when making decisions.
Ability to learn cutting-edge concepts and innovate in an agile start-up environment operating at global scale.
Access to 5,000+ training courses, accessible anytime/anywhere to support your growth and development (with top learning partners like Harvard, Coursera, and Udacity).
About us:
At PayU, we are a global fintech investor and our vision is to build a world without financial borders where everyone can prosper. We give people in high-growth markets the financial services and products they need to thrive. Our expertise in 18+ high-growth markets enables us to extend the reach of financial services. This drives everything we do, from investing in technology entrepreneurs to offering credit to underserved individuals, to helping merchants buy, sell, and operate online. Being part of Prosus, one of the largest technology investors in the world, gives us the presence and expertise to make a real impact. Find out more at www.payu.com
Our Commitment to Building a Diverse and Inclusive Workforce
As a global and multi-cultural organization with varied ethnicities thriving across locations, we realize that our responsibility towards fulfilling the D&I commitment is huge. Therefore, we continuously strive to create a diverse, inclusive, and safe environment for all our people, communities, and customers. Our leaders are committed to creating an inclusive work culture which enables transparency, flexibility, and unbiased attention to every PayUneer so they can succeed, irrespective of gender, color, or personal faith. An environment where every person feels they belong, that they are listened to, and where they are empowered to speak up. At PayU we have zero tolerance towards any form of prejudice, whether against a specific race or ethnicity, persons with disabilities, or the LGBTQ communities.
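As an illustration of the Python automation mentioned in the requirements above, here is a minimal sketch that takes a manual RDS snapshot before a maintenance window and waits for it to complete. The instance identifier, naming scheme, and region are assumptions, not details from the posting.

```python
# Illustrative sketch only: create a timestamped manual RDS snapshot ahead of
# maintenance and block until it is available. The instance identifier and
# region are hypothetical; AWS credentials are assumed to be configured.
from datetime import datetime, timezone

import boto3

rds = boto3.client("rds", region_name="ap-south-1")


def snapshot_before_maintenance(instance_id: str) -> str:
    """Create a timestamped snapshot of an RDS instance and wait until it is ready."""
    snapshot_id = f"{instance_id}-pre-maint-{datetime.now(timezone.utc):%Y%m%d-%H%M%S}"
    rds.create_db_snapshot(
        DBInstanceIdentifier=instance_id,
        DBSnapshotIdentifier=snapshot_id,
    )
    waiter = rds.get_waiter("db_snapshot_available")
    waiter.wait(DBSnapshotIdentifier=snapshot_id)
    return snapshot_id


if __name__ == "__main__":
    # Hypothetical instance name used only for illustration.
    print(snapshot_before_maintenance("example-mysql-prod"))
```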
Posted 1 week ago