3.0 years
0 Lacs
Greater Bengaluru Area
On-site
We're actively looking for a Java Developer to join our team for an exciting in-house fintech project. The Associate Java Backend role requires working from the office all 5 days in Bangalore, while the Sr Java Developer role is a hybrid position offering the flexibility to work from any of our ITC locations, including Bangalore, Hyderabad, Pune, Kolkata, or Gurgaon.
Designation: Associate Java Developer. Experience: 3-5 Years. Location: Bangalore (work from office all 5 days).
Designation: Sr Java Developer. Experience: 5-8 Years. Location: Bangalore/Hyderabad/Pune/Kolkata/Gurgaon (hybrid, 3 days work from office).
Technical Skills:
Core Java: Strong Java programming skills, including experience with Java 8 (and ideally familiarity with newer versions). Robust knowledge of object-oriented design patterns and implementation experience. Strong understanding of data modeling techniques. Experience with multi-tier application architecture and high-performance distributed/in-memory caching solutions.
Frameworks & APIs: Spring Boot REST API development and consumption. Apache POI (for working with Microsoft Office formats).
Databases: Advanced knowledge and experience with relational databases such as MySQL and Sybase.
Testing: Extensive unit testing experience using JUnit 4+ (including Mockito and AssertJ). Integration testing experience. Familiarity with other testing frameworks such as Cucumber, Jest, and Cypress is a plus.
Tools & Methodologies: Maven (build automation). Git (version control: basic commands, branch creation, merging, etc.). SonarQube (code quality). Agile development methodologies. Strong foundation in SDLC best practices, including test-driven development, unit testing discipline, and CI/CD strategies.
Professional & Soft Skills:
Experience: Minimum of 5 years of practical software development experience.
Problem Solving: Creativity and resourcefulness to solve problems independently.
Coding Standards: Excellent coding practices and standards.
Communication: Good communication and stakeholder management skills.
Passion & Drive: Passion for engineering highly available, performant systems; curiosity and drive to learn new things and build new solutions.
Organization: Strong time management, organization, and attention to detail.
Posted 1 week ago
0 years
4 - 9 Lacs
Noida
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Welcome to the relentless pursuit of better.
Inviting applications for the role of Consultant – Java Full Stack. In this role, you will be responsible for developing Microsoft Access databases, including tables, queries, forms and reports, using standard IT processes, with data normalization and referential integrity.
Responsibilities: Experience with Spring Boot. Experience with microservices development. Extensive experience working with Java REST APIs. Extensive experience in Java 8-17 SE. Experience with unit testing frameworks such as JUnit and Mockito. Experience with Maven/Gradle. Experience in Angular 13+ and RxJS; NgRx will be an added advantage.
Professional, precise communication skills. Experience in API design, troubleshooting, and performance tuning. Experience in designing and troubleshooting Java API services and microservices. Experience with a CI/CD tool. Experience in Apache Kafka will be an added advantage.
Qualifications we seek in you!
Minimum Qualifications: BE / B.Tech / M.Tech / MCA. Excellent communication skills. Good team player.
Preferred Qualifications: Experience with Spring Boot. Experience with microservices development. Extensive experience working with Java REST APIs. Extensive experience in Java 8-17 SE. Experience with unit testing frameworks such as JUnit and Mockito. Experience with Maven/Gradle. Experience in Angular 13+ and RxJS; NgRx will be an added advantage.
Why join Genpact? Lead AI-first transformation: build and scale AI solutions that redefine industries. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Consultant. Primary Location: India-Noida. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jul 28, 2025, 6:13:42 AM. Unposting Date: Ongoing. Master Skills List: Consulting. Job Category: Full Time
Posted 1 week ago
5.0 years
14 - 21 Lacs
Noida
On-site
Position: Technical Lead Experience Required: 5 Years Location: Noida Sector 62
About the Role: As a Technical Lead at Benthon Labs Pvt Ltd, you'll build fully fledged platforms using a range of different technologies. You'll be involved in the entire product development lifecycle, including the design, development, deployment and maintenance of new and existing features. You'll write clean and functional code on the front- and back-end, collaborate closely with our development team to ensure system consistency and a great user experience, write reusable and maintainable code, and optimize web design for mobile for maximum speed. Ultimately, your work will have a direct impact on the stability and user experience of our products.
Key Requirements: 1. Minimum 5 years of development experience with back-end and front-end technologies. 2. Comfortable working with both front-end and back-end languages. 3. Strong knowledge of back-end technologies such as Node.js and of JavaScript frameworks (like Angular, React, and Vue). 4. Good knowledge of multiple front-end languages and libraries (like HTML, CSS and JavaScript). 5. Familiarity with databases (like MySQL and MongoDB), web servers (e.g. Apache) and cloud services such as AWS, Azure and GCP. 6. Hands-on experience with testing and debugging. 7. Analytical and good at time management. 8. Great communication and problem-solving skills. 9. Curious about new technologies and excited to find ways to implement them in your work. 10. Experience with coaching and mentoring other developers or a team of developers.
Must-Have Skills: 1. 5 years of development experience in front-end and back-end technologies. 2. Strong expertise in Node.js for back-end development. 3. Proficiency in JavaScript frameworks (React.js, Angular, or Vue.js). 4. Familiarity with databases (MySQL, MongoDB) and web servers (Apache, Nginx).
5. Hands-on experience with cloud services (AWS, Azure, or GCP). 6. Ability to mentor junior developers and drive technical projects forward. Job Types: Full-time, Permanent Pay: ₹1,400,000.00 - ₹2,100,000.00 per year Benefits: Flexible schedule Food provided Experience: Node.js: 3 years (Required) AWS: 1 year (Required) Team Lead: 1 year (Required) Work Location: In person
Posted 1 week ago
8.0 years
3 - 4 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities: Develop comprehensive digital analytics solutions utilizing Adobe Analytics for web tracking, measurement, and insight generation. Design, manage, and optimize interactive dashboards and reports using Power BI to support business decision-making. Lead the design, development, and maintenance of robust ETL/ELT pipelines integrating diverse data sources. Architect scalable data solutions leveraging Python for automation, scripting, and engineering tasks. Oversee workflow orchestration using Apache Airflow to ensure timely and reliable data processing. Provide leadership and develop robust forecasting models to support sales and marketing strategies. Develop advanced SQL queries for data extraction, manipulation, analysis, and database management. Implement best practices in data modeling and transformation using Snowflake and DBT; exposure to Cosmos DB is a plus. Ensure code quality through version control best practices using GitHub. Collaborate with cross-functional teams to understand business requirements and translate them into actionable analytics solutions. Stay updated with the latest trends in digital analytics; familiarity or hands-on experience with Adobe Experience Platform (AEP) / Customer Journey Analytics (CJA) is highly desirable. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications: Master's or Bachelor's degree in Computer Science, Information Systems, Engineering, Mathematics, Statistics, Business Analytics, or a related field. 8+ years of progressive experience in digital analytics, data analytics or business intelligence roles. Experience with data modeling and transformation using tools such as DBT and Snowflake; familiarity with Cosmos DB is a plus. Experience developing forecasting models and conducting predictive analytics to drive business strategy. Advanced proficiency in web and digital analytics platforms (Adobe Analytics). Proficiency in ETL/ELT pipeline development and workflow orchestration (Apache Airflow). Skilled in creating interactive dashboards and reports using Power BI or similar BI tools. Deep understanding of digital marketing metrics, KPIs, attribution models, and customer journey analysis. Industry certifications relevant to digital analytics or cloud data platforms. Ability to deliver clear digital reporting and actionable insights to stakeholders at all organizational levels.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. #NJP
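The forecasting responsibility listed above can be illustrated with a minimal sketch: a simple moving-average baseline in plain Python. The function name and figures are hypothetical, not an actual Optum model; real work would use richer time-series methods.

```python
# Minimal moving-average forecasting baseline (illustrative only, with
# hypothetical monthly figures).

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

monthly_sales = [120, 135, 128, 140, 152, 149]  # hypothetical data
print(moving_average_forecast(monthly_sales, window=3))  # mean of 140, 152, 149 -> 147.0
```

A baseline like this is also useful in practice as the benchmark a more sophisticated model has to beat.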
Posted 1 week ago
2.0 years
0 - 0 Lacs
Vaishali Nagar, Jaipur, Rajasthan
On-site
Job Title: AI Developer Company: Eoxys IT Solution Location: Jaipur, Rajasthan Experience: 1–2 Years Employment Type: Full-Time Education Qualification: BCA / MCA / B.Tech in Computer Science, IT, or related field
Key Skills Required: Strong programming skills in Python. Hands-on experience with TensorFlow, PyTorch, Keras. Experience building and deploying end-to-end ML pipelines. Solid understanding of model evaluation, cross-validation, and hyperparameter tuning. Familiarity with cloud platforms such as AWS, Azure, or GCP for AI/ML workloads. Knowledge of MLOps tools like MLflow, DVC, or Apache Airflow. Exposure to domains like Natural Language Processing (NLP), Computer Vision, or Reinforcement Learning.
Roles & Responsibilities: Develop, train, and deploy machine learning models for real-world applications. Implement scalable ML solutions using cloud platforms. Collaborate with cross-functional teams to integrate AI capabilities into products. Monitor model performance and conduct regular improvements. Maintain version control and reproducibility using MLOps practices.
Additional Requirements: Strong analytical and problem-solving skills. Passion for learning and implementing cutting-edge AI/ML technologies. Good communication and teamwork skills.
Salary: Based on experience and skillset. Apply Now to be a part of our innovative AI journey! Job Type: Full-time Pay: ₹15,000.00 - ₹40,000.00 per month Work Location: In person
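The model-evaluation skills mentioned above, cross-validation in particular, can be sketched in plain Python. Real pipelines would typically reach for scikit-learn's `KFold`; this stdlib version is only an illustration of how the folds partition the data.

```python
# Illustrative k-fold cross-validation index generator using only the
# standard library (a stand-in for scikit-learn's KFold).

def k_fold_indices(n_samples, k):
    """Yield (train_indices, validation_indices) pairs for k-fold CV."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        val = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, val
        start += size

folds = list(k_fold_indices(10, 5))
print(len(folds))   # 5 folds
print(folds[0][1])  # validation indices of the first fold: [0, 1]
```

Each sample appears in exactly one validation fold, which is what makes the averaged validation score an honest estimate of generalization.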
Posted 1 week ago
2.0 years
3 - 4 Lacs
Calcutta
On-site
Job Overview: RJS Tech Solution LLP is hiring a Linux / Web Server Administrator to manage and maintain our Linux servers and web hosting infrastructure. The role is full-time and requires working from our Rashbehari, Kolkata office.
Key Responsibilities: Manage and maintain Linux-based web servers (Apache, Nginx). Handle server security, backups, SSL, and performance tuning. Set up and troubleshoot domains, emails, DNS, and databases (MySQL/PostgreSQL). Perform regular system updates and automate tasks using shell scripts.
Requirements: 2+ years of Linux server administration experience. Hands-on experience with web hosting, cPanel/Webmin, and server security tools. Basic scripting knowledge (Bash/Shell). Must be willing to work full-time from our Kolkata office.
Why Join RJS Tech Solution LLP? Competitive salary: ₹3-₹4 LPA. Growth-oriented work environment with learning opportunities. Collaborative and supportive work culture.
Job Types: Full-time, Permanent Pay: ₹300,000.00 - ₹400,000.00 per year Schedule: Day shift, Fixed shift
Application Question(s): What is your current and expected CTC? Are you comfortable working full-time from our Rashbehari (Kolkata) office? How many years of experience do you have in managing Linux-based servers? Work Location: In person
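One of the automation tasks above, rotating backups, reduces to "keep the newest N archives, delete the rest". The role expects Bash, but the same rotation logic is sketched here in Python for illustration, with hypothetical filenames that embed sortable ISO dates.

```python
# Backup rotation sketch: decide which archives to delete, keeping the
# newest `keep` of them (filenames are hypothetical).

def rotate(backups, keep=7):
    """Return the backup filenames that should be deleted."""
    return sorted(backups)[:-keep] if len(backups) > keep else []

names = [f"site-2025-07-{day:02d}.tar.gz" for day in range(1, 11)]  # 10 days
print(rotate(names, keep=7))  # the three oldest archives
```

Because the dates are zero-padded ISO format, lexicographic sort equals chronological sort, which is the same trick a Bash `ls | sort | head` pipeline relies on.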
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities: Architect, build, and maintain Kafka producers and consumers, including Kafka Streams and Connect pipelines for real-time streaming systems. Develop Java (Java 8+) microservices using Spring Boot, working with REST APIs and backend services. Operate and configure Kafka clusters: topics, partitions, brokers, ZooKeeper/KRaft, Schema Registry. Troubleshoot messaging issues, optimize throughput and latency, and ensure high availability and security. Work with SQL (e.g. MySQL/Postgres) and NoSQL databases (e.g. MongoDB) for backend integration. Integrate microservices and message pipelines within Docker, Kubernetes/ECS, and CI/CD environments.
Eligibility: 5-8 years of hands-on experience in Java-based application development. Strong expertise with Apache Kafka: producers, consumers, stream processing, configuration and administration. Solid experience building Spring Boot microservices and RESTful APIs. Proficiency in SQL and NoSQL databases (e.g. Oracle, MySQL, MongoDB). Familiarity with Kafka ecosystem components: Connect, Streams, Schema Registry. Linux/Unix proficiency and shell scripting experience. Experience with containerization (Docker) and orchestration tools (Kubernetes, ECS). Familiarity with CI/CD tools (Jenkins, GitLab CI, Azure DevOps). Strong communication, analytical, and problem-solving skills.
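A central idea behind the topics/partitions work above is that a message key deterministically selects a partition, so all events for one key stay ordered on one partition. Kafka's real default partitioner hashes keys with murmur2; the CRC32 stand-in below is only an illustration of that idea, not Kafka's actual algorithm.

```python
# Simplified keyed-partitioning illustration (not Kafka's murmur2
# partitioner): equal keys always map to the same partition.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Deterministically map a message key to a partition."""
    return zlib.crc32(key) % num_partitions

keys = [b"customer-1", b"customer-2", b"customer-1"]
parts = [partition_for(k, 6) for k in keys]
print(parts[0] == parts[2])  # equal keys -> same partition: True
```

This is also why increasing a topic's partition count after the fact breaks key-to-partition stability: the modulus changes.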
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Purview team is dedicated to protecting and governing the enterprise digital estate on a global scale. Our mission involves developing cloud solutions that offer premium features such as security, compliance, data governance, data loss prevention and insider risk management. These solutions are fully integrated across Office 365 services and clients, as well as Windows. We create global-scale services to transport, store, secure, and manage some of the most sensitive data on the planet, leveraging Azure, Exchange, and other cloud platforms, along with Office applications like Outlook. The IDC arm of our team is expanding significantly and seeks talented, highly motivated engineers. This is an excellent opportunity for those looking to build expertise in cloud distributed systems, security, and compliance. Our team will develop cloud solutions that meet the demands of a vast user base, utilizing state-of-the-art technologies to deliver comprehensive protection. Office 365, the industry leader in hosted productivity suites, is the fastest-growing business at Microsoft, with over 100 million seats hosted in multiple data centers worldwide. The Purview Engineering team provides leadership, direction, and accountability for application architecture, cloud design, infrastructure development, and end-to-end implementation. You will independently determine and develop architectural approaches and infrastructure solutions, conduct business reviews, and operate our production services. Strong collaboration skills are essential to work closely with other engineering teams, ensuring our services and systems are highly stable, performant, and meet the expectations of both internal and external customers and users. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. 
Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Responsibilities: Build cloud-scale services that process and analyze massive volumes of organizational signals in real time. Harness the power of Apache Spark for high-performance data processing and scalable pipelines. Apply machine learning to uncover subtle patterns and anomalies that signal insider threats. Craft intelligent user experiences using React and AI-driven insights to help security analysts act with confidence. Work with a modern tech stack and contribute to a product that's mission-critical for some of the world's largest organizations. Collaborate across disciplines, from data science to UX to cloud infrastructure, in a fast-paced, high-impact environment. Design and deliver end-to-end features including system architecture, coding, deployment, scalability, performance, and quality. Develop large-scale distributed software services and solutions that are modular, secure, reliable, diagnosable, and reusable. Conduct investigations and drive investments in complex technical areas to improve systems and services. Ensure engineering excellence by writing effective code, unit tests, debugging, code reviews, and building CI/CD pipelines. Troubleshoot and optimize Live Site operations, focusing on automation, reliability, and monitoring.
Qualifications - Required: Solid understanding of Object-Oriented Programming (OOP) and common design patterns. 4+ years of software development experience, with proficiency in C#, Java, or Scala. Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud; experience with Azure services is a plus. Familiarity with DevOps practices, CI/CD pipelines, and agile methodologies. Strong skills in distributed systems and data processing.
Excellent communication and collaboration abilities, with the capacity to handle ambiguity and prioritize effectively. A BS or MS degree in Computer Science or Engineering, or equivalent work experience.
Qualifications - Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft background and Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
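The insider-threat detection described in the responsibilities above rests on flagging signals that deviate sharply from a baseline. A toy z-score detector in plain Python shows the idea; the numbers are hypothetical and this is not Purview's actual detection logic.

```python
# Toy z-score anomaly detector (illustrative only): flag values whose
# distance from the mean exceeds `threshold` standard deviations.
import statistics

def zscore_anomalies(values, threshold=3.0):
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # no variation, nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

signals = [12, 11, 13, 12, 14, 11, 90]  # the last value is a spike
print(zscore_anomalies(signals, threshold=2.0))  # index of the spike: [6]
```

Production systems replace this with learned models precisely because a single global mean/stdev misses the subtle, per-entity patterns the posting mentions.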
Posted 1 week ago
0.0 - 4.0 years
0 - 0 Lacs
Sahibzada Ajit Singh Nagar, Mohali, Punjab
On-site
Job Title: Python Backend Developer (Data Layer) Location: Mohali, Punjab Company: RevClerx About RevClerx: RevClerx Pvt. Ltd., founded in 2017 and based in the Chandigarh/Mohali area (India), is a dynamic Information Technology firm providing comprehensive IT services with a strong focus on client-centric solutions. As a global provider, we cater to diverse business needs including website designing and development, digital marketing, lead generation services (including telemarketing and qualification), and appointment setting. Job Summary: We are seeking a skilled Python Backend Developer with a strong passion and proven expertise in database design and implementation. This role requires 3-4 years of backend development experience, focusing on building robust, scalable applications and APIs. The ideal candidate will not only be proficient in Python and common backend frameworks but will possess significant experience in designing, modeling, and optimizing various database solutions, including relational databases (like PostgreSQL) and, crucially, graph databases (specifically Neo4j). You will play a vital role in architecting the data layer of our applications, ensuring efficiency, scalability, and the ability to handle complex, interconnected data. Key Responsibilities: ● Design, develop, test, deploy, and maintain scalable and performant Python-based backend services and APIs. ● Lead the design and implementation of database schemas for relational (e.g., PostgreSQL) and NoSQL databases, with a strong emphasis on Graph Databases (Neo4j). ● Model complex data relationships and structures effectively, particularly leveraging graph data modeling principles where appropriate. ● Write efficient, optimized database queries (SQL, Cypher, potentially others). ● Develop and maintain data models, ensuring data integrity, consistency, and security. ● Optimize database performance through indexing strategies, query tuning, caching mechanisms, and schema adjustments. 
● Collaborate closely with product managers, frontend developers, and other stakeholders to understand data requirements and translate them into effective database designs. ● Implement data migration strategies and scripts as needed. ● Integrate various databases seamlessly with Python backend services using ORMs (like SQLAlchemy, Django ORM) or native drivers. ● Write unit and integration tests, particularly focusing on data access and manipulation logic. ● Contribute to architectural decisions, especially concerning data storage, retrieval, and processing. ● Stay current with best practices in database technologies, Python development, and backend systems. Minimum Qualifications: ● Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field, OR equivalent practical experience. ● 3-4 years of professional software development experience with a primary focus on Python backend development. ● Strong proficiency in Python and its standard libraries. ● Proven experience with at least one major Python web framework (e.g., Django, Flask, FastAPI). ● Demonstrable, hands-on experience designing, implementing, and managing relational databases (e.g., PostgreSQL). ● Experience with at least one NoSQL database (e.g., MongoDB, Redis, Cassandra). ● Solid understanding of data structures, algorithms, and object-oriented programming principles. ● Experience designing and consuming RESTful APIs. ● Proficiency with version control systems, particularly Git. ● Strong analytical and problem-solving skills, especially concerning data modeling and querying. ● Excellent communication and teamwork abilities. Preferred (Good-to-Have) Qualifications: ● Graph Database Expertise: ○ Significant, demonstrable experience designing and implementing solutions using Graph Databases (Neo4j strongly preferred). ○ Proficiency in graph query languages, particularly Cypher. 
○ Strong understanding of graph data modeling principles, use cases (e.g., recommendation engines, fraud detection, knowledge graphs, network analysis), and trade-offs. ● Advanced Database Skills: ○ Experience with database performance tuning and monitoring tools. ○ Experience with Object-Relational Mappers (ORMs) like SQLAlchemy or Django ORM in depth. ○ Experience implementing data migration strategies for large datasets. ● Cloud Experience: Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and their managed database services (e.g., RDS, Aurora, Neptune, DocumentDB, MemoryStore). ● Containerization & Orchestration: Experience with Docker and Kubernetes. ● Asynchronous Programming: Experience with Python's asyncio and async frameworks. ● Data Pipelines: Familiarity with ETL processes or data pipeline tools (e.g., Apache Airflow). ● Testing: Experience writing tests specifically for database interactions and data integrity. What We Offer: ● Challenging projects with opportunities to work on cutting-edge technologies especially in the field of AI. ● Competitive salary and comprehensive benefits package. ● Opportunities for professional development and learning (e.g., conferences, courses, certifications). ● A collaborative, innovative, and supportive work environment. How to Apply: Interested candidates are invited to submit their resume and a cover letter outlining their relevant experience, specifically highlighting their database design expertise (including relational, NoSQL, and especially Graph DB/Neo4j experience) Job Types: Full-time, Permanent Pay: ₹30,000.00 - ₹55,373.94 per month Benefits: Food provided Health insurance Schedule: Day shift Monday to Friday
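The indexing and query-tuning responsibilities above can be sketched with the standard library's sqlite3 standing in for PostgreSQL (the table and data are hypothetical). The point is simply that adding an index changes the query plan from a full table scan to an index seek, which is what `EXPLAIN` output confirms during tuning.

```python
# Indexing sketch with sqlite3 standing in for PostgreSQL (hypothetical
# schema/data). EXPLAIN QUERY PLAN confirms the lookup uses the index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
conn.executemany("INSERT INTO users (email) VALUES (?)",
                 [(f"user{i}@example.com",) for i in range(1000)])
conn.execute("CREATE UNIQUE INDEX idx_users_email ON users (email)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM users WHERE email = ?",
    ("user42@example.com",),
).fetchone()
print("INDEX" in plan[3])  # planner reports an index search: True
```

In PostgreSQL the equivalent check is `EXPLAIN ANALYZE`, and the same discipline applies to Cypher via Neo4j's `PROFILE`.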
Posted 1 week ago
4.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Data Engineer (Python)
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.
The opportunity: We are currently seeking a seasoned Data Engineer with strong experience in Python to join our team of professionals.
Key Responsibilities: Develop Data Lake tables leveraging AWS Glue and Spark for efficient data management. Implement data pipelines using Airflow, Kubernetes, and various AWS services.
Must-Have Skills: Experience in deploying and managing data warehouses. Advanced proficiency of at least 4 years in Python for data analysis and organization. Solid understanding of AWS cloud services. Proficient in using Apache Spark for large-scale data processing.
Skills and Qualifications Needed: Practical experience with Apache Airflow for workflow orchestration. Demonstrated ability in designing, building, and optimizing ETL processes, data pipelines, and data architectures. Flexible, self-motivated approach with strong commitment to problem resolution. Excellent written and oral communication skills, with the ability to deliver complex information in a clear and effective manner to a range of different audiences.
Willingness to work globally and across different cultures, and to participate in all stages of the data solution delivery lifecycle, including pre-studies, design, development, testing, deployment, and support. Nice to have: exposure to Apache Druid; familiarity with relational database systems.
Desired Work Experience: A degree in computer science or a similar field.
What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around. Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
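The ETL design work described in this role (stages an Airflow DAG would orchestrate) can be reduced to three plain-Python steps. The functions and data below are hypothetical, purely to show the extract-transform-load shape; in a real pipeline each step would be an Airflow task reading and writing Glue/Spark tables.

```python
# Minimal extract-transform-load sketch (hypothetical data).

def extract():
    # Stand-in for pulling raw rows from a source system.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "oops"}, {"id": 3, "amount": "7.25"}]

def transform(rows):
    # Cast amounts to float, dropping malformed rows.
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue
    return clean

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
print(load(transform(extract()), warehouse))  # rows loaded after cleaning: 2
```

Keeping each stage a pure function with explicit inputs and outputs is what makes the pipeline easy to test, retry, and slot into an orchestrator.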
Posted 1 week ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud / Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [10-15 years]
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions
Define and develop client-specific best practices around data management within a Hadoop or cloud environment
Recommend design alternatives for data ingestion, processing and provisioning layers
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS and GCP
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components
Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
Hands-on experience with major components like cloud ETLs, Spark and Databricks
Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks
Good knowledge of Apache Kafka and Apache Flume
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications
Experience in data security [in transit, at rest]
Strong UNIX operating system concepts and shell scripting knowledge

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communication skills, written and verbal, formal and informal
Ability to multi-task under pressure and work independently with minimal supervision
A team-player mindset, enjoying work in a cooperative and collaborative team environment
Adaptability to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment and support
Responsibility for evaluating technical risks and mapping out mitigation strategies
Working knowledge of at least one cloud platform: AWS, Azure or GCP
Excellent business communication, consulting and quality process skills
Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains
Minimum 7 years of hands-on experience in one or more of the above areas
Minimum 10 years of industry experience

Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
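The role above repeatedly calls for consuming real-time events and aggregating them per time window, which is what Spark Structured Streaming's windowed aggregations do. A minimal plain-Python sketch of tumbling-window grouping (the event data and window size are invented for illustration; this is the concept, not Spark's API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed, non-overlapping windows.

    Mirrors the bucketing a Spark Structured Streaming
    groupBy(window(...), key).count() performs, in plain Python.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # floor to window
        counts[(window_start, key)] += 1
    return dict(counts)

# Illustrative event stream: (epoch seconds, event type)
events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
result = tumbling_window_counts(events, window_secs=10)
# Two windows: [0, 10) and [10, 20)
```

A production pipeline adds watermarking for late events and checkpointing for fault tolerance, which the engine (Spark or Flink) handles; the sketch shows only the window-assignment step.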
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio
Join the team as our next Senior Machine Learning Engineer (L3) in our Comms Platform Engineering team.

About The Job
This position exists to scope, design, and deploy machine learning systems into the real world; the individual will partner closely with Product & Engineering teams to execute the roadmap for Twilio’s AI/ML products and services. Twilio is looking for a Senior Machine Learning Engineer to join the rapidly growing Comms Platform Engineering team of our Messaging business unit. You will understand the needs of our customers and build data products that solve those needs at a global scale. Working side by side with other engineering teams and product counterparts, you will own end-to-end execution of ML solutions. To thrive in this role, you must have a background in ML engineering and a track record of solving data and machine-learning problems at scale.
You are a self-starter, embody a growth mindset, and collaborate effectively across the entire Twilio organization.

Responsibilities
In this role, you’ll:
Build and maintain scalable machine learning solutions in production
Train and validate both deep learning-based and statistical models, considering use case, complexity, performance, and robustness
Demonstrate end-to-end understanding of applications and develop a deep understanding of the “why” behind our models and systems
Partner with product managers, tech leads, and stakeholders to analyze business problems, clarify requirements, and define the scope of the systems needed
Work closely with data platform teams to build robust, scalable batch and real-time data pipelines
Work closely with software engineers and build tools to enhance productivity and to ship and maintain ML models
Drive engineering best practices around code reviews, automated testing, and monitoring

Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn’t followed a traditional path, don’t let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required
5+ years of applied ML experience
Proficiency in Python is preferred; we will also consider strong quantitative candidates with a background in other programming languages
Strong background in the foundations of machine learning and the building blocks of modern deep learning
Track record of building, shipping, and maintaining machine learning models in production in an ambiguous and fast-paced environment.
A clear understanding of frameworks like PyTorch, TensorFlow, or Keras: why and how these frameworks do what they do
Familiarity with MLOps concepts for maintaining models in production, such as testing, retraining, and monitoring
Demonstrated ability to ramp up, understand, and operate effectively in new application/business domains
Exposure to modern data storage, messaging, and processing tools (Kafka, Apache Spark, Hadoop, Presto, DynamoDB, etc.)
Experience working in an agile team environment with changing priorities
Experience working on AWS

Desired
Experience with Large Language Models

Location
This role will be remote and based in India (only in Karnataka, Tamil Nadu, Maharashtra, Telangana and New Delhi).

Travel
We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may be required to travel occasionally to participate in project or team in-person meetings.

What We Offer
Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That’s why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you’re ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn’t what you’re looking for, please consider other open positions. Twilio is proud to be an equal opportunity employer.
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
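The qualifications above mention MLOps monitoring of models in production. One hedged sketch of the idea, flagging drift when a live feature's mean shifts away from the training baseline (real systems typically run richer per-feature tests such as PSI or KS; the threshold and numbers here are illustrative):

```python
import statistics

def drift_alert(baseline, live, z_threshold=3.0):
    """Flag drift when the live feature mean deviates from the training
    baseline by more than z_threshold standard errors.

    A deliberately simple stand-in for production model monitoring.
    """
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    se = sd / (len(live) ** 0.5)          # standard error of the live mean
    z = abs(statistics.mean(live) - mu) / se
    return z > z_threshold

# Training-time feature values (illustrative)
baseline = [10.0, 11.0, 9.0, 10.5, 9.5, 10.0, 11.5, 9.0]

stable = drift_alert(baseline, [10.2, 9.8, 10.4, 9.9])    # near baseline
shifted = drift_alert(baseline, [15.0, 14.5, 15.2, 14.8])  # clearly shifted
```

In practice an alert like this would trigger the retraining step the posting mentions, rather than silently degrading predictions.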
Posted 1 week ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud / Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [10-15 years]
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions
Define and develop client-specific best practices around data management within a Hadoop or cloud environment
Recommend design alternatives for data ingestion, processing and provisioning layers
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS and GCP
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components
Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
Hands-on experience with major components like cloud ETLs, Spark and Databricks
Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks
Good knowledge of Apache Kafka and Apache Flume
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications
Experience in data security [in transit, at rest]
Strong UNIX operating system concepts and shell scripting knowledge

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communication skills, written and verbal, formal and informal
Ability to multi-task under pressure and work independently with minimal supervision
A team-player mindset, enjoying work in a cooperative and collaborative team environment
Adaptability to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment and support
Responsibility for evaluating technical risks and mapping out mitigation strategies
Working knowledge of at least one cloud platform: AWS, Azure or GCP
Excellent business communication, consulting and quality process skills
Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains
Minimum 7 years of hands-on experience in one or more of the above areas
Minimum 10 years of industry experience

Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Responsibilities
· Experience with Spring Boot
· Experience with Microservices development
· Extensive experience working with Java REST APIs
· Extensive experience in Java 8-17 SE
· Experience with unit testing frameworks JUnit or Mockito
· Experience with Maven/Gradle
· Experience in Angular 16+ and RxJS; NgRx is mandatory, along with experience in unit testing using Jest/Jasmine
· Professional, precise communication skills
· Experience in API design, troubleshooting, and performance tuning
· Experience in designing and troubleshooting Java API services and microservices
· Experience in any CI/CD tool
· Experience in Apache Kafka will be an added advantage

Qualifications we seek in you!
Minimum Qualifications
· BE/B.Tech/M.Tech/MCA
· Excellent communication skills
· Good team player

Preferred Qualifications
· Experience with Spring Boot
· Experience with Microservices development
· Extensive experience working with Java REST APIs
· Extensive experience in Java 8-17 SE
· Experience with unit testing frameworks JUnit or Mockito
· Experience with Maven/Gradle
· Experience in Angular 13+ and RxJS; NgRx will be an added advantage
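The posting above asks for JUnit/Mockito-style unit testing. The same stub-and-verify discipline can be shown with Python's unittest.mock instead of Mockito; the service, client, and method names below are hypothetical, invented purely to illustrate the pattern:

```python
from unittest.mock import Mock

class RateService:
    """Illustrative service that wraps a REST client (hypothetical API)."""
    def __init__(self, client):
        self.client = client

    def converted(self, amount, pair):
        rate = self.client.get_rate(pair)  # would be a network call in production
        return round(amount * rate, 2)

# Stub the collaborator instead of hitting the network -- the same idea
# Mockito's when(...).thenReturn(...) expresses in Java.
client = Mock()
client.get_rate.return_value = 0.75

result = RateService(client).converted(100, "USD/XYZ")

# Verify the interaction, as Mockito's verify(...) would.
client.get_rate.assert_called_once_with("USD/XYZ")
```

Isolating the collaborator this way keeps the unit test fast and deterministic, which is the point of the Mockito requirement in the listing.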
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Java Full Stack Developer
Minimum Experience: 6+ years
Location: Pune

Required Skills:
Strong proficiency in ReactJS and TypeScript
Strong expertise in Java, Spring Framework, Spring Boot, and RESTful APIs
Experience with PostgreSQL and Apache Kafka development
Solid understanding of CI/CD pipelines, with hands-on experience using tools such as Chef, Jenkins, SonarQube, Checkmarx, Maven, and Gradle
Proficient in low-level system design
Skilled in code reviews and maintaining code quality
Ability to independently handle complex and challenging tasks
Excellent communication skills and team collaboration

Why Work at Apexon?
We care about your growth, health, and happiness. Here are some perks you’ll enjoy:
Health insurance (covers you and your family)
Paid leaves and holidays
Hybrid work culture
Career development and learning programs
Wellness support programs
Rapidly growing company: among the fastest-growing digital engineering firms
Tech-forward: cutting-edge work in AI, ML, automation, and cloud
Extensive learning and upskilling opportunities
Award-winning workplace: festivals, milestones and team celebrations; hackathons, wellness activities, R&R, employee spotlights

About Apexon
Apexon is a digital-first technology company helping businesses grow through innovation and smarter digital solutions. We work with clients at every step of their digital journey, using tools like AI, data, cloud, apps, and user experience design to create powerful digital products.
Visit: www.apexon.com | https://www.linkedin.com/company/apexon/
Posted 1 week ago
10.0 - 14.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Greetings from Infogain! We have an immediate requirement for a Big Data Engineer (Lead) position at Infogain India Pvt. Ltd. As a Big Data Engineer (Lead), you will be responsible for leading a team of big data engineers. You will work closely with clients and team members to understand their requirements and develop architectures that meet their needs. You will also be responsible for providing technical leadership and guidance to your team.

Mode of Hiring: Permanent
Skills: (Azure OR AWS) AND (Apache Spark OR Hive OR Hadoop) AND (Spark Streaming OR Apache Flink OR Kafka) AND NoSQL AND (Shell OR Python)
Experience: 10 to 14 years
Location: Bangalore/Noida/Gurgaon/Pune/Mumbai/Kochi
Notice period: Early joiner
Educational Qualification: BE/B.Tech/MCA/M.Tech

Working Experience
12-15 years of broad experience of working with enterprise IT applications in cloud platform and big data environments.

Competencies & Personal Traits
Work as a team player
Excellent problem analysis skills
Experience with at least one cloud infrastructure provider (Azure/AWS)
Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL)
Experience in building streaming data pipelines using Apache Spark Structured Streaming or Apache Flink on Kafka and Delta Lake
Knowledge of NoSQL databases; good to have experience in Cosmos DB, RESTful APIs and GraphQL
Knowledge of Big Data ETL processing tools, data modelling and data mapping
Experience with Hive and Hadoop file formats (Avro / Parquet / ORC)
Basic knowledge of scripting (shell / bash)
Experience of working with multiple data sources, including relational databases (SQL Server / Oracle / DB2 / Netezza), NoSQL / document databases, and flat files
Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo and Azure DevOps.
Basic understanding of DevOps practices using Git version control
Ability to debug, fine-tune and optimize large-scale data processing jobs

You can share your CV at arti.sharma@infogain.com with the following details:
Total experience
Relevant experience in Big Data
Relevant experience in AWS or Azure cloud
Current CTC
Expected CTC
Current location
Whether you are OK with the Bangalore location
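The experience list above mentions combining multiple data sources (relational databases, NoSQL stores, flat files), which in a Spark job usually reduces to a join. A plain-Python sketch of the hash-join idea behind Spark's broadcast joins, with invented sample data:

```python
def hash_join(left, right, key):
    """Inner-join two lists of records on `key`.

    Builds an in-memory index of the smaller side, the same shape as the
    broadcast hash join a Spark planner picks for small lookup tables.
    """
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    return [
        {**l, **r}          # merge matching left/right records
        for l in left
        for r in index.get(l[key], [])
    ]

# Illustrative sources: a fact table of orders and a small dimension table
customers = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ben"}]
orders = [{"id": 1, "total": 250.0}, {"id": 1, "total": 40.0}]
result = hash_join(orders, customers, key="id")
```

The engine-level version distributes this across executors and spills to disk when the build side is large; the sketch shows only why the small side is indexed first.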
Posted 1 week ago
6.0 years
0 Lacs
Sanganer, Rajasthan, India
On-site
Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market. Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious in business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don’t have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds. What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what’s possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, Analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Snowflake Data Engineering Lead , you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. 
In This Role, You Will
Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform and incorporating best practices for scalability, performance, security, and cost optimization
Assemble large, complex data sets that meet functional/non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Lead and mentor both onshore and offshore development teams, creating a collaborative environment
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
Develop ELT processes to ensure timely delivery of required data for customers
Implement data quality measures to ensure accuracy, consistency, and integrity of data
Design, implement, and maintain data models that can support the organization’s data storage and analysis needs
Deliver technical and functional specifications to support data governance and knowledge sharing

In This Role, You Will Have
Bachelor’s degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
6+ years of experience delivering consulting services to medium and large enterprises.
Implementations must have included a combination of the following experiences:
Data Warehousing or Big Data consulting for mid-to-large-sized organizations
3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
SnowPro Core certification is highly desired
Hands-on experience with Python (Pandas, DataFrames, functions)
Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
Strong experience with Apache Airflow and API integrations
Solid experience in any one of the ETL/ELT tools (DBT, Coalesce, WhereScape, MuleSoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
Nice to have: experience in Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it’s important we find the right position for you. It’s all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision! At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
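The data-quality responsibilities above echo the style of dbt's built-in not_null and unique tests. A small, illustrative Python version of those two checks (column names and sample rows are made up; dbt itself expresses these declaratively in YAML):

```python
def run_quality_checks(rows, not_null, unique):
    """Run dbt-style not_null and unique checks over extracted rows.

    Returns a list of human-readable failures; an empty list means all
    checks passed.
    """
    failures = []
    for col in not_null:
        if any(row.get(col) is None for row in rows):
            failures.append(f"not_null failed: {col}")
    for col in unique:
        values = [row.get(col) for row in rows]
        if len(values) != len(set(values)):
            failures.append(f"unique failed: {col}")
    return failures

# Illustrative extract with one null email and a duplicated order_id
rows = [
    {"order_id": 1, "email": "a@x.com"},
    {"order_id": 2, "email": None},
    {"order_id": 2, "email": "c@x.com"},
]
failures = run_quality_checks(rows, not_null=["email"], unique=["order_id"])
```

In a real pipeline these checks would gate the load step, failing the run (or quarantining rows) before bad data reaches downstream models.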
Posted 1 week ago
2.0 - 15.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Job Responsibilities
Conduct classroom training / virtual training
Develop teaching materials, including exercises and assignments
Design assessments for various proficiency levels in each competency
Enhance course material and course delivery based on feedback to improve training effectiveness
Gather feedback from stakeholders, identify actions based on feedback and implement changes
Program management and governance

Location: Mysore, Bangalore

Description of the Profile
We are looking for trainers with 2 to 15 years of teaching or IT experience and technology know-how in one or more of the following areas:
Java – Java programming, Spring, Spring Boot, Angular / React, Bootstrap
Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript
Data Science – Python for data science, machine learning, exploratory data analysis, statistics and probability
Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, SharePoint, etc.
MEAN / MERN stacks
SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S/4HANA
Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
Cloud & Infrastructure Management – Network administration / Database administration / Windows administration / Linux administration / Middleware administration / End User Computing / ServiceNow, cloud platforms like AWS / GCP / Azure / Oracle Cloud, Virtualization
DBMS – Oracle / SQL Server / MySQL / DB2 / NoSQL
Testing – Selenium, Micro Focus UFT, Micro Focus ALM tools, SOA testing, SoapUI, REST Assured, Appium
API and integration – API, Microservices, TIBCO, Apigee, Mule
Digital Commerce – Salesforce, Adobe Experience Manager
Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath

Training-related experience
Must have
Teaching experience: conducting training sessions in the classroom and dynamically responding to the different capabilities of learners; experience in analyzing feedback from sessions and identifying action areas for self-improvement
Developing teaching material: experience in gathering training needs, identifying learning objectives and designing training curricula; experience in developing teaching material, including exercises and assignments
Good presentation skills, excellent oral / written communication skills

Nice to have
Teaching experience: experience in delivering sessions over virtual classrooms
Program managing training: practical experience in addressing organizational training needs by leading a team of educators; setting goals, monitoring progress, evaluating performance, and communicating to stakeholders
Instructional design: developing engaging content
Designing assessments: experience in designing assessments to evaluate the effectiveness of training and gauge the proficiency of the learner
Participation in activities of the software development lifecycle, such as development, testing, configuration management and roll-out

Educational Qualification & Experience
Must have
Bachelor’s / Master’s degree in Engineering, or Master’s degree in Science / Computer Applications, with a consistently good academic record
2 to 15 years of relevant experience in training

Nice to have
Technology certification from any major certifying authority, such as Microsoft, Oracle, Google, Amazon, Scrum, etc.
Certification in teaching or eLearning content development
Posted 1 week ago
2.0 - 15.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Job Responsibilities
Conduct classroom and virtual training
Develop teaching materials, including exercises and assignments
Design assessments for various proficiency levels in each competency
Enhance course material and course delivery based on feedback to improve training effectiveness
Gather feedback from stakeholders, identify actions based on the feedback, and implement changes
Program management and governance

Location: Mysore, Bangalore

Description of the Profile
We are looking for trainers with 2 to 15 years of teaching or IT experience and technology know-how in one or more of the following areas:
Java – Java programming, Spring, Spring Boot, Angular / React, Bootstrap
Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript
Data Science – Python for data science, machine learning, exploratory data analysis, statistics & probability
Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, SharePoint, etc.
MEAN / MERN stacks
SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S/4HANA
Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
Cloud & Infrastructure Management – network administration / database administration / Windows administration / Linux administration / middleware administration / end-user computing / ServiceNow; cloud platforms like AWS / GCP / Azure / Oracle Cloud; virtualization
DBMS – Oracle / SQL Server / MySQL / DB2 / NoSQL
Testing – Selenium, Micro Focus UFT, Micro Focus ALM tools, SOA testing, SoapUI, REST Assured, Appium
API and integration – APIs, microservices, TIBCO, Apigee, Mule
Digital Commerce – Salesforce, Adobe Experience Manager
Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath

Training-related experience
Must have
Teaching experience: conducting training sessions in the classroom and dynamically responding to different capabilities of learners; experience in analyzing feedback from sessions and identifying action areas for self-improvement
Developing teaching material: experience in gathering training needs, identifying learning objectives, and designing training curricula; experience in developing teaching material, including exercises and assignments
Good presentation skills and excellent oral / written communication skills
Nice to have
Teaching experience: experience in delivering sessions over virtual classrooms
Program-managing training: practical experience in addressing organizational training needs by leading a team of educators; setting goals, monitoring progress, evaluating performance, and communicating with stakeholders
Instructional design: developing engaging content
Designing assessments: experience in designing assessments to evaluate the effectiveness of training and gauge the proficiency of the learner
Participation in software development lifecycle activities such as development, testing, configuration management, and roll-out

Educational Qualification & Experience
Must have
Bachelor's / Master's degree in Engineering, or Master's degree in Science / Computer Applications, with a consistently good academic record
2 to 15 years of relevant experience in training
Nice to have
Technology certification from a major certifying authority such as Microsoft, Oracle, Google, Amazon, or Scrum
Certification in teaching or eLearning content development
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
JD – Data Engineer
Pattern values data and the engineering required to take full advantage of it. As a Data Engineer at Pattern, you will work on business problems that have a huge impact on how the company maintains its competitive edge.

Essential Duties and Responsibilities
Develop, deploy, and support real-time, automated, scalable data streams from a variety of sources into the data lake or data warehouse.
Develop and implement data auditing strategies and processes to ensure data quality; identify and resolve problems associated with large-scale data processing workflows; implement technical solutions to maintain data pipeline processes and troubleshoot failures.
Collaborate with technology teams and partners to specify data requirements and provide access to data.
Tune application and query performance using profiling tools and SQL or other relevant query languages.
Understand business, operations, and analytics requirements for data.
Build data expertise and own data quality for assigned areas of ownership.
Work with data infrastructure to triage issues and drive them to resolution.

Required Qualifications
Bachelor's degree in Data Science, Data Analytics, Information Management, Computer Science, Information Technology, a related field, or equivalent professional experience
More than 4 years of overall experience
3+ years of experience working with SQL
3+ years of experience implementing modern-data-architecture-based data warehouses
2+ years of experience working with data warehouses such as Redshift, BigQuery, or Snowflake, with an understanding of data architecture design
Excellent software engineering and scripting knowledge
Strong communication skills (both in presentation and comprehension), along with an aptitude for thought leadership in data management and analytics
Expertise with data systems working with massive data sets from various data sources
Ability to lead a team of Data Engineers

Preferred Qualifications
Experience working with time-series databases
Advanced knowledge of SQL, including the ability to write stored procedures, triggers, and analytic/windowing functions, and to tune queries
Advanced knowledge of Snowflake, including the ability to write and orchestrate streams and tasks
Background in Big Data, non-relational databases, machine learning, and data mining
Experience with cloud-based technologies including SNS, SQS, SES, S3, Lambda, and Glue
Experience with modern data platforms like Redshift, Cassandra, DynamoDB, Apache Airflow, Spark, or Elasticsearch
Expertise in data quality and data governance

Our Core Values
Data Fanatics: Our edge is always found in the data
Partner Obsessed: We are obsessed with partner success
Team of Doers: We have a bias for action
Game Changers: We encourage innovation

About Pattern
Pattern is the premier partner for global e-commerce acceleration and is headquartered in Utah's Silicon Slopes tech hub, with offices in Asia, Australia, Europe, the Middle East, and North America. Valued at $2 billion, Pattern has been named one of the fastest-growing tech companies in North America by Deloitte and one of the best-led companies in America by Inc. More than 100 global brands, like Nestle, Sylvania, Kong, Panasonic, and Sorel, rely on Pattern's global e-commerce acceleration platform to scale their business around the world. We place employee experience at the center of our business model and have been recognized as one of America's Most Loved Workplaces®. https://pattern.com/
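The "analytic/windowing functions" called out under Preferred Qualifications can be illustrated with a small, self-contained sketch. The table, columns, and values below are hypothetical, and the standard-library sqlite3 driver (SQLite 3.25+ supports window functions) stands in for a production warehouse like Snowflake or Redshift:

```python
import sqlite3

# In-memory toy table: one row per sale, grouped by region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 300), ("west", 200)])

# RANK() OVER a per-region partition: rank each sale within its
# region by amount, highest first.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)  # [('east', 300, 1), ('east', 100, 2), ('west', 200, 1)]
```

The same `PARTITION BY ... ORDER BY` shape carries over to the warehouse engines named in the posting.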
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
Pune
Work from Office
APPLICATION DEVELOPER
More than 4 years of experience in the Discovery / ITOM domain on the ServiceNow platform.
Very good implementation knowledge of ServiceNow Discovery, Service Mapping, Certificate Management, and AIOps.
Good understanding of, and hands-on experience with, AWS and Azure cloud.
Good to have knowledge and working experience of other ITOM modules such as Event Management and Cloud Management.
Good configuration / troubleshooting skills on IT infrastructure (server, network, storage, cloud), with experience as a system / platform administrator.
Good knowledge of middleware solutions such as IIS, WebLogic, WebSphere, Apache Tomcat, etc.
Worked in a datacenter environment as L3 / L4 support.
Technical knowledge of the following areas: Java, HTML, JavaScript, LDAP / Active Directory.
Working knowledge of configuration management, CMDB, methods, and processes.
Excellent verbal and written communication skills, including polished presentation skills, with the ability to explain technical issues to both technical and non-technical audiences in a clear and understandable manner.
Excellent customer service, leadership, communication, problem-solving, and decision-making skills.
Nice to have: basic knowledge and awareness of various other modules such as ITBM and Security.
Must have a flair for research and innovation, and must be passionate about working on innovation topics.
Posted 1 week ago
0.0 - 3.0 years
0 - 0 Lacs
Chandigarh, Chandigarh
On-site
Preferred candidate profile
We are looking for a Linux Administrator with at least 3 years of experience who will be responsible for the installation and configuration of web servers and database servers. The ideal candidate should have knowledge of website deployment, will be responsible for designing, implementing, and monitoring the infrastructure, and should know Docker and CI/CD.
1. In-depth knowledge of Linux: RedHat, CentOS, Debian, etc.
2. Solid knowledge of installation and configuration of web servers (Nginx or Apache) and database servers (MySQL, PostgreSQL, MongoDB).
3. Knowledge of cloud services like AWS, Azure, and DigitalOcean.
4. Knowledge of networking: switches, routers, firewalls.
5. Knowledge of Docker, CI/CD, and Terraform.
6. Knowledge of deploying websites written in different languages, such as PHP, NodeJS, and Python, on production servers.
7. Experience in deployment of web servers, PHP, Node, and Python.
Job Type: Full-time
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Food provided Health insurance Life insurance Paid sick time Paid time off
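One routine task implied by "monitoring the infrastructure" is verifying that deployed services are actually listening. A minimal standard-library sketch, where the service names and ports are only the conventional defaults and not specific to any real deployment:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds --
    the kind of post-deployment check an admin scripts for a
    web server or database server."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative check list (conventional default ports, assumed here):
services = {"nginx": 80, "postgresql": 5432, "mongodb": 27017}
for name, port in services.items():
    print(name, "up" if port_open("127.0.0.1", port) else "down")
```

A real monitoring setup would run such checks on a schedule (cron, systemd timer, or a CI/CD smoke-test stage) and alert on failures.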
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Immediate #HIRING for a highly motivated and experienced GCP Data Engineer to join our growing team. We're a leading software company specializing in Artificial Intelligence, Machine Learning, Data Analytics, innovative data solutions, and cloud-based technologies. If you're passionate about building robust applications and thrive in a dynamic environment, please share your resume at rizwana@randomtrees.com.
Job Title: GCP Data Engineer
Experience: 4 - 8 years
Notice: Immediate
Location: Hyderabad / Chennai - Hybrid mode
Job Type: Full-time employment
Job Description:
We are looking for an experienced GCP Data Engineer to design, develop, and optimize data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate should have hands-on experience with BigQuery, DataFlow, PySpark, GCS, and Airflow (Cloud Composer), along with strong expertise or knowledge of DBT.
Key Responsibilities:
Design and develop scalable ETL/ELT data pipelines using DataFlow (Apache Beam), PySpark, and Airflow (Cloud Composer).
Work extensively with BigQuery for data transformation, storage, and analytics.
Implement data ingestion, processing, and transformation workflows using GCP-native services.
Optimize and troubleshoot performance issues in BigQuery and DataFlow pipelines.
Manage data storage and governance using Google Cloud Storage (GCS) and other GCP services.
Ensure data quality, security, and compliance with industry standards.
Work closely with data scientists, analysts, and business teams to provide data solutions.
Automate workflows, monitor jobs, and improve pipeline efficiency.
Required Skills:
✔ Google Cloud Platform (GCP) data engineering (GCP DE certification preferred); DBT knowledge or experience is mandatory
✔ BigQuery – data modeling, query optimization, and performance tuning
✔ PySpark – data processing and transformation
✔ GCS (Google Cloud Storage) – data storage and management
✔ Airflow / Cloud Composer – workflow orchestration and scheduling
✔ SQL & Python – strong hands-on experience
✔ Experience with CI/CD pipelines, Terraform, or Infrastructure as Code (IaC) is a plus
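As a rough illustration of the ETL/ELT transform logic such pipelines compose, here is a dependency-free Python sketch. Apache Beam / PySpark are not assumed installed; the record format, field names, and steps are invented for the example. In a real DataFlow job, the parse/filter/aggregate stages below would be Beam ParDo and GroupByKey transforms:

```python
from collections import defaultdict

def parse(record: str) -> dict:
    """Extract: turn a raw 'user,amount' line into a typed row."""
    user, amount = record.split(",")
    return {"user": user, "amount": int(amount)}

def valid(row: dict) -> bool:
    """Filter: drop refunds / bad rows (hypothetical rule)."""
    return row["amount"] > 0

def aggregate(rows) -> dict:
    """Aggregate: total amount per user."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["user"]] += row["amount"]
    return dict(totals)

raw = ["alice,10", "bob,-5", "alice,7"]   # stand-in for a GCS input
result = aggregate(r for r in map(parse, raw) if valid(r))
print(result)  # {'alice': 17}
```

Keeping each stage a small pure function like this is what makes the pipeline unit-testable locally before it is deployed to DataFlow behind an Airflow schedule.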
Posted 1 week ago
4.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
We are seeking a skilled Golang Developer with 4+ years of hands-on software development experience. The ideal candidate will possess strong Go programming capabilities, deep knowledge of Linux internals, and experience working with service-oriented and microservice architectures.
Key Responsibilities:
4+ years of software development experience
Good Go implementation capabilities
Understanding of different design principles
Good understanding of the Linux OS: memory, instruction processing, filesystem, system daemons, etc.
Fluency with the Linux command line and shell scripting
Working knowledge of servers (nginx, Apache, etc.), proxy servers, and load balancing
Understanding of service-based architecture and microservices
Knowledge of AV codecs, MPEG-TS, and adaptive streaming such as DASH and HLS
Good understanding of computer networking concepts
Working knowledge of relational databases
Good analytical and debugging skills
Knowledge of Git or any other source code management tool
Good to Have Skills:
Working knowledge of Core Java and Python is preferred
Exposure to cloud computing is preferred
Exposure to API or video-streaming performance testing is preferred
Experience with Elasticsearch and Kibana (ELK stack) is preferred
Proficiency in at least one modern web front-end framework, such as React JS, is a bonus
Experience with messaging systems like RabbitMQ is preferred
Posted 1 week ago
175.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role?
Expertise in handling large volumes of data coming from many different disparate systems
Expertise in Core Java, multithreading, backend processing, and transforming large data volumes
Working knowledge of Apache Flink, Apache Airflow, Apache Beam, and other open-source data processing platforms
Working knowledge of cloud platforms like GCP
Working knowledge of databases and performance tuning for complex big-data scenarios, including SingleStore DB and in-memory processing
Cloud deployments, CI/CD, and platform resiliency
Good experience with MVEL
Excellent communication skills, a collaboration mindset, and the ability to work through unknowns
Work with key stakeholders to drive data solutions that align to strategic roadmaps, prioritized initiatives, and strategic technology directions.
Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management, and cost effectiveness.
Minimum Qualifications:
Bachelor's degree in Computer Science, Computer Science Engineering, or a related field is required.
3+ years of large-scale technology engineering and formal management in a complex environment, and/or comparable experience.
To be successful in this role you will need to be good in Java, Flink, SQL, Kafka, and GCP.
Successful engineering and deployment of enterprise-grade technology products in an Agile environment.
Large-scale software product engineering experience with contemporary tools and delivery methods (i.e., DevOps, CI/CD, Agile, etc.).
3+ years' hands-on engineering experience in Java and the data/distributed ecosystem.
Ability to see the big picture while giving attention to critical details.
Preferred Qualifications:
Knowledge of Kafka and Spark
Finance domain knowledge
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
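The role centers on multithreaded transformation of large record batches (in Java/Flink there). The same partition-and-pool pattern can be sketched with Python's standard library; the doubling transform and chunk size are placeholders for whatever mapping the real job applies:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(chunk: list) -> list:
    """Stand-in for the real per-record mapping (here: double each value)."""
    return [x * 2 for x in chunk]

# Partition the input into fixed-size chunks, then transform the chunks
# concurrently in a worker pool; pool.map preserves input order.
data = list(range(10))
chunks = [data[i:i + 4] for i in range(0, len(data), 4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    out = [x for part in pool.map(transform, chunks) for x in part]
print(out)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

In a Flink job the equivalent parallelism comes from operator instances over a keyed or partitioned stream rather than an explicit thread pool, but the chunk/transform/merge shape is the same.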
Posted 1 week ago