2.0 years
3 - 4 Lacs
Ahmedabad
On-site
About the Role: We are looking for a skilled and detail-oriented QA Engineer with over 2 years of experience in manual, automation, performance, and security testing. You will work closely with developers, product managers, and DevOps teams to ensure high-quality, secure, and scalable software products. This role is ideal for someone who is passionate about software quality and eager to take ownership of test planning and execution across functional and non-functional requirements.
Key Responsibilities:
- Design and execute test cases for functional, regression, and integration testing.
- Develop and maintain automated test scripts using tools such as Selenium/TestNG.
- Conduct performance testing using tools like JMeter, LoadRunner, or similar.
- Perform basic security testing (e.g., input validation, authentication/authorization checks, session handling).
- Validate REST APIs and backend logic using tools such as Postman or Swagger.
- Document defects clearly and follow up with the development team until resolution.
- Analyze test results, identify patterns, and suggest improvements for stability and performance.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or equivalent.
- 2+ years of experience in Quality Assurance, with exposure to both manual and automated testing.
- Hands-on experience with performance testing tools such as Apache JMeter, BlazeMeter, or LoadRunner.
- Familiarity with security testing concepts, the OWASP Top 10, and tools like Burp Suite (basic level).
- Proficiency with bug tracking tools (e.g., Jira).
- Understanding of API testing using Postman or similar tools.
- Basic understanding of SQL and database testing.
- Strong problem-solving, documentation, and communication skills.
Job Type: Full-time
Pay: ₹30,000.00 - ₹40,000.00 per month
Benefits: Health insurance, leave encashment, paid sick time, paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Experience: Performance testing: 2 years (Required)
Location: Ahmedabad, Gujarat (Required)
Work Location: In person
Speak with the employer: +91 8160197141
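The responsibilities above emphasise data-driven automated testing (the pattern TestNG expresses with @DataProvider). A minimal, language-agnostic sketch of that pattern in Python; the system under test and all names here are illustrative, not from the posting:

```python
# Data-driven test pattern: one test routine runs against a table of
# input rows, analogous to a TestNG @DataProvider (illustrative sketch).

def validate_login_input(username: str, password: str) -> bool:
    """Toy system under test: basic input validation."""
    return bool(username.strip()) and len(password) >= 8

# Each row is (username, password, expected_result) -- the "data provider".
TEST_DATA = [
    ("alice", "s3cretpass", True),
    ("", "s3cretpass", False),   # empty username must be rejected
    ("bob", "short", False),     # password below minimum length
]

def run_data_driven_tests():
    """Run every data row through the same test logic; return (case, passed) pairs."""
    results = []
    for username, password, expected in TEST_DATA:
        actual = validate_login_input(username, password)
        results.append((username, actual == expected))
    return results

if __name__ == "__main__":
    for name, passed in run_data_driven_tests():
        print(f"case {name!r}: {'PASS' if passed else 'FAIL'}")
```

The same separation (test logic vs. data table) is what keeps Selenium/TestNG suites maintainable as the case count grows.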
Posted 4 days ago
7.0 - 10.0 years
0 Lacs
Noida
On-site
About Aeris: For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today’s connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth. Built from the ground up for IoT and road-tested at scale, Aeris IoT Services are based on the broadest technology stack in the industry, spanning connectivity up to vertical solutions. As veterans of the industry, we know that implementing an IoT solution can be complex, and we pride ourselves on making it simpler. Our company is in an enviable spot. We’re profitable, and both our bottom line and our global reach are growing rapidly. We’re playing in an exploding market where technology evolves daily and new IoT solutions and platforms are being created at a fast pace.
A few things to know about us:
- We put our customers first. When making decisions, we always seek to do what is right for our customer first, our company second, our teams third, and our individual selves last.
- We do things differently. As a pioneer in a highly competitive industry that is poised to reshape every sector of the global economy, we cannot fall back on old models. Rather, we must chart our own path and strive to out-innovate, out-learn, out-maneuver and out-pace the competition along the way.
- We walk the walk on diversity. We’re a brilliant and eclectic mix of ethnicities, religions, industry experiences, sexual orientations, generations and more – and that’s by design. We see diverse perspectives as a core competitive advantage.
- Integrity is essential.
We believe in doing things well – and doing them right. Integrity is a core value here: you’ll see it embodied in our staff, our management approach and our growing social impact work (we have a VP devoted to it). You’ll also see it embodied in the way we manage people and HR issues: we expect employees and managers to deal with issues directly, immediately and with the utmost respect for each other and for the Company.
- We are owners. Strong managers enable and empower their teams to figure out how to solve problems. You will be no exception, and will have the ownership, accountability and autonomy needed to be truly creative.
The successful candidate will be responsible for the design and development of scalable software, code refactoring and maintaining the high quality of the core modules. The candidate should be well versed with cloud-based architectures, frameworks and APIs on both the front end and back end, should be able to work as an individual contributor as well as manage a small team of 3-5 developers, and should have hands-on working experience with the GCP platform and its managed services such as Kubernetes, Cloud Functions, Pub/Sub, Cloud SQL, etc.
Job Responsibilities:
- Work with development teams, architects and product managers to ideate software solutions
- Design small to moderate server-side modules
- Develop and manage well-functioning databases and applications
- Write effective data-centric Java-based APIs
- Ensure responsiveness and efficiency of server-side modules
- Articulate the solution approach and produce the documentation
- Support customer issues
- Communicate with and help the team to achieve its goals
Competence Requirements
Must Have:
- Candidate should have developed microservices on at least one cloud platform; GCP is preferred.
- Candidate should be above average, with sound experience working with MySQL or other relational DBs, and should be able to write complex but optimized queries
- Sound working experience with the Java Spring Boot framework
- Sound working experience with the following tools: Jenkins, Docker (including debugging via pods), BigQuery
- Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
- Knowledge of multiple back-end languages (Java, Python) and JavaScript frameworks (e.g. Angular, React, Node.js)
- Experience with a big-data stack such as Kafka, Apache Beam or Pub/Sub
- Experience with Docker and Kubernetes
- Working knowledge of RDBMS and NoSQL
- Excellent communication and teamwork skills
Good to Have:
- Experience with GitHub Copilot will be an added advantage
- Great attention to detail
- Analytical mindset and organizational skills
EXPERIENCE: 7 to 10 years
Qualified as B.E / B.Tech / M.E / M.Tech / M.C.A / PGDCA in computer-related subjects.
Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process.
Aeris walks the walk on diversity. We’re a brilliant mix of varying ethnicities, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that’s by design. Diverse perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
Posted 4 days ago
10.0 years
0 Lacs
India
On-site
Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.
Data Engineer
Locations: Kochi/Chennai/Coimbatore/Mumbai/Pune/Hyderabad
Job Overview: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will have deep expertise in Azure Databricks and Python, and experience building scalable data pipelines. Familiarity with Data Fabric architectures is a plus. You’ll work closely with data scientists, analysts, and business stakeholders to deliver robust data solutions that drive insights and innovation.
Key Responsibilities:
- Design, build, and maintain large-scale, distributed data pipelines using Azure Databricks and PySpark.
- Design, build, and maintain large-scale, distributed data pipelines using Azure Data Factory.
- Develop and optimize data workflows and ETL processes in Azure Cloud environments.
- Write clean, maintainable, and efficient code in Python for data engineering tasks.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Monitor and troubleshoot data pipelines for performance and reliability issues.
- Implement data quality checks and validations, and ensure data lineage and governance.
- Contribute to the design and implementation of a Data Fabric architecture (desirable).
Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5–10 years of experience in data engineering or related roles.
- Expertise in Azure Databricks, Delta Lake, and Spark.
- Strong proficiency in Python, especially in a data processing context.
- Experience with Azure Data Lake, Azure Data Factory, and related Azure services.
- Hands-on experience building data ingestion and transformation pipelines.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
Good To Have:
- Experience with or understanding of Data Fabric concepts (e.g., data virtualization, unified data access, metadata-driven architectures).
- Knowledge of modern data warehousing and lakehouse principles.
- Exposure to tools like Apache Airflow, dbt, or similar.
- Experience working in agile/scrum environments.
- DP-500 and DP-600 certifications.
What We Offer:
- Competitive salary and performance-based bonuses.
- Flexible work arrangements.
- Opportunities for continuous learning and career growth.
- A collaborative, inclusive, and innovative work culture.
www.orioninc.com
Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Candidate Privacy Policy: Orion Systems Integrators, LLC and its subsidiaries and affiliates (collectively, “Orion,” “we” or “us”) are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) (“Notice”) explains: what information we collect during our application and recruitment process and why we collect it; how we handle that information; and how to access and update that information. Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.
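One responsibility above is implementing data quality checks and validations in a pipeline stage. A minimal sketch of that idea in pure Python (in Databricks the same rules would typically run over a Spark DataFrame; field names here are hypothetical):

```python
# Data-quality gate for a pipeline stage: split incoming rows into
# valid rows and rejected rows with a reason, so bad records can be
# quarantined instead of silently propagating downstream.

def check_rows(rows, required_fields, non_negative_fields=()):
    """Return (valid_rows, rejected_rows) where each rejection carries its reason."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        negative = [f for f in non_negative_fields
                    if isinstance(row.get(f), (int, float)) and row[f] < 0]
        if missing or negative:
            rejected.append((row, {"missing": missing, "negative": negative}))
        else:
            valid.append(row)
    return valid, rejected

if __name__ == "__main__":
    rows = [
        {"id": 1, "amount": 10},
        {"id": None, "amount": 5},   # fails required-field check
        {"id": 3, "amount": -2},     # fails non-negative check
    ]
    valid, rejected = check_rows(rows, ["id"], non_negative_fields=["amount"])
    print(len(valid), "valid,", len(rejected), "rejected")
```

Keeping the rejection reason alongside the row is what makes lineage and governance reporting possible later in the pipeline.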
Posted 4 days ago
0 years
0 Lacs
India
Remote
Company Description: Intellect IT is based on a "100% onshore (United States)" model, providing all service offerings within the United States, either onsite at customer locations or at our development center in California. For more details, please check out our company's website: www.intellectit.com.
Role Description: This is a contract remote role for a Java Developer specializing in Google Cloud Platform (GCP) and Apache Beam. The Java Developer will be responsible for software development tasks, including the design, implementation, and maintenance of complex systems. Day-to-day tasks include developing microservices, programming using the Spring Framework, and ensuring system performance and reliability.
Qualifications:
- Proficiency in software development and programming
- Experience with Java and the Spring Framework
- Knowledge of microservices architecture
- Excellent problem-solving and analytical skills
- Good communication and teamwork abilities
- Experience with cloud platforms, particularly Google Cloud Platform, is a plus
- Bachelor's degree in Computer Science, Engineering, or related field
Posted 4 days ago
0 years
7 - 9 Lacs
Noida
On-site
City/Cities: Noida
Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 29-Jul-2025
Job ID: 10373
Description and Requirements:
- Must have knowledge of Java, OOPs concepts, Selenium WebDriver, Maven, TestNG, Page Factory/POM, Extent reporting, Apache POI and error handling.
- Must have working experience in an agile model.
- Good knowledge of Azure DevOps (Boards, Repos, Test Plans).
- Good to have knowledge of CI/CD pipeline creation for automation scripts in Azure DevOps.
- Should be able to write test scenarios and test cases from an automation perspective.
- Must have hands-on experience with functional, integration, system, UAT and regression testing.
- Must have a good understanding of different automation and testing frameworks (data-driven with TestNG is preferable).
- Must have a good understanding of the defect life cycle.
- Should have a good understanding of SDLC and STLC.
- Should be able to set up reporting with Selenium WebDriver (Extent report is preferable).
- Should be able to lead a team.
- Knowledge of the defect management process.
- Good verbal and written communication.
- Experience in client communication.
About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
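The requirements above name the Page Factory/POM (Page Object Model) pattern. A minimal sketch of POM in Python: a real implementation would wrap a Selenium WebDriver, so a FakeDriver that only records actions stands in here, and all locators and class names are illustrative:

```python
# Page Object Model sketch: locators and user actions live in a page
# class, not in the tests. FakeDriver records actions so the pattern
# is visible without a real browser.

class FakeDriver:
    def __init__(self):
        self.actions = []
    def find(self, locator):
        self.actions.append(("find", locator))
        return self          # chainable, like a located element
    def type(self, text):
        self.actions.append(("type", text))
    def click(self):
        self.actions.append(("click",))

class LoginPage:
    """Page object: one place to change when the UI's locators change."""
    USERNAME = "id=username"
    PASSWORD = "id=password"
    SUBMIT = "id=submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find(self.USERNAME).type(user)
        self.driver.find(self.PASSWORD).type(password)
        self.driver.find(self.SUBMIT).click()

if __name__ == "__main__":
    driver = FakeDriver()
    LoginPage(driver).login("qa_user", "secret")
    print(driver.actions)
```

Tests call `LoginPage.login(...)` and never touch locators directly, which is the maintainability benefit the pattern is meant to deliver.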
Posted 4 days ago
3.0 years
1 - 5 Lacs
Calcutta
On-site
Software Developer (ASP.NET) – Detailed Role Description
Job brief: We are looking for a Senior Software Developer specializing in .NET to build software using languages and technologies of the .NET framework. You should be a pro with third-party API integration and application programming. In this role, you should be able to write smooth, functional code with a sharp eye for spotting defects. You should be a team player and an excellent communicator. If you are also passionate about the .NET framework and software design/architecture, we’d like to meet you. Your goal will be to work with internal teams to design, develop and maintain software.
Responsibilities:
- Participate in requirements analysis.
- Work in a development team to develop integrated ASP.NET applications.
- Write clean, scalable code using the ASP.NET Framework, C#, and REST APIs.
- Write SQL Server queries and normalize SQL table structures.
- Revise, update, refactor and debug code.
- Improve existing software.
- Develop documentation throughout the software development life cycle (SDLC).
- Serve as an expert on applications and provide technical support.
Requirements:
- At least 3 years of software development using ASP.NET, MVC, C#, web application forms, and API integrations.
- Hands-on experience with SQL Server and design/architectural patterns for .NET Framework web applications.
- Experienced in Bootstrap, jQuery, HTML, CSS3 and XML.
- Experienced with architecture styles/APIs (REST, Web API, JSON).
- Excellent troubleshooting skills.
- Excellent English communication skills to be able to work with a global team.
(This is mandatory) BSc/BTech/BCA in Computer Science, Engineering, or a related field.
Must-have skill set:
- ASP.NET (Core)
- C#
- SQL/NoSQL (Microsoft SQL, PostgreSQL, SQLite etc.)
- Modern frontend frameworks (Blazor, React etc.)
- Third-party SOAP and REST API integrations
- HTML & CSS
- JavaScript
- jQuery
- Bootstrap
- Knowledge of standard unit-testing and CI tools such as Jenkins
Good-to-have skill set:
- .NET MVC
- .NET MAUI (Xamarin)
- Experience with CRM development
- Experience in the ISP, telephony and MSP industries
- Experience with Apache HTTP & Nginx
- Experience with Debian & Debian-based Linux server distributions (e.g. Ubuntu)
Other Details:
Shift Timings: 1:15 to 10:30pm, Monday to Friday; 1:15 to 6:30pm on alternate Saturdays
Work Mode: Full-time & onsite. Drop facilities provided.
Medical insurance cover for you and your family
Free café facilities
Our Brands: https://v4consumer.co.uk https://v4one.co.uk
Posted 4 days ago
8.0 years
0 Lacs
India
Remote
Are you seeking Backend expertise? Here's what you should look for:
- Proficiency in Java 11 or higher
- Experience with Spring Boot 2.7 and above
- Strong foundation in Oracle DB fundamentals
- Skills in JDBC, Hibernate, HQL, and SQL
- Familiarity with data formats: JSON, XML, YAML
- Knowledge of web technologies like REST APIs and JSON
- Proficiency in version control tools such as Git, Bitbucket, SourceTree, and Git Bash
- Experience with JIRA, Confluence, and Jenkins CI/CD tools
- Familiarity with Maven, Gradle, Tomcat, and Postman
- Comfort with Eclipse IDE and IntelliJ IDEA
- Optional expertise in messaging protocols like AMQP 0-9-1, AMQP 1.0
- Optional knowledge of message brokers: RabbitMQ, Apache ActiveMQ, Azure Service Bus
- Optional understanding of integration patterns: Publisher/Subscriber, Point-to-Point
- Optional experience with enterprise messaging: message queuing, event-driven architecture
Requirements:
- Minimum of 8 years of experience
- Fully remote position
- Coding interview is mandatory
- Installation of IntelliJ on the laptop is required
Ready to elevate your Backend skills? Please send your resume to aditi.duvvri@appitsoftware.com
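The listing distinguishes the Publisher/Subscriber and Point-to-Point integration patterns. A minimal sketch of the routing difference, using in-memory queues rather than a real broker such as RabbitMQ or ActiveMQ (class names are illustrative):

```python
# Publisher/Subscriber vs Point-to-Point, sketched with in-memory
# deques. Only the routing semantics are shown, not delivery
# guarantees, acknowledgements, or persistence.

from collections import deque

class PubSubBroker:
    """Fan-out: every subscriber receives its own copy of each message."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self):
        q = deque()
        self.subscribers.append(q)
        return q
    def publish(self, msg):
        for q in self.subscribers:
            q.append(msg)

class PointToPointQueue:
    """Competing consumers: each message is consumed exactly once."""
    def __init__(self):
        self.q = deque()
    def send(self, msg):
        self.q.append(msg)
    def receive(self):
        return self.q.popleft() if self.q else None

if __name__ == "__main__":
    broker = PubSubBroker()
    a, b = broker.subscribe(), broker.subscribe()
    broker.publish("order-created")      # both a and b get a copy
    ptp = PointToPointQueue()
    ptp.send("job-1")                    # only the first receive() gets it
    print(list(a), list(b), ptp.receive())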
Posted 4 days ago
15.0 years
0 Lacs
Calcutta
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Apache Spark
Good to have skills: Java, Scala, PySpark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing requirements, proposing solutions, and ensuring that the data platform aligns with organizational goals and standards. Your role will require you to stay updated with industry trends and best practices to contribute effectively to the team.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay abreast of emerging technologies and methodologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-To-Have Skills: Experience with Java, Scala, PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration tools and techniques.
- Familiarity with cloud platforms and services related to data engineering.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.
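The core skill above is Apache Spark's distributed processing model. As a conceptual illustration only (pure Python, not PySpark), the partitioned map/combine/merge shape of a Spark-style word count can be sketched like this:

```python
# Conceptual sketch of partitioned word counting: each partition is
# processed locally (like a map-side combine), then partial results
# are merged (like the shuffle/reduce step). PySpark would express
# this as rdd.flatMap(...).map(...).reduceByKey(...).

from collections import Counter
from functools import reduce

def map_partition(lines):
    """Per-partition work: count words locally, no cross-partition traffic."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def word_count(partitions):
    """Merge the per-partition counters into a global result."""
    partials = (map_partition(p) for p in partitions)
    return reduce(lambda a, b: a + b, partials, Counter())

if __name__ == "__main__":
    partitions = [["a b a"], ["b c"]]
    print(word_count(partitions))
```

The local-combine-then-merge structure is why Spark scales: most work happens inside a partition, and only compact partial aggregates cross the network.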
Posted 4 days ago
2.0 years
3 - 6 Lacs
Jaipur
On-site
Job Overview: We are seeking a skilled and passionate Flutter Developer who also has strong backend development experience in Laravel, PHP, and CodeIgniter. The ideal candidate will be responsible for building and maintaining cross-platform mobile applications, managing backend services, handling API integrations, and deploying applications on web hosting platforms and VPS servers.
Key Responsibilities:
- Develop robust, scalable, and high-performing mobile apps using Flutter for Android and iOS.
- Collaborate with frontend and backend teams to integrate RESTful APIs.
- Maintain and enhance backend systems using Laravel, PHP, and CodeIgniter.
- Ensure smooth deployment and hosting of applications on web hosting services and VPS environments.
- Troubleshoot, debug, and upgrade software as needed.
- Ensure security and data protection measures are implemented properly.
Required Skills & Qualifications:
- Strong hands-on experience with Flutter and Dart.
- Proficient in Laravel, PHP, and CodeIgniter for backend development.
- Solid knowledge of API integration, including RESTful APIs and third-party services.
- Experience with web hosting & VPS deployment (e.g., cPanel, Plesk, Apache/Nginx, DigitalOcean, AWS EC2, etc.).
- Knowledge of database management (MySQL, Firebase, etc.).
- Familiarity with Git or any version control system.
- Good understanding of mobile design principles and user experience (UX) guidelines.
Preferred Qualifications:
- Experience with Firebase, push notifications, and in-app purchases.
- Understanding of state management in Flutter (e.g., Provider, BLoC).
- Exposure to CI/CD pipelines for mobile apps.
- Familiarity with server security practices and SSL setup.
Job Type: Full-time
Pay: ₹25,000.00 - ₹50,000.00 per month
Schedule: Morning shift
Application Question(s):
- Which backend frameworks have you worked with? (Laravel, CodeIgniter, Core PHP)
- Have you ever integrated a RESTful API in a Flutter application?
- Which of the following best describes your experience with web hosting platforms? Used shared hosting (e.g., Hostinger, Bluehost) / Used VPS (e.g., DigitalOcean, Linode) / Used both shared hosting and VPS
Experience:
- Flutter: 2 years (Preferred)
- VPS hosting and web hosting: 1 year (Preferred)
- PHP, Laravel & CodeIgniter: 1 year (Preferred)
Work Location: In person
Posted 4 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Mandatory: Proficiency in Python with experience in Databricks (PySpark).
Good to Have: Hands-on experience with Apache Airflow. Working knowledge of PostgreSQL and MongoDB. Basic experience with cloud technologies such as Azure, AWS and Google Cloud.
Posted 4 days ago
1.0 - 2.0 years
1 - 4 Lacs
India
On-site
Job Title: AI Developer
Company: Eoxys IT Solution
Location: Jaipur, Rajasthan
Experience: 1–2 Years
Employment Type: Full-Time
Education Qualification: BCA / MCA / B.Tech in Computer Science, IT, or related field
Key Skills Required:
- Strong programming skills in Python
- Hands-on experience with TensorFlow, PyTorch, Keras
- Experience building and deploying end-to-end ML pipelines
- Solid understanding of model evaluation, cross-validation, and hyperparameter tuning
- Familiarity with cloud platforms such as AWS, Azure, or GCP for AI/ML workloads
- Knowledge of MLOps tools like MLflow, DVC, or Apache Airflow
- Exposure to domains like Natural Language Processing (NLP), Computer Vision, or Reinforcement Learning
Roles & Responsibilities:
- Develop, train, and deploy machine learning models for real-world applications
- Implement scalable ML solutions using cloud platforms
- Collaborate with cross-functional teams to integrate AI capabilities into products
- Monitor model performance and conduct regular improvements
- Maintain version control and reproducibility using MLOps practices
Additional Requirements:
- Strong analytical and problem-solving skills
- Passion for learning and implementing cutting-edge AI/ML technologies
- Good communication and teamwork skills
Salary: Based on experience and skillset
Apply now to be a part of our innovative AI journey!
Job Type: Full-time
Pay: ₹15,000.00 - ₹40,000.00 per month
Work Location: In person
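The skills above include cross-validation. The split logic behind k-fold cross-validation can be made explicit in a few lines of pure Python (in practice a library routine such as scikit-learn's KFold would be used):

```python
# k-fold cross-validation index generator: every sample appears in the
# test fold exactly once, and train/test are disjoint in each split.

def k_fold_indices(n_samples, k):
    """Yield (train_indices, test_indices) pairs for k folds."""
    # Distribute the remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

if __name__ == "__main__":
    for train, test in k_fold_indices(10, 3):
        print("train:", train, "test:", test)
```

Model evaluation then averages a metric over the k splits, which gives a lower-variance estimate than a single train/test split; hyperparameter tuning repeats this loop per candidate configuration.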
Posted 4 days ago
7.0 years
0 Lacs
Andhra Pradesh
On-site
P1-C1-STS JD
- 7+ years of experience as a Java developer
- Strong expertise in Spring Boot, Spring Framework, Spring Data, and Spring Cloud
- Experience with Apache Fuse (Red Hat Fuse) or Apache Camel
- Experience with RESTful API design
- Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI), Docker, Kubernetes
- Transform and enrich current APIs or create new APIs
- Familiarity with Agile/Scrum methodologies
Skill levels:
- Java, Fuse and Spring Boot: Advanced – more than 4 years of project experience
- REST APIs: Intermediate – at least 1 year of project experience
- Messaging / event-driven architecture, microservices, Kubernetes / OCP, API Gateway, ADO
Responsibilities:
- Develop, test, and deploy robust Java-based microservices using Spring Boot.
- Build and maintain integration solutions using Apache Fuse (based on Apache Camel).
- Work with RESTful and SOAP-based services, including API development and consumption.
- Design and implement enterprise integration patterns and data transformation logic.
- Collaborate with DevOps for CI/CD, containerization (Docker/Kubernetes), and production deployments.
Mandatory Skills: Spring Boot, Fuse, microservices, REST API, Kubernetes
About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
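The Fuse/Camel work described above centres on enterprise integration patterns. One of them, the content-based router with a preceding enrichment step, can be sketched minimally in Python (Camel expresses the same thing with choice()/when(); all endpoint names, predicates and fields here are illustrative):

```python
# Content-based router sketch: a message goes to the first endpoint
# whose predicate matches, after an enrichment step adds routing data.

def enrich(message, region_lookup):
    """Enricher: add a routing field derived from existing message data."""
    out = dict(message)
    out["region"] = region_lookup.get(message.get("country"), "UNKNOWN")
    return out

def route(message, routes, default=None):
    """Return (endpoint, message) for the first matching predicate."""
    for predicate, endpoint in routes:
        if predicate(message):
            return endpoint, message
    return default, message

if __name__ == "__main__":
    routes = [
        (lambda m: m["region"] == "EU", "eu-endpoint"),
        (lambda m: m["region"] == "APAC", "apac-endpoint"),
    ]
    msg = enrich({"country": "IN", "id": 1}, {"IN": "APAC", "DE": "EU"})
    print(route(msg, routes, default="dead-letter"))
```

Unmatched messages fall through to a default (dead-letter) endpoint, mirroring Camel's otherwise() branch.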
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Junior Full Stack Developer (Java) at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
To be successful as a Junior Full Stack Developer (Java) you should have experience with:
- Proficiency in Java with 3+ years of programming experience, including reading, writing and debugging multi-threaded code and REST services.
- Proven ability to work in a team environment with experience of the full Software Development Lifecycle.
- Demonstrable understanding of Java, J2EE, the Spring Framework and JDBC.
- Working knowledge of REST services / microservices.
- Working knowledge of CI and unit test frameworks.
- Working knowledge of ORM technologies like Hibernate & Spring Data/JPA.
- Working knowledge of tools like Java profilers and analyzing memory dumps.
- Working knowledge of messaging platforms such as MQ and Solace and related design patterns for producing and consuming messages.
- Working knowledge of XML/JSON and related technologies.
- Working knowledge of SQL and database technologies such as MS SQL Server, Oracle, MongoDB.
- Experience working in an Agile or Scrum SDLC model.
Some other highly valued skills may include:
- Knowledge of Apache Kafka, Docker, Kubernetes, NoSQL (MongoDB), React, Angular.
- Familiarity with DevOps fundamentals and practices.
- Proven experience of quality assurance techniques relevant to application development.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.
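The role above combines multi-threaded code with message producing/consuming patterns. The producer/consumer pattern those skills build on can be sketched with a thread-safe queue; shown here in Python's stdlib for brevity (in Java the analogue would be a BlockingQueue, and doubling the item stands in for real message processing):

```python
# Producer/consumer sketch: a producer thread puts work on a
# thread-safe queue, a consumer thread drains it, and a sentinel
# value signals end-of-stream.

import queue
import threading

def producer(q, items):
    for item in items:
        q.put(item)
    q.put(None)  # sentinel: no more work

def consumer(q, results):
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real message handling

def run_pipeline(items):
    """Wire up one producer and one consumer; return processed results."""
    q = queue.Queue()
    results = []
    t_prod = threading.Thread(target=producer, args=(q, items))
    t_cons = threading.Thread(target=consumer, args=(q, results))
    t_prod.start()
    t_cons.start()
    t_prod.join()
    t_cons.join()
    return results

if __name__ == "__main__":
    print(run_pipeline([1, 2, 3]))
```

The queue serialises access, so neither thread needs explicit locking; the same shape underlies MQ/Solace consumers, where the broker plays the role of the queue.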
Purpose of the role: To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.
Accountabilities:
- Development and delivery of high-quality software solutions by using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization’s technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.
Analyst Expectations:
- Meet the needs of stakeholders/customers through specialist advice and support.
- Perform prescribed activities in a timely manner and to a high standard which will impact both the role itself and surrounding roles.
- Likely to have responsibility for specific processes within a team. They may lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Or, for an individual contributor: they manage their own workload, take responsibility for the implementation of systems and processes within their own work area and participate in projects broader than the direct team. Execute work requirements as identified in processes and procedures, collaborating with and impacting the work of closely related teams. Check the work of colleagues within the team to meet internal and stakeholder requirements. Provide specialist advice and support pertaining to own work area. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams. Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise. Make judgements based on practice and previous experience. Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures. Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements. Build relationships with stakeholders/customers to identify and address their needs. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 4 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Enterprise Data & Analytics - Jr Java Developer 3+ years of Java development 1+ years of Apache Camel Good knowledge of SQL Knowledge of UNIX Good understanding of agile development methodologies Good communication skills Additional experience in the following areas would be great to have: Kafka, Python, OCP containers, AWS EC2, Snowflake EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka. Preferred Education Master's Degree Required Technical And Professional Expertise Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on Azure; experience in Databricks/Azure HDInsight/Azure Data Factory, Synapse, SQL Server DB. Good to excellent SQL skills. Preferred Technical And Professional Experience Certification in Azure and Databricks, or Cloudera-certified Spark developers. Experience in Databricks/Azure HDInsight/Azure Data Factory, Synapse, SQL Server DB. Knowledge or experience of Snowflake will be an added advantage.
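The pipeline duties described above (ingest, validate, transform) reduce to record-level logic that, in a role like this, would typically run inside a PySpark DataFrame transformation. The sketch below expresses that cleanse-and-transform step in plain Python so it stays self-contained; the column names (`customer_id`, `country`, `amount`) are hypothetical, not from the posting.

```python
import csv
import io

def cleanse_and_transform(raw_csv: str) -> list[dict]:
    """Parse raw CSV input, drop rows missing the business key, and
    normalize fields. In a PySpark job the same logic would be applied
    per-row across a distributed DataFrame; plain Python is used here
    purely for illustration."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        # Data-validation step: reject records without a customer key.
        if not rec.get("customer_id"):
            continue
        rows.append({
            "customer_id": rec["customer_id"].strip(),
            "country": rec["country"].strip().upper(),  # standardize codes
            "amount": round(float(rec["amount"]), 2),   # cast and round
        })
    return rows

raw = "customer_id,country,amount\n42,in,10.5\n,us,3.0\n7,us,1.239\n"
clean = cleanse_and_transform(raw)
print(clean)
```

The same shape of function ports directly to `DataFrame.filter` plus `withColumn` calls once the data volume forces it onto Spark.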
Posted 4 days ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance. The opportunity We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Have proven experience (10-15 years) in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Need to work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. Need to understand current and future-state enterprise architecture. Need to contribute to various technical streams during implementation of the project. 
Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/big data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. 
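The real-time ingestion skills listed above (Kafka consumption plus Spark Streaming) come down to windowed aggregation over a stream of keyed events. As a minimal sketch, the function below performs a tumbling-window sum over simulated events in plain Python, since a Kafka broker and Spark cluster are not assumed here; in Spark Streaming the same computation would be a `groupBy(window(...), key).sum()`.

```python
from collections import defaultdict

def aggregate_by_window(events, window_secs=60):
    """Tumbling-window aggregation: bucket (timestamp, key, value) events
    into fixed windows and sum values per (window, key). This mirrors the
    computation a Spark Streaming job runs over messages consumed from
    Kafka partitions, minus the distribution and fault tolerance."""
    windows = defaultdict(float)
    for ts, key, value in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        windows[(window_start, key)] += value
    return dict(windows)

# Simulated events interleaved as they might arrive from two partitions.
events = [(5, "EUR", 10.0), (30, "EUR", 5.0), (65, "USD", 2.0), (70, "EUR", 1.0)]
result = aggregate_by_window(events)
print(result)
```

Because each event maps to exactly one window, the aggregation is order-independent, which is why interleaved arrival across partitions does not change the result.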
Experience in performance benchmarking enterprise applications. Experience in data security [on the move, at rest]. Strong UNIX operating system concepts and shell scripting knowledge. To qualify for the role, you must have Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the big data solution delivery life cycle including analysis, design, development, testing, production deployment, and support. Responsible for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one of the cloud platforms: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience. Ideally, you’ll also have Strong project management skills Client management skills Solutioning skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. 
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 4 days ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us. Your Role And Responsibilities As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you. 
To help achieve this win-win outcome, a ‘day-in-the-life’ of this opportunity may include, but not be limited to… Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration and product support with a sense of urgency. Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations. Technical Solution Workshops: Conducting and participating in technical solution workshops. Building Effective Relationships: Developing successful relationships at all levels—from engineers to CxOs—with experience of navigating challenging debate to reach healthy resolutions. Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity and initiative, in addition to navigating data and people to find answers and present solutions. Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM teams. Preferred Education Bachelor's Degree Required Technical And Professional Expertise In-depth knowledge of the IBM Data & AI portfolio. 
15+ years of experience in software services. 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms. Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines. 10+ years’ experience with ETL and database technologies. Experience in architectural planning and implementation for the upgrade/migration of these specific products. Experience in designing and implementing Data Quality solutions. Experience with installation and administration of these products. Excellent understanding of cloud concepts and infrastructure. Excellent verbal and written communication skills are essential. Preferred Technical And Professional Experience Experience with any of the DataStage, Informatica, SAS or Talend products. Experience with any of IKC, IGC, Axon. Experience with programming languages like Java/Python. Experience with the AWS, Azure, Google or IBM cloud platforms. Experience with Red Hat OpenShift. Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA
Posted 4 days ago
8.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY-Consulting - Data and Analytics – Manager - Data Integration Architect – Medidata Platform Integration EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. The opportunity We’re looking for an experienced Data Integration Architect with 8+ years in clinical or life sciences domains to lead the integration of Medidata platforms into enterprise clinical trial systems. This role offers the chance to design scalable, compliant data integration solutions, collaborate across global R&D systems, and contribute to data-driven innovation in the healthcare and life sciences space. You will play a key role in aligning integration efforts with organizational architecture and compliance standards while engaging with stakeholders to ensure successful project delivery. Your Key Responsibilities Design and implement scalable integration solutions for large-scale clinical trial systems involving Medidata platforms. 
Ensure integration solutions comply with regulatory standards such as GxP and CSV. Establish and maintain seamless system-to-system data exchange using middleware platforms (e.g., Apache Kafka, Informatica) or direct API interactions. Collaborate with cross-functional business and IT teams to gather integration requirements and translate them into technical specifications. Align integration strategies with enterprise architecture and data governance frameworks. Provide support to program management through data analysis, integration status reporting, and risk assessment contributions. Interface with global stakeholders to ensure smooth integration delivery and resolve technical challenges. Mentor junior team members and contribute to knowledge sharing and internal learning initiatives. Participate in architectural reviews and provide recommendations for continuous improvement and innovation in integration approaches. Support business development efforts by contributing to solution proposals, proofs of concept (POCs), and client presentations. Skills And Attributes For Success Use a solution-driven approach to design and implement compliant integration strategies for clinical data platforms like Medidata. Strong communication, stakeholder engagement, and documentation skills, with experience presenting complex integration concepts clearly. Proven ability to manage system-to-system data flows using APIs or middleware, ensuring alignment with enterprise architecture and regulatory standards. To qualify for the role, you must have Experience: Minimum 8 years in data integration or architecture roles, with a strong preference for experience in clinical research or life sciences domains. Education: Must be a graduate, preferably BE/B.Tech/BCA/BSc IT. Technical Skills: Hands-on expertise in one or more integration platforms such as Apache Kafka, Informatica, or similar middleware technologies; experience in implementing API-based integrations. 
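In practice, the system-to-system exchange described above usually involves mapping a source payload (for example, an EDC export) into a canonical internal schema before handing it to middleware such as a Kafka topic. The sketch below shows that validate-and-map step; the field names (`studyId`, `subjectKey`, `folderName`) are invented for illustration and do not reflect any actual Medidata API contract.

```python
import json

def to_canonical(payload: str) -> dict:
    """Validate a hypothetical EDC JSON export and map it to a canonical
    clinical-trial record. In an integration layer, a step like this sits
    between the source API and the middleware topic, so downstream systems
    only ever see one schema."""
    src = json.loads(payload)
    required = ("studyId", "subjectKey")
    missing = [f for f in required if f not in src]
    if missing:
        # Fail fast on incomplete records rather than propagate bad data.
        raise ValueError(f"missing fields: {missing}")
    return {
        "study_id": src["studyId"],
        "subject_id": src["subjectKey"],
        "visit": src.get("folderName", "UNSCHEDULED"),  # optional field
    }

record = to_canonical('{"studyId": "S-01", "subjectKey": "001", "folderName": "V1"}')
print(record)
```

Centralising the mapping like this is what makes the GxP/CSV audit story tractable: every transformation of regulated data happens in one reviewable place.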
Domain Knowledge: In-depth understanding of clinical trial data workflows, integration strategies, and regulatory frameworks including GxP and CSV compliance. Soft Skills: Strong analytical thinking, effective communication, and stakeholder management skills with the ability to collaborate across business and technical teams. Additional Attributes: Ability to work independently in a fast-paced environment, lead integration initiatives, and contribute to solution design and architecture discussions. Ideally, you’ll also have Hands-on experience with ETL tools and clinical data pipeline orchestration frameworks. Familiarity with broader clinical R&D platforms such as Oracle Clinical, RAVE, or other EDC systems. Prior experience leading small integration teams and working directly with cross-functional stakeholders in regulated environments. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries. What working at EY offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. 
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you About EY As a global leader in assurance, tax, transaction and Consulting services, we’re using the finance products, expertise and systems we’ve developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we’ll make our ambition to be the best employer by 2020 a reality. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 4 days ago
12.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Title Data Architect Type of Employment Permanent Overall Years Of Experience 12-15 years Relevant Years Of Experience 10+ Data Architect The Data Architect is responsible for designing and implementing data architecture for multiple projects, and also builds strategies for data governance. Position Summary 12-15 years of experience in a similar profile with a strong service delivery background. Experience as a Data Architect with a focus on Spark and Data Lake technologies. Experience in Azure Synapse Analytics. Proficiency in Apache Spark for large-scale data processing. Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services. Strong understanding of data modeling, ETL processes, and data warehousing principles. Implement a data governance framework with Unity Catalog. Knowledge of designing scalable streaming data pipelines using Azure Event Hubs, Azure Stream Analytics and Spark Streaming. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro. Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc. Knowledge of machine learning pipelines, GenAI and LLMs will be a plus. Excellent analytical, problem-solving, and technical leadership skills. 
Experience in integration with business intelligence tools such as Power BI. Effective communication and collaboration abilities. Excellent interpersonal skills and a collaborative management style. Own and delegate responsibilities effectively. Ability to analyse and suggest solutions. Strong command of verbal and written English. Essential Roles and Responsibilities Work as a Data Architect, able to design and implement data architecture for projects with complex data such as big data and data lakes. Work with customers to define strategy for data architecture and data governance. Guide the team to implement solutions around data engineering. Proactively identify risks and communicate them to stakeholders. Develop strategies to mitigate risks. Build best practices to enable faster service delivery. Build reusable components to reduce cost. Build scalable and cost-effective architecture. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 4 days ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
Job Title: Data Engineer (4+ Years Experience) Location: Pan India Job Type: Full-Time Experience: 4+ Years Notice Period: Immediate to 30 days preferred Job Summary We are looking for a skilled and motivated Data Engineer with 4+ years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization. Key Responsibilities Design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python. Implement and manage data models and workflows in AWS Redshift. Work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions. Perform data validation, cleansing, and transformation to ensure high data quality. Build and maintain automation scripts and jobs using Lambda and Glue (if applicable). Ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3). Participate in data architecture and platform design discussions. Monitor pipeline performance, troubleshoot issues, and ensure data reliability. Document data workflows, processes, and infrastructure components. Required Skills 4+ years of hands-on experience as a Data Engineer. Strong proficiency in AWS Redshift including schema design, performance tuning, and SQL development. Expertise in Python and PySpark for data manipulation and pipeline development. Experience working with structured and semi-structured data (JSON, Parquet, etc.). Deep knowledge of data warehouse design principles including star/snowflake schemas and dimensional modeling. Good To Have Working knowledge of AWS Glue and building serverless ETL pipelines. Experience with AWS Lambda for lightweight processing and orchestration. 
Exposure to ETL tools like Informatica, Talend, or Apache NiFi. Familiarity with workflow orchestrators (e.g., Airflow, Step Functions). Knowledge of DevOps practices, version control (Git), and CI/CD pipelines. Preferred Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
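The dimensional-modeling skills this posting asks for (star schemas, surrogate keys) mostly come down to one recurring load step: replacing natural keys in staged fact records with surrogate keys looked up from a dimension table before loading into the warehouse. A minimal sketch with invented table contents; the key names and the late-arriving-record policy are illustrative assumptions, not from the posting.

```python
def build_fact_rows(staged, dim_customer):
    """Resolve natural customer keys to surrogate keys (star-schema load).

    Rows whose natural key has no dimension entry are routed to an error
    list instead of the fact table, one common policy for late-arriving
    dimension members."""
    facts, errors = [], []
    for rec in staged:
        sk = dim_customer.get(rec["customer_id"])
        if sk is None:
            errors.append(rec)  # quarantine for reprocessing
            continue
        facts.append({"customer_sk": sk, "amount": rec["amount"]})
    return facts, errors

dim = {"C1": 1001, "C2": 1002}  # natural key -> surrogate key
staged = [{"customer_id": "C1", "amount": 5.0},
          {"customer_id": "C9", "amount": 7.5}]
facts, errors = build_fact_rows(staged, dim)
print(facts, errors)
```

In a Redshift pipeline the same resolution is typically done set-wise with a staging-table `JOIN` rather than row by row, but the correctness rule is identical: no fact row lands without a valid surrogate key.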
Posted 4 days ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY-Consulting - Data and Analytics – Manager - Data Integration Architect – Medidata Platform Integration EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. The opportunity We’re looking for an experienced Data Integration Architect with 8+ years in clinical or life sciences domains to lead the integration of Medidata platforms into enterprise clinical trial systems. This role offers the chance to design scalable, compliant data integration solutions, collaborate across global R&D systems, and contribute to data-driven innovation in the healthcare and life sciences space. You will play a key role in aligning integration efforts with organizational architecture and compliance standards while engaging with stakeholders to ensure successful project delivery. Your Key Responsibilities Design and implement scalable integration solutions for large-scale clinical trial systems involving Medidata platforms. 
Ensure integration solutions comply with regulatory standards such as GxP and CSV. Establish and maintain seamless system-to-system data exchange using middleware platforms (e.g., Apache Kafka, Informatica) or direct API interactions. Collaborate with cross-functional business and IT teams to gather integration requirements and translate them into technical specifications. Align integration strategies with enterprise architecture and data governance frameworks. Provide support to program management through data analysis, integration status reporting, and risk assessment contributions. Interface with global stakeholders to ensure smooth integration delivery and resolve technical challenges. Mentor junior team members and contribute to knowledge sharing and internal learning initiatives. Participate in architectural reviews and provide recommendations for continuous improvement and innovation in integration approaches. Support business development efforts by contributing to solution proposals, proofs of concept (POCs), and client presentations. Skills And Attributes For Success Use a solution-driven approach to design and implement compliant integration strategies for clinical data platforms like Medidata. Strong communication, stakeholder engagement, and documentation skills, with experience presenting complex integration concepts clearly. Proven ability to manage system-to-system data flows using APIs or middleware, ensuring alignment with enterprise architecture and regulatory standards. To qualify for the role, you must have Experience: Minimum 8 years in data integration or architecture roles, with a strong preference for experience in clinical research or life sciences domains. Education: Must be a graduate, preferably BE/B.Tech/BCA/BSc IT. Technical Skills: Hands-on expertise in one or more integration platforms such as Apache Kafka, Informatica, or similar middleware technologies; experience in implementing API-based integrations. 
- Domain Knowledge: In-depth understanding of clinical trial data workflows, integration strategies, and regulatory frameworks including GxP and CSV compliance.
- Soft Skills: Strong analytical thinking, effective communication, and stakeholder management skills, with the ability to collaborate across business and technical teams.
- Additional Attributes: Ability to work independently in a fast-paced environment, lead integration initiatives, and contribute to solution design and architecture discussions.

Ideally, you’ll also have

- Hands-on experience with ETL tools and clinical data pipeline orchestration frameworks.
- Familiarity with broader clinical R&D platforms such as Oracle Clinical, RAVE, or other EDC systems.
- Prior experience leading small integration teams and working directly with cross-functional stakeholders in regulated environments.

What We Look For

- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:

- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

About EY

As a global leader in assurance, tax, transaction and Consulting services, we’re using the finance products, expertise and systems we’ve developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we’ll make our ambition to be the best employer by 2020 a reality.

If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 4 days ago
12.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description

Title: Data Architect
Type of Employment: Permanent
Overall Years of Experience: 12-15 years
Relevant Years of Experience: 10+

Data Architect

The Data Architect is responsible for designing and implementing data architecture for multiple projects, and for building strategies for data governance.

Position Summary

- 12-15 yrs of experience in a similar profile with a strong service delivery background
- Experience as a Data Architect with a focus on Spark and Data Lake technologies
- Experience in Azure Synapse Analytics
- Proficiency in Apache Spark for large-scale data processing
- Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services
- Strong understanding of data modeling, ETL processes, and data warehousing principles
- Experience implementing a data governance framework with Unity Catalog
- Knowledge of designing scalable streaming data pipelines using Azure Event Hub, Azure Stream Analytics, and Spark Streaming
- Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro
- Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc.
- Knowledge of Machine Learning pipelines, GenAI, and LLMs will be a plus
- Excellent analytical, problem-solving, and technical leadership skills
- Experience in integration with business intelligence tools such as Power BI
- Effective communication and collaboration abilities
- Excellent interpersonal skills and a collaborative management style
- Ability to own and delegate responsibilities effectively
- Ability to analyse and suggest solutions
- Strong command of verbal and written English

Essential Roles and Responsibilities

- Work as a Data Architect, designing and implementing data architecture for projects involving complex data such as Big Data and data lakes
- Work with customers to define strategy for data architecture and data governance
- Guide the team in implementing data engineering solutions
- Proactively identify risks, communicate them to stakeholders, and develop strategies to mitigate them
- Build best practices to enable faster service delivery
- Build reusable components to reduce cost
- Build scalable and cost-effective architecture
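The streaming-pipeline skill listed above (Azure Event Hub / Azure Stream Analytics / Spark Streaming) centers on windowed aggregation of timestamped event streams. As a library-free illustration of the core idea only — not the Spark or Azure API, and with all names hypothetical — here is a minimal tumbling-window counter in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (epoch_seconds, key) events into fixed, non-overlapping
    windows and count occurrences of each key per window.

    A real pipeline would express the same aggregation with, e.g.,
    Spark Structured Streaming's window() over an Event Hub source;
    this is a simplified sketch of the semantics, not the API.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event falls into exactly one window [start, start + window).
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start][key] += 1
    # Return plain dicts, ordered by window start time.
    return {w: dict(keys) for w, keys in sorted(counts.items())}

# Hypothetical click events spanning two 60-second windows.
events = [(0, "home"), (10, "cart"), (59, "home"), (60, "home"), (119, "cart")]
print(tumbling_window_counts(events, 60))
# {0: {'home': 2, 'cart': 1}, 60: {'home': 1, 'cart': 1}}
```

In a production design the same grouping runs incrementally over an unbounded stream with watermarking for late events, which is where engines like Spark Structured Streaming earn their keep.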
Posted 4 days ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity

We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities

- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success

- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP/Big Data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
- Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security [in transit, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- Ability to multi-task under pressure and work independently with minimal supervision.
- A team player who enjoys working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- Minimum 7 years hands-on experience in one or more of the above areas.
- Minimum 10 years industry experience.

Ideally, you’ll also have

- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.
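The Kafka skill named above (multiple Spark jobs consuming from multiple partitions) rests on how a consumer group spreads partitions across its members. As a simplified, library-free sketch of one real strategy (round-robin assignment) — with the partition ids and job names below purely hypothetical — the idea can be modeled in a few lines of Python:

```python
def assign_partitions(partitions, consumers):
    """Round-robin assignment of partition ids to consumers,
    mimicking (in simplified form) how a Kafka consumer group
    spreads topic partitions across its members.

    Within one group, each partition goes to exactly one consumer,
    which is why per-partition message order is preserved.
    """
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Five partitions spread across two hypothetical Spark jobs.
print(assign_partitions([0, 1, 2, 3, 4], ["job-a", "job-b"]))
# {'job-a': [0, 2, 4], 'job-b': [1, 3]}
```

Real Kafka clients ship several assignors (range, round-robin, sticky) and rebalance automatically when consumers join or leave; this sketch only shows the static round-robin case.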
Posted 4 days ago
0.0 - 15.0 years
83 - 104 Lacs
Delhi, Delhi
On-site
Job Title: Data Architect (Leadership Role)
Company: Wingify
Location: Delhi (outstation candidates allowed)
Experience Required: 10-15 years
Working Days: 5 days/week
Budget: ₹83 Lakh to ₹1.04 Cr

About Us

We are a fast-growing product-based tech company known for our flagship product VWO, a widely adopted A/B testing platform used by over 4,000 businesses globally, including Target, Disney, Sears, and Tinkoff Bank. The team is self-organizing, highly creative, and passionate about data, tech, and continuous innovation.

Company Size: Mid-Sized
Industry: Consumer Internet, Technology, Consulting

Role & Responsibilities

- Lead and mentor a team of Data Engineers, ensuring performance and career development.
- Architect scalable and reliable data infrastructure with high availability.
- Define and implement data governance frameworks, compliance, and best practices.
- Collaborate cross-functionally to execute the organization’s data roadmap.
- Optimize data processing workflows for scalability and cost efficiency.
- Ensure data quality, privacy, and security across platforms.
- Drive innovation and technical excellence across the data engineering function.

Ideal Candidate Must-Haves

Experience:
- 10+ years in software/data engineering roles.
- At least 2-3+ years in a leadership role managing teams of 5+ Data Engineers.
- Proven hands-on experience setting up data engineering systems from scratch (0 → 1 stage) in high-growth B2B product companies.

Technical Expertise:
- Strong in Java (preferred), or Python, Node.js, GoLang.
- Expertise in big data tools: Apache Spark, Kafka, Hadoop, Hive, Airflow, Presto, HDFS.
- Strong design experience in High-Level Design (HLD) and Low-Level Design (LLD).
- Backend frameworks like Spring Boot, Google Guice.
- Cloud data platforms: AWS, GCP, Azure.
- Familiarity with data warehousing: Snowflake, Redshift, BigQuery.
- Databases: Redis, Cassandra, MongoDB, TiDB.
- DevOps tools: Jenkins, Docker, Kubernetes, Ansible, Chef, Grafana, ELK.
Other Skills:
- Strong understanding of data governance, security, and compliance (GDPR, SOC2, etc.).
- Proven strategic thinking with the ability to align technical architecture to business objectives.
- Excellent communication, leadership, and stakeholder management.

Preferred Qualifications

- Exposure to Machine Learning infrastructure/MLOps.
- Experience with real-time data analytics.
- Strong foundation in algorithms, data structures, and scalable systems.
- Previous work in SaaS or high-growth startups.

Screening Questions

- Do you have team leadership experience? How many engineers have you led?
- Have you built a data engineering platform from scratch? Describe the setup.
- What’s the largest data scale you’ve worked with, and where?
- Are you open to continuing hands-on coding in this role?

Interested candidates can apply at deepak.visko@gmail.com or 9238142824.

Job Types: Full-time, Permanent
Pay: ₹8,300,000.00 - ₹10,400,000.00 per year
Work Location: In person
Posted 4 days ago