8.0 - 13.0 years
30 - 35 Lacs
Chennai
Work from Office
Job Summary: This position is on the Connectivity Enabled Solutions Team in the Connectivity Department within the CAT Digital organization. This team is accountable for building common datasets to support Connectivity initiatives, streamlining the reporting of data from various sources, enhancing business logic to improve data reporting, delivering visualized data in a consistent way for internal and external business users, and building solutions to monitor and improve telematics data quality. Responsibilities: Contributes to design, development, code review, and deployment. Understands the business requirements for new features, works with the business and business analysts to gather and refine requirements, and recommends changes that could have business impact. Competent to perform all programming, project management, and development assignments without close supervision; normally assigned the more complex aspects of systems work. Prepares technical design documents based on the business requirements. Designs and builds new features, leveraging out-of-the-box components as well as extending/customizing where necessary to meet business needs. Works directly on complex application/technical problem identification and resolution. Leads development and unit/integration testing of new features for the scrum team. Performs code reviews, performance assessments, and architecture discussions, ensuring the team's overall code quality and velocity. For newly released features, works with the support team to address critical production issues in a timely fashion. Works independently on complex systems or infrastructure components that may be used by one or more systems. Drives application development focused on delivering valuable business features. Mentors and assists software engineers, providing technical assistance and direction as needed. Identifies and encourages areas for growth and improvement within the team. Basic Qualifications: Educational Background: A four-year degree from an accredited college or university is required. A master's degree in computer science or a related field is preferred. Python Development: 8+ years of experience in designing and developing software applications in Python or Java. SQL: 6+ years of experience with SQL, relational databases such as MySQL, PostgreSQL, and Snowflake, and NoSQL databases. Database Management: 5+ years of experience in database design, development, and administration; familiarity with database replication, clustering, and high-availability solutions. Big Data and Data Modeling: 3+ years of hands-on experience in data modeling, ETL processes, data pipeline development, and data warehousing. Cloud Database Services: 3+ years of experience with cloud-based database services, particularly within AWS, Azure, or Google Cloud environments. API: 4+ years of experience designing, building, and integrating REST and SOAP APIs; familiarity with design patterns is essential. Strong understanding and/or experience in many of the following: Batch or stream processing systems such as Apache Spark, Flink, Akka, Storm. Message brokers such as Kafka, AWS SQS, AWS SNS, Apache ActiveMQ, Kinesis. AWS environment, such as Lambda functions, SQS, Glue, Step Functions. Python data libraries such as Pandas, NumPy, Airflow, PySpark, Matplotlib, etc. Advanced SQL such as CTEs, functions, procedures, pivoting, window functions, performance tuning, etc. (an illustrative sketch of this kind of PySpark/window-function work appears at the end of this listing). Snowflake environment such as tasks, procedures, streams, Snowpark, Streamlit, AI/ML studio, etc.
Defining and integrating CI/CD pipelines in the development process. Datastores such as Snowflake, MongoDB, Cassandra, Redis, Elasticsearch, MySQL, Oracle, etc. Container platforms and orchestration such as Docker, ECR, Kubernetes, Podman, etc. Git repositories, code versioning, development and integration of CI/CD pipelines, etc. This position requires working onsite five days a week. Role Definition: Performs implementation, regular problem solving, maintenance, and support for agile software development. Responsibilities: Designing, modifying, developing, writing, and implementing software programming applications for target systems using agile methods. Acquiring client requirements; resolving workflow problems through automation and optimization. Writing source code for new applications, and/or generating and enhancing code samples for existing applications. Utilizing automated testing tools to perform testing and maintenance. Skill Descriptors: Decision Making and Critical Thinking: Knowledge of the decision-making process and associated tools and techniques; ability to accurately analyze situations and reach productive decisions based on informed judgment. Level Working Knowledge: Applies an assigned technique for critical thinking in a decision-making process. Identifies, obtains, and organizes relevant data and ideas. Participates in documenting data, ideas, players, stakeholders, and processes. Recognizes, clarifies, and prioritizes concerns. Assists in assessing risks, benefits, and consideration of alternatives. Effective Communications: Understanding of effective communication concepts, tools, and techniques; ability to effectively transmit, receive, and accurately interpret ideas, information, and needs through the application of appropriate communication behaviors. Level Working Knowledge: Delivers helpful feedback that focuses on behaviors without offending the recipient. Listens to feedback without defensiveness and uses it for own communication effectiveness. Makes oral presentations and writes reports needed for own work. Avoids technical jargon when inappropriate. Looks for and considers non-verbal cues from individuals and groups. Software Development: Knowledge of software development tools and activities; ability to produce software products or systems in line with product requirements. Level Extensive Experience: Conducts walkthroughs and monitors effectiveness and quality of the development activities. Elaborates on multiple development toolkits for traditional and web-based software. Has participated in development of multiple or large software products. Contrasts advantages and drawbacks of different development languages and tools. Estimates and monitors development costs based on functional and technical requirements. Provides consulting on both selection and utilization of developers' workbench tools. Software Development Life Cycle: Knowledge of the software development life cycle; ability to use a structured methodology for delivering and managing new or enhanced software products to the marketplace. Level Working Knowledge: Describes similarities and differences of the life cycle for new product development vs. new release. Identifies common issues, problems, and considerations for each phase of the life cycle. Works with a formal life cycle methodology. Explains phases, activities, dependencies, deliverables, and key decision points. Interprets product development plans and functional documentation.
Software Integration Engineering: Knowledge of software integration processes and functions; ability to design, develop and maintain interfaces and linkage to alternative platforms and software packages. Level Working Knowledge: Has experience with designing data exchange interfaces to and from a software product. Describes tools and techniques for extraction, transformation, and loading of electronic data. Cites examples of common linkage requirements for software products and vendors. Works with integrating software into the customer or partner framework and infrastructure. Participates in the development of technology interfaces and bridges. Software Product Design/Architecture: Knowledge of software product design; ability to convert market requirements into the software product design. Level Extensive Experience: Demonstrates experience with the architecture and design of major or multiple products. Describes major software architecture alternatives and considerations. Explains design considerations for commercial database systems, operating systems, and the web. Displays experience in estimating the cost of a specific design of a proposed product. Facilitates design reviews and walkthroughs. Analyzes benefits and drawbacks of specific software designs and architecture. Software Product Technical Knowledge: Knowledge of technical aspects of a software product; ability to design, configure and integrate technical aspects of software products. Level Working Knowledge: Maintains and utilizes data related to install base configurations and environments. Solicits customer feedback; reports and monitors bugs and implementation issues. Participates in defining and conducting technical acceptance tests. Participates in creating technical requirements for software development and deployment. Explains basic environment and product configuration options. Software Product Testing: Knowledge of software product testing; ability to design, plan, and execute testing strategies and tactics to ensure software product quality and adherence to stated requirements. Level Working Knowledge: Participates in test readiness reviews, functional, volume, and load testing. Describes key features and aspects of a specific testing discipline or methodology. Tests software components for compliance with functional requirements and design specifications. Explains procedures for documenting test activities and results (e.g., errors, non-conformance, etc.). Conducts functional and performance testing on aspects of assigned products.
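As a loose illustration of the PySpark and window-function skills referenced in the qualifications above (this sketch is not part of the listing; the dataset, column names, and ranking logic are hypothetical):

```python
# Illustrative only: rank hypothetical telematics readings per asset with a
# window function and keep the latest reading for each asset.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telematics-demo").getOrCreate()

readings = spark.createDataFrame(
    [("asset-1", "2024-01-01", 71.2),
     ("asset-1", "2024-01-02", 73.9),
     ("asset-2", "2024-01-01", 64.5)],
    ["asset_id", "reading_date", "engine_temp"],
)

# Window function: number readings per asset, newest first, then keep row 1.
w = Window.partitionBy("asset_id").orderBy(F.col("reading_date").desc())
latest = (readings
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))
latest.show()
```

The same partition/order/rank pattern is what the "window functions" requirement typically refers to on the SQL side, e.g. ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...).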
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
FEQ426R101 As a Senior Solutions Architect, you will shape the future of the Data & AI landscape by working with the most sophisticated data engineering and data science teams in the world. You will be a technical advisor internally to the sales team, and work with the product team as an advocate of your customers in the field. You will help our customers achieve tangible data-driven outcomes through the use of our Databricks Lakehouse Platform, helping data teams complete projects and integrate our platform into their enterprise ecosystem. You'll grow as a leader in your field, while finding solutions to our customers' biggest challenges in big data, analytics, data engineering and data science. Reporting to the Field Engineering Manager, you will collaborate with our most strategic prospects and customers, work directly with product and engineering to drive the Databricks roadmap forward, and work with the broader customer-facing team to develop architectures and solutions using our platform. You will guide customers through the competitive landscape, best practices, and implementation, and develop technical champions along the way. The impact you will have: You will partner with the sales team and provide technical leadership to help customers understand how Databricks can help solve their business problems. You will work directly with the sales team to develop your book of business, define account strategies, and execute those strategies to help your customers and prospects solve their business problems with Databricks. Consult on Big Data architectures, implement proofs of concept for strategic projects spanning data engineering, data science and machine learning, and SQL analysis workflows, as well as validating integrations with cloud services, homegrown tools, and other third-party applications. Collaborate with your fellow Solutions Architects, using your skills to support each other and our users. Become an expert in, promote, and recruit contributors for Databricks-inspired open-source projects (Spark, Delta Lake, and MLflow) across the developer community. What we look for: 8+ years in a data engineering, data science, technical architecture, or similar pre-sales/consulting role. 8+ years of experience with Big Data technologies, including Apache Spark, AI, Data Science, Data Engineering, Hadoop, Cassandra, and others. Strong consulting / customer-facing experience, working with external clients across a variety of industry markets. Coding experience in Python, R, Java, Apache Spark or Scala. Experience building distributed data systems. Have built solutions with public cloud providers such as AWS, Azure, or GCP. Available to travel to customers in your region. Nice to have: Databricks Certification. About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.
Our Commitment to Diversity and Inclusion. Compliance: If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 1 month ago
8.0 - 12.0 years
12 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string, and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address them. Retests software corrections to ensure problems are resolved. Documents the evolution of testing procedures for future replication. May conduct performance and scalability testing. RESPONSIBILITIES: Plans, conducts, and leads assignments, generally involving moderate to high budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment, and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates, and researches test results; documents the evolution of testing scripts for future replication. Identifies, recommends, and implements changes to enhance the effectiveness of quality assurance strategies. Additional Details: 8-12 years of experience in software application development with BE/B.Tech/MCA. Strong experience in design and development using microservices architecture (using Java/Spring Boot, Angular, Python, etc.) for deployment to cloud technologies (preferably Pivotal Cloud Foundry). Deep knowledge and experience in the design and development of applications on both relational (Oracle) and NoSQL (MongoDB, Cassandra, etc.) databases. Proven overall architecture and design skills and the ability to find creative, scalable solutions to complex problems. Desirable Requirements: Self-directed with a strong sense of ownership, excellent communication skills, and capable of working effectively in a dynamic environment. Familiarity with XP/Agile/SCRUM development methodologies and software development lifecycle principles.
Posted 1 month ago
9.0 - 13.0 years
30 - 45 Lacs
Noida
Work from Office
Job Summary: The role involves designing and developing solutions to support the business needs. Optimizing and tuning existing programs and developing new routines will be an integral part of the profile. Key Responsibility Areas: Architect, design, and develop solutions to support business requirements. Use your skill set to analyze and manage a variety of database environments such as Oracle, Postgres, Cassandra, MySQL, Graph DB, etc. Provide optimal design of database environments, analysing complex distributed production deployments, and making recommendations to optimize performance. Work closely with programming teams to deliver high-quality software. Provide innovative solutions to complex business and technology problems. Propose the best solutions for logical and physical data modelling. Perform administration tasks, including DB resource planning and DB tuning. Mentor and train junior developers; lead and manage teams. Skill Sets / Requirements: Experience designing/architecting database solutions. Experience with multiple RDBMS and NoSQL databases at TB data scale, preferably Oracle, PostgreSQL, Cassandra, and Graph DB. Must be well versed in PL/SQL and PostgreSQL, with strong query optimization skills. Expert knowledge of DB installation, configuration, replication, upgrades, security, and HADR setup. Experience with database deployment, performance tuning, and troubleshooting issues. Knowledge of scripting languages (such as Unix shell, PHP). Advanced knowledge of PostgreSQL will be preferred. Experience working with cloud platforms and services. Experience with migrating database environments from one platform to another. Ability to work well under pressure. Experience with big data technologies and DWH is a plus. Experience: 10+ years of experience in a Data Engineering role. Bachelor's degree in Computer Science or related experience. Qualification: B.Tech. Location: Noida Sector 135
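As a loose illustration of the query-optimization workflow this listing mentions (not part of the listing; the connection settings, table, and index are hypothetical), one common step is inspecting a PostgreSQL execution plan from Python before and after adding an index:

```python
# Illustrative only: print a PostgreSQL query plan with EXPLAIN ANALYZE.
# Connection details, table, and index names are hypothetical.
import psycopg2

conn = psycopg2.connect(dbname="appdb", user="report_ro", host="localhost")
with conn, conn.cursor() as cur:
    # Re-run after e.g. CREATE INDEX idx_orders_cust ON orders (customer_id, created_at);
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT customer_id, count(*)
        FROM orders
        WHERE created_at >= now() - interval '30 days'
        GROUP BY customer_id;
    """)
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```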
Posted 1 month ago
5.0 - 10.0 years
8 - 13 Lacs
Bengaluru
Work from Office
As a Senior R&D Engineer, you will be responsible for designing, developing, and maintaining high-quality software solutions, with expertise in Java and Spring Boot. You also have experience in UML modeling, JSON schema, and NoSQL databases. You should have strong skills in cloud-native development and microservices architecture, with additional knowledge in scripting and Helm charts. You have: Bachelor's or master's degree in computer science, Engineering, or a related field. 5+ years of experience in software development with a focus on Java and Spring Boot. Exposure to CI/CD tools (Jenkins, GitLab CI). It would be nice if you also had: Understanding of RESTful API design and implementation. Relevant certifications (e.g., AWS Certified Developer, Oracle Certified Professional Java SE) are a plus. Knowledge of container orchestration and management. Familiarity with Agile development methodologies. Design and develop high-quality applications using Java and Spring Boot, implementing and maintaining RESTful APIs and microservices. Create and maintain UML diagrams for software architecture, define and manage JSON schemas, and optimize NoSQL databases like Neo4j, MongoDB, and Cassandra for efficient data handling. Develop and deploy cloud-native applications using AWS, Azure, or OCP, ensuring scalability and resilience in microservices environments. Manage Kubernetes deployments with Helm charts, collaborate with DevOps teams to integrate CI/CD pipelines, and automate tasks using Python and Bash scripting. Ensure efficient data storage and retrieval, optimize system performance, and support automated deployment strategies. Maintain comprehensive documentation for software designs, APIs, and deployment processes, ensuring clarity and accessibility.
Posted 1 month ago
5.0 - 10.0 years
8 - 13 Lacs
Bengaluru
Work from Office
As a Senior R&D Engineer, you'll be at the forefront of designing and developing cutting-edge, cloud-native applications using Java and Spring Boot. You'll have the chance to craft customer-focused solutions that make a real impact, leveraging microservices architecture, NoSQL databases, and containerized deployments to power high-performance enterprise applications. You have: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in software development, specializing in Java and Spring Boot, with expertise in microservices architecture. Expertise in NoSQL databases (Neo4j / MongoDB / Cassandra), RESTful API design, containerization technologies (Docker / Kubernetes / Helm), and scripting languages (Python / Bash) for automation. Proficiency in software modeling (UML, JSON schema design), DevOps practices, and tracing tools (Wireshark, tshark) for debugging and performance tuning. It would be nice if you also have: Exposure to cloud platforms (AWS, Azure, OCP) and CI/CD tools (Jenkins, GitLab CI). Develop and optimize enterprise applications using Java and Spring Boot. Design and implement JSON schemas, UML models, and NoSQL databases (Neo4j, MongoDB, Cassandra). Build and deploy cloud-native solutions on AWS, Azure, or OCP. Implement microservices architecture, using Docker, Kubernetes, and Helm charts for deployment. Automate workflows with scripting languages (Python, Bash). Ensure robust RESTful API design, cloud storage integration, and serverless computing. Drive innovation and process improvements in an Agile development environment.
Posted 1 month ago
2.0 - 10.0 years
20 - 25 Lacs
Pune
Work from Office
Join us as a Java Full Stack Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Java Full Stack Developer, you should have experience with: Programming - Core Java, Collections, multi-threading and concurrency, OOP concepts, exception handling, JVM concepts, Spring Framework (Spring Boot, Spring Batch, Spring Integration), SQL. Integration - microservice architecture, developing and integrating with RESTful web services, design patterns. UI/UX - Angular, React, HTML/CSS/JS. Some other highly valued skills include: DevOps - monitoring and tooling like ELK and AppDynamics, build and deployment tools, Docker, Kubernetes, load balancer principles, experience working on highly scalable applications. Database and Messaging - SQL (joins, indexing, transactions), NoSQL (Mongo, Cassandra, CAP theorem, etc.), SQL queries, query optimization, etc. Caching - framework concepts, types of caching, principles of caching: priming, eviction, cache miss, consistency/staleness, MRU, etc. (a small illustrative sketch of the cache-aside pattern appears after this listing). Messaging - Kafka, Solace. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. The role is based out of Pune. Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability. Accountabilities: Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Or, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
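As a small illustration of the cache-aside idea named in the listing's caching skills (not part of the listing, and shown in Python rather than the role's Java stack; the Redis location, key naming, and loader function are hypothetical):

```python
# Illustrative only: cache-aside reads with TTL-based eviction.
# A miss loads from the database and primes the cache; the TTL bounds staleness.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_account_from_db(account_id: str) -> dict:
    # Stand-in for the real database read.
    return {"id": account_id, "status": "ACTIVE"}

def get_account(account_id: str, ttl_seconds: int = 300) -> dict:
    key = f"account:{account_id}"
    cached = cache.get(key)
    if cached is not None:                 # cache hit
        return json.loads(cached)
    account = load_account_from_db(account_id)            # cache miss
    cache.setex(key, ttl_seconds, json.dumps(account))    # prime with TTL eviction
    return account
```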
Posted 1 month ago
0.0 years
14 - 19 Lacs
Coimbatore
Work from Office
Solution Architects assess a project's technical feasibility, as well as implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience. - Grade Specific: Managing Solution/Delivery Architect - Design, deliver and manage complete solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Are seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF and/or industry certifications such as TOGAF 9 or equivalent. Skills (competencies): (SDLC) Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (NoSQL Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP) (Cloud Platform), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication
Posted 1 month ago
0.0 years
20 - 25 Lacs
Mumbai
Work from Office
Solution Architects assess a project's technical feasibility, as well as implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience. - Grade Specific: Managing Solution/Delivery Architect - Design, deliver and manage complete solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Are seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF and/or industry certifications such as TOGAF 9 or equivalent. Skills (competencies): (SDLC) Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (NoSQL Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP) (Cloud Platform), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication
Posted 1 month ago
1.0 - 4.0 years
4 - 7 Lacs
Chennai
Work from Office
Proficiency in Java, J2EE, Spring, Spring Boot, and microservices. Experience working with any database (MS SQL, Oracle, MySQL). Must implement projects on AWS Cloud. Good to have: AngularJS or React. Good to have: DevOps knowledge - CI/CD pipelines, Docker, or Kubernetes. Preferred experience with XML, XSLT. Preferred experience with NoSQL, such as Cassandra, Solr, ES, Redis. Preferred experience with message queue technologies. Experience working with source code management (Git). Experience developing software within the Scrum framework.
Posted 1 month ago
4.0 - 8.0 years
4 - 8 Lacs
Nagpur
Work from Office
About Company: At Delaplex, we believe true organizational distinction comes from exceptional products and services. Founded in 2008 by a team of like-minded business enthusiasts, we have grown into a trusted name in technology consulting and supply chain solutions. Our reputation is built on trust, innovation, and the dedication of our people who go the extra mile for our clients. Guided by our core values, we don't just deliver solutions, we create meaningful impact. Responsibilities: Design, develop, and optimize database schemas, stored procedures, functions, and triggers. Write and optimize complex SQL queries. Work with NoSQL databases for specific data storage needs. Perform database performance tuning and optimization. Collaborate with development teams to integrate database solutions into applications. Monitor database performance and troubleshoot issues. Ensure data security and integrity. Skills & Qualifications: Experience: 4 years. Proficiency in SQL (e.g., strong experience with relational databases). Experience with NoSQL databases (e.g., MongoDB, Cassandra, Redis). Understanding of database design principles. Ability to troubleshoot and optimize database performance. Strong problem-solving skills.
Posted 1 month ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Senior Engineer Location: Bengaluru, India (Hybrid) At Reltio, we believe data should fuel business success. Reltio's AI-powered data unification and management capabilities, encompassing entity resolution, multi-domain master data management (MDM), and data products, transform siloed data from disparate sources into unified, trusted, and interoperable data. Reltio Data Cloud delivers interoperable data where and when it's needed, empowering data and analytics leaders with unparalleled business responsiveness. Leading enterprise brands across multiple industries around the globe rely on our award-winning data unification and cloud-native MDM capabilities to improve efficiency, manage risk and drive growth. At Reltio, our values guide everything we do. With an unyielding commitment to prioritizing our Customer First, we strive to ensure their success. We embrace our differences and are Better Together as One Reltio. We are always looking to Simplify and Share our knowledge when we collaborate to remove obstacles for each other. We hold ourselves accountable for our actions and outcomes and strive for excellence. We Own It. Every day, we innovate and evolve, so that today is Always Better Than Yesterday. If you share and embody these values, we invite you to join our team at Reltio and contribute to our mission of excellence. Reltio has earned numerous awards and top rankings for our technology, our culture and our people. Reltio was founded on a distributed workforce and offers flexible work arrangements to help our people manage their personal and professional lives. If you're ready to work on unrivaled technology where your desire to be part of a collaborative team is met with a laser-focused mission to enable digital transformation with connected data, let's talk! Job Summary: Core platform development is spread across multiple cross-functional teams, each building large and complex components of the MDM platform. The engineer plays a senior role in the team and is technically responsible for a particular feature's delivery, which consists of: presenting a solution to architects; technically driving feature delivery, making decisions, and providing development leadership (distributing tasks across one or two regular engineers); working with QA to review testing approaches, provide all details needed for testing, and review test plans; and being responsible for all internal feature documentation. Job Duties and Responsibilities: Deliver features with high quality, and be customer-focused. Champion engineering excellence, raise the bar on code coverage, coding guidelines, and review processes. Provide technical leadership and mentor junior team members. Deal with ambiguity in complex problems, show bias for action. Work closely with architects to actively shape the architecture of your work area and the broader product. Work closely with PM, QA & Doc teams to ensure a thorough testing strategy, robust deployment, and lucid documentation of features. Effective communication with cross-functional teams and global stakeholders. Skills You Must Have: Solid foundation in computer science, with strong competencies in algorithms, data structures, software design, and building large, distributed systems. 5+ years of experience in the design and development of products for enterprise customers. 4+ years of experience in building large-scale distributed data systems in Java and cloud technologies.
Proven track record of building functionalities from conception to delivery, engaging with customers and partners to drive successful adoption and customer satisfaction. Experience in leading projects with a crew of engineers, providing technical mentorship. Experience building services on one of the three major cloud service providers (AWS, Azure or GCP); hands-on experience with managed cloud databases. Skills That Are Nice to Have: Experience working with NoSQL databases like Cassandra, and queue services like SQS. Experience in cloud security principles. Experience with Kubernetes. Experience with big data technologies. Experience in driving customer focus for SaaS products is a big plus. Why Join Reltio? Health & Wellness: Comprehensive group medical insurance, including your parents, with additional top-up options. Accidental insurance. Life insurance. Free online unlimited doctor consultations. An Employee Assistance Program (EAP). Work-Life Balance: 36 annual leave days, which include 18 sick leave days and 18 earned leave days. 26 weeks of maternity leave, 15 days of paternity leave. Unique to Reltio: one week of additional time off as a recharge week every year, globally. Support for home office setup: home office setup allowance. Stay Connected, Work Flexibly: Mobile & internet reimbursement. No need to pack a lunch, we've got you covered with a free meal. And many more.
Posted 1 month ago
6.0 - 11.0 years
18 - 33 Lacs
Bengaluru
Work from Office
Job Description Exp: 6+ Years Location: Bangalore (Manyata Tech Park) TECHNICAL SKILLS Must Have: Apache Kafka, developing scalable microservices, Java, NoSQL, SQL. Nice To Have (at least one is mandatory): Cassandra, MongoDB, PostgreSQL. Job Description: As a Backend Software Engineer, you will be responsible for designing, developing, and maintaining server-side applications. You will collaborate with cross-functional teams to ensure seamless integration of various components and deliver high-performance, scalable solutions. Key Responsibilities: Design, develop, and maintain robust backend systems and APIs. Collaborate with front-end developers, product managers, and other stakeholders to understand requirements and deliver effective solutions. Write clean, maintainable, and efficient code following best practices and coding standards. Conduct thorough testing and debugging to ensure high-quality software. Participate in code reviews to uphold code quality and share knowledge. Stay current with emerging backend technologies and methodologies, incorporating them as appropriate. Troubleshoot and resolve backend-related issues. Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5 to 10 years of experience in backend development. Proficiency in backend languages such as Java, Kotlin, Python. Experience with database technologies like SQL, MySQL, PostgreSQL, or MongoDB. Strong understanding of RESTful APIs and microservices architecture. Familiarity with version control systems, preferably Git. Knowledge of cloud platforms (AWS, Azure, or Google Cloud) and CI/CD pipelines. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills. Preferred Skills: Experience with containerization technologies like Docker and orchestration tools like Kubernetes. Knowledge of serverless architecture. Familiarity with Agile methodologies and practices.
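Purely as an illustration of the Kafka-driven microservice pattern this listing centres on (the listing's services are Java-based; this Python sketch uses the kafka-python client, and the topic, group id, and brokers are hypothetical):

```python
# Illustrative only: a consumer loop that reads JSON events from a Kafka topic.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers=["localhost:9092"],
    group_id="order-projection-service",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Stand-in for real handling: update a projection, call another service, etc.
    print(f"partition={message.partition} offset={message.offset} event={event}")
```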
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Bengaluru
Work from Office
Job Summary Synechron is seeking an experienced Big Data Developer with strong expertise in Spark, Scala, and Python to lead and contribute to large-scale data projects. The role involves designing, developing, and implementing robust data solutions that leverage emerging technologies to enhance business insights and operational efficiency. The successful candidate will play a key role in driving innovation, mentoring team members, and ensuring the delivery of high-quality data products aligned with organizational objectives. Software Requirements Required: Apache Spark (latest stable version) Scala (version 2.12 or higher) Python (version 3.6 or higher) Big Data tools and frameworks supporting Spark and Scala Preferred: Cloud platforms such as AWS, Azure, or GCP for data deployment Data processing or orchestration tools like Kafka, Hadoop, or Airflow Data visualization tools for data insights Overall Responsibilities Lead the development and implementation of data pipelines and solutions using Spark, Scala, and Python Collaborate with business and technology teams to understand data requirements and translate them into scalable solutions Mentor and guide junior team members on best practices in big data development Evaluate and recommend new technologies and tools to improve data processing and quality Stay informed about industry trends and emerging technologies relevant to big data and analytics Ensure timely delivery of data projects with high standards of quality, performance, and security Lead technical reviews, code reviews, and provide inputs to improve overall development standards and practices Contribute to architecture design discussions and assist in establishing data governance standards Technical Skills (By Category) Programming Languages: Essential: Spark (Scala), Python Preferred: Knowledge of Java or other JVM languages Data Management & Databases: Experience with distributed data storage solutions (HDFS, S3, etc.) 
Familiarity with NoSQL databases (e.g., Cassandra, HBase) and relational databases for data integration Cloud Technologies: Preferred: Cloud platforms (AWS, Azure, GCP) for data processing, storage, and deployment Frameworks & Libraries: Spark MLlib, Spark SQL, Spark Streaming Data processing libraries in Python (pandas, PySpark) Development Tools & Methodologies: Version control (Git, Bitbucket) Agile methodologies (Scrum, Kanban) Data pipeline orchestration tools (Apache Airflow, NiFi) Security & Compliance: Understanding of data security best practices and data privacy regulations Experience Requirements 5 to 10 years of hands-on experience in big data development and architecture Proven experience in designing and developing large-scale data pipelines using Spark, Scala, and Python Demonstrated ability to lead technical projects and mentor team members Experience working with cross-functional teams including data analysts, data scientists, and business stakeholders Track record of delivering scalable, efficient, and secure data solutions in complex environments Day-to-Day Activities Develop, test, and optimize scalable data pipelines using Spark, Scala, and Python Collaborate with data engineers, analysts, and stakeholders to gather requirements and translate them into technical solutions Lead code reviews, mentor junior team members, and enforce coding standards Participate in architecture design and recommend best practices in big data development Monitor data workflow performance and troubleshoot issues to ensure data quality and reliability Stay updated with industry trends and evaluate new tools and frameworks for potential implementation Document technical designs, data flows, and implementation procedures Contribute to continuous improvement initiatives to optimize data processing workflows Qualifications Bachelor's or Master's degree in Computer Science, Information Technology, or a related field Relevant certifications in cloud platforms, big data, or programming languages are advantageous Continuous learning on innovative data technologies and frameworks Professional Competencies Strong analytical and problem-solving skills with a focus on scalable data solutions Leadership qualities with the ability to guide and mentor team members Excellent communication skills to articulate technical concepts to diverse audiences Ability to work collaboratively in cross-functional teams and fast-paced environments Adaptability to evolving technologies and industry trends Strong organizational skills for managing multiple projects and priorities
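As a loose sketch of the Spark pipeline work this role describes (not part of the listing; the paths, columns, and aggregation are hypothetical), a minimal batch read-transform-write job might look like:

```python
# Illustrative only: read raw records, aggregate them, and write a curated dataset.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-trades-aggregation").getOrCreate()

trades = spark.read.parquet("s3a://raw-zone/trades/date=2024-01-01/")

daily_summary = (trades
                 .filter(F.col("status") == "SETTLED")
                 .groupBy("desk", "instrument")
                 .agg(F.sum("notional").alias("total_notional"),
                      F.count("*").alias("trade_count")))

(daily_summary
 .write
 .mode("overwrite")
 .partitionBy("desk")
 .parquet("s3a://curated-zone/daily_trade_summary/date=2024-01-01/"))
```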
Posted 1 month ago
8.0 - 12.0 years
17 - 20 Lacs
Bengaluru
Work from Office
We are currently seeking an experienced and visionary Data Architect to join our team. The successful candidate will lead the design and implementation of scalable and innovative data solutions. This role requires collaboration with various experts, including IoT specialists, data scientists, software engineers, and API architects, to develop high-quality data-driven platforms. RESPONSIBILITIES: Define, develop, manage & sustain core components of the data platform, such as: multi-tenant data collection and storage; multi-tenant streaming and data processing; a shared data model, including NoSQL modeling; data management and security components; multi-tenant and customizable analytical dashboards. Develop IT solutions with partners (startups, IT companies, other industrial companies, suppliers, universities, research institutes) to package a Software as a Service (SaaS) offering alongside a technical team of Data/DevOps engineers. Lead the design and architecture deployment of next-generation data solutions within a lean startup pathway, collaborating with data engineers, DevOps, engineering & mobility experts, data scientists, software engineers & HMI designers. Understand customer needs and provide an architecture design to meet these requirements. Own the overall technical vision of the solution regarding scalability, security, performance, reliability, and recovery. Build multi-tenant streaming and data processing capabilities in batch and near-real-time flows. Evaluate the opportunities from emerging technologies. Apply strong testing and quality assurance practices. EDUCATION: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Preferred: Data Science and/or Machine Learning, Cybersecurity. Competencies & Skills: Extensive knowledge of data modeling, analytics, and software architecture (preferably Java and Python). Proven experience in designing and developing scalable solutions with data processing engines for production environments (e.g., Apache Spark). Experience with various data management platforms and cloud technologies (e.g., Apache NiFi, Kubernetes, Docker, Elasticsearch). Experience in designing databases (MySQL, PostgreSQL, Cassandra, MongoDB) and eliminating performance bottlenecks. Knowledge of cloud technologies (Azure, AWS, GCP), Microsoft Fabric, lakehouse architectures, Delta Lake. Knowledge of data science/machine learning or experience designing data pipelines for ML models. Knowledge of network and security: SSL, certificates, IPsec, Active Directory, LDAP. Experience using data governance tools: Collibra, Apache Atlas. Knowledge of the Elastic/ELK stack. Knowledge about machine learning with scikit-learn, R, TensorFlow, or another AI framework or toolkit. Proven experience in deploying and maintaining solutions on cloud and/or on-premise environments. Proven experience in providing technical guidance to teams. Proven experience in managing customer expectations. Proven track record of driving decisions collaboratively, resolving conflicts, and ensuring follow-through. Extensive knowledge of data processing and software development, in a Python or Java/Scala environment. Proven experience in designing stable solutions, testing, and debugging. Demonstrated technical guidance with worldwide teams. Demonstrated teamwork and collaboration in a professional setting. Proven capabilities with worldwide teams. Proficient in English; proficiency in French is a plus. Performance Measurements: On-Time Delivery (OTD); Developments Quality, Cost, and Delivery (QCD).
Posted 1 month ago
8.0 - 10.0 years
12 - 18 Lacs
Noida
Work from Office
Primary Role Function: - Create and maintain optimal data pipeline architecture. - Assemble large, complex data sets that meet functional and non-functional business requirements. - Experience with AWS cloud services: EC2, Glue, RDS, Redshift. - Experience with big data tools: Hadoop, Spark, Kafka, etc. - Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. - Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience with object-oriented/object function scripting languages: Python. - Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. - Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. - Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. - Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. - Work with data and analytics experts to strive for greater functionality in our data systems. - Writes high-quality and well-documented code according to accepted standards based on user requirements. Knowledge: - Thorough in-depth knowledge of design and analysis methodology and application development processes. - Exhibits solid knowledge of databases. - Programming experience with extensive business knowledge. - University degree in Computer Science, Engineering or equivalent industry experience. - Solid understanding of SDLC and QA requirements. Mandatory Competencies: Data on Cloud - AWS S3; Cloud - AWS; Python - Airflow; Python - Python; DevOps - Docker.
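As a loose illustration of the workflow-management tooling this listing names (not part of the listing; assumes Airflow 2.x, and the DAG id, schedule, and task bodies are hypothetical):

```python
# Illustrative only: a two-step Airflow DAG wiring an extract task to a load task.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Stand-in: pull the day's records from a source system into S3.
    print("extracting source data")

def load():
    # Stand-in: copy the transformed data into the warehouse.
    print("loading curated data")

with DAG(
    dag_id="customer_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```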
Posted 1 month ago
2.0 - 5.0 years
3 - 7 Lacs
Mohali
Work from Office
We are seeking to hire brilliant RoR programmers for a lineup of high-scale Blockchain implementations. This is a valuable opportunity for those who are bored of monotonous projects and want to engage with a completely new ecosystem. We believe in self-motivating virtues, and that reflects in all our existing employees. Therefore, the zest for continuous learning should come naturally to you. At Antier, you'll collaborate with and learn from experienced Blockchain engineers and explore a progressive industry of dApps. Since we don't compromise on prerequisite expertise, we expect you to have adequate knowledge as per your years of experience in developing high-performing and secure web apps. As per your years of experience, you should have hands-on workability in: Identifying and resolving performance issues using various caching techniques, query optimizations, etc. JavaScript, jQuery, and the ability to write object-oriented JS (Flight.js will be a plus). HTML and CSS; ability to integrate static HTML/CSS with Rails. RSpec and Cucumber. Designing high-performance APIs & consuming APIs. Integrating full-text search engines like Sphinx, Elasticsearch and Solr. Deploying Rails apps in a multi-host production environment using Capistrano on cloud servers. AWS services like S3, EC2, SES, CloudFront etc. Configuring and using Memcached, messaging queues, and NoSQL databases like MongoDB/Cassandra with web apps. Additionally, you should have: Good understanding of MySQL. Experience of using Git. Understanding of Agile methodologies. Good hold on Ruby metaprogramming. Open source contribution. MVC frameworks.
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Authorize.net makes it simple to accept electronic and credit card payments in person, online or over the phone. We've been working with merchants and small businesses since 1996. As a leading payment gateway, Authorize.net is trusted by more than 445,000 merchants, handling more than 1 billion transactions and USD 149 billion in payments every year. As a Senior Staff Software Engineer at Authorize.net (a Visa solution), you will be a hands-on technical leader, guiding the development of major new features by translating complex business problems into technical solutions that resonate with our merchants and partners. You will also drive cross-team projects that standardize our approach to API development and data schemas, ensuring consistent implementation of best practices across the organization. Beyond features, you will also work on modernization, working across multiple teams to modernize our systems and deliver innovative online payment solutions. You will be instrumental in containerizing applications, splitting monolithic codebases into microservices, and migrating on-premises workloads to the cloud. In addition, you will enable process improvements through robust DevOps practices, incorporating comprehensive release management strategies and optimized CI/CD pipelines. Collaborating with product managers, tech leads, and engineering teams, you will define technology roadmaps, communicate architectural decisions, and mentor engineers in advanced technical approaches. This position requires a solid track record of delivering large-scale, reliable, and secure software solutions. While we prefer C# expertise, knowledge of other modern programming languages is also welcome. Basic Qualifications 15+ years of relevant work experience with a Bachelor's Degree or with an Advanced degree. Advanced-level coding skills in C#, .NET Core, ASP.NET. Java experience is a plus. Solid experience
Posted 1 month ago
5.0 - 7.0 years
4 - 8 Lacs
Pune
Work from Office
We are looking for a skilled PostgreSQL Expert with 5 to 7 years of experience in the field. The ideal candidate should have expertise in GCP Cloud SQL, DB DDL, DML, and production support. This position is located in Pune. Roles and Responsibility: Design, develop, and implement database architectures using PostgreSQL. Develop and maintain databases on GCP Cloud SQL. Ensure high availability and performance of database systems. Troubleshoot and resolve database-related issues. Collaborate with cross-functional teams to identify and prioritize database requirements. Implement data security and access controls. Job Requirements: Strong knowledge of PostgreSQL and GCP Cloud SQL. Experience with DB DDL, DML, and production support. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills. Familiarity with database design principles and best practices.
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Data Software Engineer with 5 to 12 years of experience in Big Data and related technologies. The ideal candidate will have expertise in distributed computing principles, Apache Spark, and hands-on programming with Python. Roles and Responsibility: Design and implement Big Data solutions using Apache Spark and other relevant technologies. Develop and maintain large-scale data processing systems, including stream-processing systems. Collaborate with cross-functional teams to integrate data from multiple sources, such as RDBMS, ERP, and files. Optimize performance of Spark jobs and troubleshoot issues. Lead a team efficiently and contribute to the development of Big Data solutions. Experience with native cloud data services, such as AWS or Azure Databricks. Job Requirements: Expert-level understanding of distributed computing principles and Apache Spark. Hands-on programming experience with Python and proficiency with Hadoop v2, MapReduce, HDFS, and Sqoop. Experience with building stream-processing systems using technologies like Apache Storm or Spark Streaming. Good understanding of Big Data querying tools, such as Hive and Impala. Knowledge of ETL techniques and frameworks, along with experience with NoSQL databases like HBase, Cassandra, and MongoDB. Ability to work in an Agile environment and lead a team efficiently. Strong understanding of SQL queries, joins, stored procedures, and relational schemas. Experience with integrating data from multiple sources, including RDBMS (SQL Server, Oracle), ERP, and files.
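As a loose illustration of the stream-processing requirement above (not part of the listing; assumes the Spark Kafka connector package is available, and the broker, topic, and paths are hypothetical), a minimal Spark Structured Streaming ingest might look like:

```python
# Illustrative only: stream events from Kafka and append them to Parquet files.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka keys/values arrive as bytes; cast to strings before downstream parsing.
parsed = events.selectExpr("CAST(key AS STRING) AS key",
                           "CAST(value AS STRING) AS payload")

query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3a://raw-zone/clickstream/")
         .option("checkpointLocation", "s3a://checkpoints/clickstream/")
         .outputMode("append")
         .start())
query.awaitTermination()
```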
Posted 1 month ago
8.0 - 10.0 years
4 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Core Java Developer with 8 to 10 years of experience. The ideal candidate should have expertise in Core Java, Spring, SQL, Kafka, and AWS.
Roles and Responsibility: Design, develop, and test software applications using Core Java and Spring. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale Java-based systems with high performance and scalability. Troubleshoot and resolve complex technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated on industry trends and emerging technologies to enhance skills and knowledge.
Job Requirements: Strong proficiency in Core Java, Spring, SQL, Kafka, and AWS. Experience with Big Data ecosystems such as Hadoop, Spark, Kafka, Hive, and Cassandra. Familiarity with messaging frameworks like Kafka and RabbitMQ. Knowledge of open-source frameworks like Spring IO, Spring MVC, Spring Hibernate, and Spring Boot. Proficiency with tools like Eclipse, Maven, Gradle, DB tools, and Bitbucket/JIRA/Confluence. Understanding of Agile methodology and experience working with IDEs. Ability to develop SOA services and good knowledge of REST API and microservice architectures. Experience with profiling, code coverage, logging, and other development tools.
Posted 1 month ago
4.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are looking for a skilled Senior .NET Azure Developer with 4 to 8 years of experience. The ideal candidate will have expertise in designing complex software microservices using SOLID principles and appropriate design patterns for the cloud.
Roles and Responsibility: Design and develop large and complex applications from scratch, having contributed to at least two such past assignments. Interface with stakeholders and drive requirements, estimations, and reviews. Work with Agile Scrum teams and bring an agile mindset. Develop data-intensive applications with a focus on performance optimization. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure application security and create test case scenarios.
Job Requirements: Experience with C#/.NET Core, Azure Cloud, and API/GraphQL messaging. Knowledge of Node.js, in-memory caching (e.g., Redis), and databases such as Cosmos DB, MongoDB, SQL Server, or Cassandra. Familiarity with Databricks, Git, GitHub, Terraform, and CI/CD tools like Jenkins. Understanding of containerization technologies like Docker and cloud platforms like Azure. Experience with microservice architecture, SQL Server, Swagger, DevOps, continuous delivery, LINQ, JSON, and XML serialization and deserialization. At least 5 years of experience integrating service endpoints with both UI and backend. Experience working with SPA frontends and data-intensive web applications. Active coding experience and the ability to work within agile teams, taking responsibility for complete delivery of assigned modules/tasks. Experience with Agile tools like Azure DevOps. Experience working in teams with DevSecOps responsibilities.
Posted 1 month ago
4.0 - 5.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Java Developer:
What you'll be responsible for: Perform software design, coding, maintenance, and performance tuning. Understand the use cases and implement them. Develop new modules as well as support existing ones. Interpret business plans for automation requirements. Provide ongoing support for the existing Java project and new development. Create technical documentation and specifications. Ability to plan, organize, coordinate, and multitask. Excellent communication in English (written and verbal) and interpersonal skills.
What you'd have: 4-5 years of experience in developing resilient and scalable distributed systems and microservices architectures. Strong technical background in Core Java, Servlets, XML, and RDBMS. Experience developing REST APIs using Spring Boot (or similar frameworks) and webhooks for async communication. Good understanding of async architecture using queues and message brokers like RabbitMQ, Kafka, etc. Deep insight into Java, garbage collection, and multi-threading. Experience with container platforms like Docker and Kubernetes. Good understanding of how Kubernetes works and experience with EKS, GKE, and AKS. Significant experience with various open-source tools and frameworks like Spring, Hibernate, Apache Camel, Guava Cache, etc. Along with RDBMS, exposure to various NoSQL databases like Mongo, Redis, ClickHouse, and Cassandra. Good analytical skills.
Posted 1 month ago
7.0 - 10.0 years
9 - 12 Lacs
Hyderabad, Gurugram
Work from Office
The Team: The Quality Engineering team works in partnership with other functions in Technology and the business to deliver quality products by providing software testing services and quality assurance that continuously improve our customers' ability to succeed. The team is independent in driving all decisions and is responsible for the architecture, design, and quick turnaround in development of our products with high quality. The team is located globally.
The Impact: You will ensure the quality of our deliverables meets and exceeds the expectations of all stakeholders, and evangelize the established quality standards and processes. Your challenge will be reducing the time to market for products without compromising quality, by leveraging technology and innovation. These products are directly associated with revenue growth and operations enablement. You strive to achieve personal objectives and contribute to the achievement of team objectives by working on problems of varying scope where analysis of situations and/or data requires a review of a variety of factors.
What's in it for you: Do you love working every single day on testing enterprise-scale applications that serve a large customer base with growing demand and usage? Be part of a successful team that delivers top-priority projects directly contributing to the Company's strategy. You will use a wide range of technologies and have the opportunity to interact with different teams internally. You will also get plenty of learning and skill-building opportunities through participation in innovation projects, training, and knowledge sharing. You will have the opportunity to own and drive a project end to end and collaborate with developers, business analysts, and product managers who are experts in their domains, which can help you build multiple skillsets.
Responsibilities: Understand application architecture and system environments (e.g., shared resources, components and services, CPU, memory, storage, network) to troubleshoot production performance issues. Perform scalability and capacity planning. Work with multiple product teams to design, create, execute, and analyze performance tests, and recommend performance tuning. Support remediation of performance bottlenecks in application front-end and database layers. Drive industry best practices in methodologies and standards of performance engineering, quality, and CI/CD processes. Understand user behaviors and analytics models, with experience using Kibana and Google Analytics. Ensure optimally performing production applications by establishing application and transaction SLAs for performance, implementing proactive application monitoring, alarming, and reporting, and ensuring adherence to and measurement against defined SLAs. Analyze, design, and develop performance specifications and scripts based on workflows. Interpret network/system diagrams and performance test results, and identify improvements. Leverage tools and frameworks to develop performance scripts with quality code to simplify testing scenarios. Focus on building efficient solutions for web, services/API, database, and mobile performance testing requirements. Deliver projects in the performance testing space and ensure delivery efficiency. Define testing methodologies and implement tooling best practices for continuous improvement and efficiency. Understand business scenarios in depth to define workload modelling for different scenarios. Complement the architecture community by providing inputs and pursuing the implementations suggested for optimization. Manage testing for highly integrated systems with multiple dependencies and moving parts. Actively cooperate and collaborate with teams at various geographic locations. Provide prompt response and support in resolving critical issues (along with the development team). May require after-hours/weekend work for production implementations.
What we're looking for:
Basic Required Qualifications: Bachelor's or PG degree in Computer Science, Information Systems, or equivalent. 7-10 years of hands-on experience in performance testing/engineering or software development. Strong experience with LoadRunner, JMeter, and tools like DevTools, Fiddler, and various APM platforms (AppDynamics, Dynatrace, Datadog). Proficient in one or more programming languages: Java, C#, Python, .NET.
Key Soft Skills: Analytical mindset with exceptional problem-solving skills. Strong written and verbal communication; able to explain complex technical issues clearly. Ability to work collaboratively across global teams. Passion for quality, performance, and innovation.
Additional Preferred Qualifications: Familiarity with protocols like Web (HTTP/HTML), Ajax TruClient, Citrix, and .NET. Experience with databases (SQL Server, Cassandra, MongoDB, Postgres) and message brokers (Kafka). Working knowledge of cloud platforms (AWS, Azure), Docker, and modern JavaScript frameworks (AngularJS, NodeJS, ReactJS). Experience with CI/CD pipelines and Agile methodology.
Preferred Soft Skills: Flexibility to adapt in a fast-paced, ever-changing environment. Ability to lead and mentor others in performance testing best practices. Enthusiasm for learning and self-development. Strong attention to detail and quality orientation.
Preferred Qualifications: Bachelor's or higher degree in a technology-related field.
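As a rough illustration of the performance-probing work described above (real projects in this role would typically use LoadRunner or JMeter rather than hand-rolled scripts), here is a minimal Python sketch that fires concurrent HTTP requests at an endpoint and reports latency percentiles; the URL and request volumes are placeholders.

```python
# Minimal load-probe sketch: fire N concurrent HTTP requests and report
# latency percentiles and an error count. URL and volumes are placeholders.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com/health"   # hypothetical endpoint
REQUESTS = 100
CONCURRENCY = 10

def timed_call(_):
    """Issue one GET request and return (success, elapsed_ms)."""
    start = time.perf_counter()
    try:
        with urlopen(URL, timeout=10) as resp:
            resp.read()
        ok = True
    except Exception:
        ok = False
    return ok, (time.perf_counter() - start) * 1000

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(timed_call, range(REQUESTS)))

latencies = sorted(ms for _, ms in results)
errors = sum(1 for ok, _ in results if not ok)
p95 = latencies[max(0, int(0.95 * len(latencies)) - 1)]
print(f"p50={statistics.median(latencies):.1f}ms p95={p95:.1f}ms "
      f"errors={errors}/{REQUESTS}")
```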
Posted 1 month ago
0.0 - 2.0 years
1 - 5 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Ascendeum is looking for veterans with extensive hands-on experience in the field of data engineering to build cutting-edge solutions for large-scale data extraction, processing, storage, and retrieval.
About Us: We provide AdTech strategy consulting to leading Internet websites and apps globally, hosting over 200 million monthly worldwide audiences. Since 2015, our team of consultants and engineers has been consistently delivering intelligent solutions that enable enterprise-level websites and apps to maximize their digital advertising returns.
Job Responsibilities: Understand long-term and short-term business requirements to precisely match them with the capabilities of the different distributed storage and computing technologies available in the ecosystem. Create complex data processing pipelines. Design scalable implementations of the models developed by our Data Scientists. Deploy data pipelines to production systems based on CI/CD practices. Create and maintain clear documentation on data models/schemas as well as transformation/validation rules. Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers.
Desired Skills and Experience: 4+ years of overall industry experience building and deploying large-scale data processing pipelines in a production environment. Experience building data pipelines and data-centric applications using distributed storage platforms such as HDFS, S3, and NoSQL databases (HBase, Cassandra, etc.), and distributed processing platforms such as Hadoop, Spark, Hive, Oozie, Airflow, etc. Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based (AWS EMR, Azure HDInsight, Qubole, etc.) Hadoop distributions. Practical experience working with well-known data engineering tools and platforms such as Kafka, Spark, and Hadoop. Solid understanding of data modelling, ML, and AI concepts. Fluent in programming languages like Node.js/Java/Python.
Education: B.E. / B.Tech / M.Tech / MS.
Thank you for your interest in joining Ascendeum.
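To illustrate the pipeline-orchestration work this role describes, here is a minimal Airflow DAG sketch in Python (Airflow 2.4+ style) with placeholder extract/validate/load steps; the DAG id, schedule, and task bodies are illustrative assumptions rather than details from the posting.

```python
# Minimal Airflow DAG sketch: extract, validate, and load steps wired
# together as a daily pipeline. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # e.g., pull raw files from S3/HDFS into a staging area
    print("extracting raw data")

def validate():
    # e.g., enforce schema and data-quality rules, alert on failures
    print("validating staged data")

def load():
    # e.g., write curated tables to the warehouse
    print("loading curated data")

with DAG(
    dag_id="adtech_daily_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> validate_task >> load_task
```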
Posted 1 month ago