Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will collaborate closely with a diverse team of developers, QA engineers, and product owners within a dynamic and innovative environment. Every day will present exciting new challenges that will inspire and drive you to excel. Your responsibilities will include designing and developing software applications to support business objectives, maintaining and enhancing existing applications, analyzing and debugging applications in both development and production environments, and resolving production issues within specified timeframes. You will also participate in design and technical meetings, conduct code reviews and automation, and perform thorough unit/integration testing. Additionally, you will be responsible for writing technical documentation and release notes for the applications and providing technical guidance to junior programmers and other software engineers. To qualify for this role, you should hold a Bachelor's degree or equivalent in Computer Science or a related field and have 5-7 years of proven experience in software development and system maintenance. You should be experienced in developing elegant yet simple systems using best practices and design patterns. Proficiency in technologies such as .NET Core, C#, ASP.NET REST and Web APIs, Angular, TypeScript, the Webpack module loader, NPM, JSON/XML, LINQ, Entity Framework, IoC frameworks, CI/CD, and Redis is essential. Experience working with AWS technologies like Batch, Lambda, S3, and SQS is also required. An excellent understanding of object-oriented design concepts, software development processes, and methods is necessary, as is familiarity with CSS syntax, HTML5 specs, browser differences, and Bootstrap/Material CSS components. Previous experience developing software in a Scrum environment using Agile methodologies is preferred.
It is important to have the ability to work effectively on multiple projects simultaneously, contend with competing priorities, possess strong troubleshooting, code optimization, and refactoring skills, and demonstrate a passion for development and the latest technologies. The role also requires the ability to work independently with minimal supervision, learn and adapt to continuously changing technology, and provide technical guidance to junior programmers and software engineers. Nice-to-have qualifications include experience with frameworks like React and Next.js, UX patterns, CSS pre-processors (SASS, LESS), multi-threaded programming in both procedural and functional paradigms, and client-side optimization techniques. Verisk has been a leading data analytics and technology partner to the global insurance industry for over 50 years, providing value through expertise and scale. As an employee, you will have the opportunity to contribute to a unique and rewarding career, with work flexibility, support, coaching, and training. Verisk is recognized as a Great Place to Work for its outstanding workplace culture and values learning, care, and results. In addition to workplace culture recognitions, Verisk has been acknowledged by The Wall Street Journal and Forbes for its management practices and as a top employer for women, underscoring its commitment to inclusivity and diversity. Verisk is looking for innovative individuals to help translate big data into big ideas, creating a better future for generations to come. Join us and be part of a team that relentlessly pursues innovation and ethical practices. As part of the Verisk team, you will have the opportunity to make a significant impact and shape a better tomorrow.
Verisk Businesses:
- Underwriting Solutions: Provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions: Supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and enhance customer experiences
- Property Estimating Solutions: Offers property estimation software and tools for professionals to estimate all phases of building and repair efficiently
- Extreme Event Solutions: Provides risk modeling solutions to enhance resilience to extreme events for individuals, businesses, and society
- Specialty Business Solutions: Offers an integrated suite of software for managing insurance and reinsurance business efficiently
- Marketing Solutions: Delivers data and insights to improve consumer engagement
- Life Insurance Solutions: Provides core capabilities for carriers, distribution, and direct customers across the policy lifecycle of life and annuities
- Verisk Maplecroft: Provides intelligence on sustainability, resilience, and ESG to strengthen individuals, businesses, and societies

Verisk Analytics is an equal opportunity employer. Apply now for a rewarding and impactful career at Verisk: [Verisk Careers Page](https://www.verisk.com/company/careers/)
Posted 2 days ago
3.0 - 6.0 years
8 - 12 Lacs
Bengaluru
Work from Office
We are seeking an experienced, passionate, and motivated individual to join our team as a Service Reliability Engineer. In this role, you will play a pivotal part in driving the reliability, availability, and performance of our web applications by implementing best practices for system design, software development, automated testing, and continuous integration using industry-standard tools such as GitHub Actions, GitLab CI, or Jenkins. You will also be responsible for monitoring system health, troubleshooting incidents, and proactively identifying areas for improvement. About the Role In this opportunity as a Service Reliability Engineer, you will: Implement best practices for system design, software development, automated testing, and continuous integration using industry-standard tools such as GitHub Actions, GitLab CI, or Jenkins. Collaborate closely with cross-functional teams to understand their requirements and translate them into technical specifications for implementation. Monitor system health, troubleshoot incidents, and proactively identify areas for improvement. Develop automated tests and deployments utilizing programming languages such as Python, Bash, Groovy, or Ruby. Design scalable systems capable of handling high traffic volumes while maintaining optimal performance levels. Utilize cloud platforms like AWS, Azure, or Google Cloud Platform (GCP) to host applications efficiently. Work collaboratively within Agile teams to deliver new features and enhancements according to customer needs. Participate in code reviews to ensure adherence to coding standards, best practices, and security guidelines.
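The "monitor system health" duty above can be made concrete with a small sketch. This is a hedged illustration, not code from any real system: the service names and probe results are invented, and a real monitor would collect samples from Datadog or similar rather than from hard-coded lists.

```python
# Hypothetical sketch of a health monitor: given recent probe results per
# service (True = failed probe), flag services whose error rate breaches a
# threshold. Service names and data are illustrative assumptions.

def error_rate(samples: list[bool]) -> float:
    """Fraction of failed probes; an empty window counts as healthy."""
    return sum(samples) / len(samples) if samples else 0.0

def unhealthy_services(probes: dict[str, list[bool]],
                       threshold: float = 0.1) -> list[str]:
    """Names of services whose recent error rate exceeds the threshold."""
    return sorted(name for name, samples in probes.items()
                  if error_rate(samples) > threshold)

probes = {
    "checkout-api": [False, False, True, True],    # 50% failures
    "search-api":   [False, False, False, False],  # healthy
}
print(unhealthy_services(probes))  # ['checkout-api']
```

The threshold default of 10% is an arbitrary example value; in practice it would come from an SLO.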
About you: You're a fit for the role of Service Reliability Engineer if you have: 3 to 6 years of hands-on experience in AWS Cloud Strong proficiency with AWS tools, including EC2, S3, and VPC A proven track record of working with ServiceNow Extensive experience in application support Skill in utilizing monitoring tools such as Datadog Openness to working in rotational shifts Additional knowledge of Tomcat is a plus What's in it For You Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 3 days ago
8.0 - 13.0 years
12 - 16 Lacs
Bengaluru
Work from Office
We are seeking a visionary Principal DevOps Engineer to lead the transformation of our DevOps strategy: driving innovation, automation, and cloud excellence with a strong focus on AWS technologies. You will spearhead efforts to modernize CI/CD pipelines, enhance cloud security and observability, and lead a DevOps culture that accelerates business growth and technical excellence. Key Responsibilities Architect and evolve scalable, secure AWS infrastructure for performance and efficiency. Lead DevOps engineers to implement cloud-native, fully automated environments. Define and enforce IaC best practices using Terraform, CloudFormation, or CDK. Enable self-service deployments for development teams. Modernize CI/CD pipelines using Jenkins, GitHub Actions, AWS CodePipeline, etc. Champion serverless and event-driven architecture using AWS Lambda and Step Functions. Implement Kubernetes (EKS/ECS) and microservices infrastructure for agility and resilience. Enhance observability using CloudWatch, Prometheus, Grafana, and AI-based tools. Embed DevSecOps into CI/CD and cloud workflows, ensuring a robust security posture. Lead disaster recovery, high availability, and business continuity planning. Optimize AWS spend using FinOps principles without compromising performance. Build and scale AI/ML pipelines using Spark, Kafka, TensorFlow, PyTorch, Kubeflow, MLflow, Airflow, and AWS SageMaker. Research and adopt emerging DevOps and cloud technologies to drive innovation. Required Qualifications 8+ years in DevOps, Cloud Infrastructure, or SRE with proven transformation leadership. 6+ years of AWS hands-on experience (EC2, S3, RDS, Lambda, VPC, IAM, etc.). Expertise in IaC with Terraform, CloudFormation, or CDK. Advanced scripting skills in Python, Bash, or Go. Mastery of Kubernetes (EKS or custom clusters), Docker, and microservices architecture. Extensive CI/CD experience with Jenkins, GitHub Actions, GitLab CI/CD, AWS tools.
Deep knowledge of observability tools (CloudWatch, Prometheus, Grafana, ELK). Strong cloud security, IAM policies, and compliance understanding. Proven leadership in DevOps cultural and technological transformation. Hands-on AI/ML pipeline experience using Spark, Kafka, TensorFlow, Kubeflow, SageMaker, etc. Preferred Qualifications AWS Certified DevOps Engineer Professional or AWS Solutions Architect Professional. Experience building large-scale serverless systems. Strong FinOps and AWS cost management knowledge. Experience with SRE practices and site reliability principles. Familiarity with AI-driven automation and self-healing infrastructure approaches.
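One small building block behind the automation and resilience work described above is retrying transient cloud API failures with exponential backoff. The sketch below is an illustration under stated assumptions: `flaky_call` is a stand-in for a real cloud API call, not an AWS SDK function, and the delays are shortened for demonstration.

```python
import time

# Retry a callable with exponentially growing delays between attempts; a
# common resilience pattern for transient cloud API errors. flaky_call is a
# hypothetical stand-in that fails twice before succeeding.

def with_backoff(fn, retries=4, base_delay=0.01):
    """Call fn, retrying on exception; re-raise after the final attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s...

calls = {"n": 0}
def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(with_backoff(flaky_call))  # ok (succeeds on the third attempt)
```

Production variants usually add jitter to the delay and retry only on error types known to be transient.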
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
This is an exciting opportunity for you to steer your career in a new direction and lead multiple teams to success at a premier financial institution. As the Manager of Software Engineering at JPMorgan Chase in the Consumer and Community Banking sector, you will be responsible for overseeing multiple teams and coordinating day-to-day implementation activities. Your role will involve identifying and escalating issues, ensuring that your teams' work aligns with compliance standards, business requirements, and tactical best practices. You will provide guidance to your team of software engineers on their daily tasks and activities, setting expectations for their output, practices, and collaborative efforts. Anticipating dependencies with other teams to meet business requirements will be crucial, along with managing stakeholder relationships in compliance with standards and business needs. Fostering a culture of diversity, equity, and inclusion within your team and prioritizing diverse representation will also be key aspects of your role. The ideal candidate should have formal training or certification in AWS, Kafka, Java, J2EE concepts, with at least 5 years of applied experience. Experience in leading technology projects, managing technologists, proficiency in automation and continuous delivery methods, and a strong understanding of the Software Development Life Cycle are essential. Advanced knowledge of agile methodologies, financial services industry IT systems, system design, analysis, development, people management, and exposure to Machine Learning and Artificial Intelligence is required. Additionally, practical experience in AWS technologies like MSK, EKS, ECS, S3, Dynamo, and cloud-native applications will be advantageous. Preferred qualifications include hands-on experience working at the code level, a background in Computer Science, Engineering, Mathematics, or related fields, and expertise in various technology disciplines.
If you are ready to take on this challenging and rewarding role, apply now and be a part of shaping the future of technology in the financial sector.
Posted 3 days ago
5.0 - 12.0 years
0 Lacs
maharashtra
On-site
The Senior Quality Engineer will play a crucial role in representing the Albert product and brand with dedication. You will collaborate effectively with the team to ensure the timely delivery of a high-quality product. Supporting the Engineering team in maintaining the quality of deliverables will be a key responsibility. Additionally, you will be instrumental in enhancing automation and setting standards within the quality engineering domain. To excel in this role, you should possess at least 5 years of experience in quality engineering, particularly in creating automated tests for large-scale distributed enterprise systems. Proficiency in utilizing tools such as Mocha, Chai, and Cucumber for writing BDDs is essential. A solid understanding of performance testing tools like k6 and API development methodologies, including reading specs like OpenAPI (Swagger) and GraphQL, is required. Furthermore, familiarity with agile development practices, including tools like GitHub, Jira, and CircleCI, will be advantageous. Knowledge of AWS technologies will also be beneficial for this role.
Posted 6 days ago
4.0 - 9.0 years
7 - 11 Lacs
Pune, Bengaluru
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry. MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage. MAKE AN IMPACT Job Title: Python QA Developer Location: Bangalore/Pune Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs) Shift Time: 12:30 PM - 9:30 PM IST We are looking for a Python QA/Test Developer with 4+ years of experience in Python web development projects and the skill set below. Mandatory 1. Should be good with Python-based web project development using the VS Code IDE 2. Good at Python language syntax and basic unit testing libraries like PyTest or UnitTest. 3. Should be good at writing unit tests and test plans for web-based applications involving server-side Python API services 4. Good at writing unit/integration test cases. 5. Should be good at functional and component testing scenarios. 6.
Good at debugging Python code and able to fix any bugs within it. 7. Good understanding of the QA life cycle, working shoulder-to-shoulder with the developer team. 8. Experience using the Jira tool for test case management and working with Agile methodologies like Scrum/Kanban. 9. Strong analytical and logical reasoning skills Desirable: A. Knows performance testing techniques for Python code and able to fix performance-related issues in web-based projects. B. Knowledge of AWS deployment/test environments on EC2 for Python-based web projects. C. Highly desirable to have experience working with VS Code with the AWS tools plugin and test plugins for Python. D. Recommended to have knowledge of test automation concepts using the Robot/Selenium framework. E. Knows CI/CD concepts. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
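The kind of unit test the posting asks for can be sketched with the stdlib UnitTest library (the posting also mentions PyTest). Everything here is hypothetical for illustration: `validate_user` is a toy server-side helper, not code from any real project.

```python
import unittest

# A minimal unit-test sketch for a server-side Python API helper, using the
# stdlib unittest framework. validate_user is an invented example function.

def validate_user(payload: dict) -> bool:
    """Toy API-side validator: requires a non-empty name and adult age."""
    return bool(payload.get("name")) and payload.get("age", 0) >= 18

class TestValidateUser(unittest.TestCase):
    def test_valid_payload(self):
        self.assertTrue(validate_user({"name": "Asha", "age": 30}))

    def test_missing_name(self):
        self.assertFalse(validate_user({"age": 30}))

    def test_underage(self):
        self.assertFalse(validate_user({"name": "Asha", "age": 17}))

if __name__ == "__main__":
    # exit=False keeps the interpreter alive after the test run.
    unittest.main(argv=["validate-sketch"], exit=False)
```

The same three cases translate directly into PyTest functions with bare `assert` statements.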
Posted 1 week ago
7.0 - 12.0 years
12 - 17 Lacs
Gurugram, Bengaluru
Work from Office
Key Responsibilities Automate deployments utilizing custom templates and modules for customer environments on AWS. Architect AWS environment best practices and deployment methodologies. Create automation tools and processes to improve day-to-day functions. Educate customers on AWS and Rackspace best practices and architecture. Ensure the control, integrity, and accessibility of the cloud environment for the enterprise. Lead Workload/Workforce Management and Optimization related tasks. Mentor and assist Rackers across the Cloud Function. Quality-check development of technical training for all Rackers supporting Rackspace Supported CLOUD Products. Provide technical expertise underpinning communications targeting a range of stakeholders - from individual contributors to leaders across the business. Collaborate with Account Managers and Business Development Consultants to build strong customer relationships. Technical Expertise Experienced in solutioning and implementing greenfield projects leveraging IaaS and PaaS for the primary site and DR. Near-expert knowledge of AWS Products & Services: compute, storage, security, networking, etc. Proficient skills in at least one of the following: Python, Linux, shell scripting. Proficient skills with git and git workflows. Excellent working knowledge of Windows or Linux operating systems, with experience supporting and troubleshooting issues and performance. Highly skilled in Terraform/IaC, including CI/CD practices. Working knowledge of Kubernetes. Experience in designing, building, implementing, analyzing, migrating, and troubleshooting highly available systems. Knowledge of at least one configuration management system such as Chef, Ansible, Puppet, or other such tools. Understanding of services and protocols, configuration, management, and troubleshooting of hosting environments, including web servers, databases, caching, and database services.
Knowledge in the application of current and emerging network software and hardware technology and protocols. Skills Passionate about technology and has a desire to constantly expand technical knowledge. Detail-oriented in documenting information and able to own customer issues through resolution. Able to handle multiple tasks and prioritize work under pressure. Demonstrates sound problem-solving skills coupled with a desire to take on responsibility. Strong written and verbal communication skills, both highly technical and non-technical. Ability to communicate technical issues to non-technical and technical audiences. Education Required Bachelor's degree in Computer Science or equivalent degree. Certifications Requires all three Associate-level AWS certifications or a Professional-level certification. Experience 7+ years of total IT experience
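The "create automation tools" duty above often means small compliance or cleanup scripts. The sketch below is a hedged illustration: the resource dicts stand in for AWS API responses, and the required tag names are invented assumptions, not any Rackspace or AWS convention.

```python
# Hypothetical automation sketch: report cloud resources missing required
# tags so they can be flagged for cleanup. Resource dicts mimic (but are
# not) real boto3 output; tag names are example assumptions.

REQUIRED_TAGS = {"owner", "cost-center"}

def missing_tags(resource: dict) -> set[str]:
    """Return which required tags a resource lacks."""
    return REQUIRED_TAGS - set(resource.get("tags", {}))

def non_compliant(resources: list[dict]) -> list[str]:
    """IDs of resources missing at least one required tag."""
    return [r["id"] for r in resources if missing_tags(r)]

fleet = [
    {"id": "i-001", "tags": {"owner": "sre", "cost-center": "42"}},
    {"id": "i-002", "tags": {"owner": "sre"}},  # missing cost-center
]
print(non_compliant(fleet))  # ['i-002']
```

A real version would page through `describe_instances`-style API results instead of a literal list.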
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Noida, Hyderabad, India
Hybrid
Essential Functions: Executes different phases of solution deployment including establishing connectivity, data onboarding, configuration, testing and go-live preparation activities using in-house tools and third-party applications Adheres to pre-determined service level agreements and drives timely and quality execution of interoperability and database management solutions resulting in a delightful client experience Proactively monitors and addresses data connectivity issues Balances assigned tasks: Client implementation (30%), Client Support (70%) Works with internal stakeholders in Implementation, Client Success and Client Support teams (as a Tier 2 function) Focus on client success with effective verbal and written communication skills Actively participates in an agile environment Performs other related duties as assigned Supervisory Requirements: Individual Contributor Standard Expectations: Complies with organizational policies, procedures, and performance improvement initiatives and maintains organizational and industry policies regarding confidentiality Communicate clearly and effectively in English in both written and verbal forms Develops constructive and cooperative working relationships with others and maintains them over time Encourages and builds mutual trust, respect, and cooperation among team members Maintains regular and predictable attendance Education & Experience Requirements: Bachelor’s Degree in Computer Science, Computer Engineering, or Management Information Systems or equivalent work experience in lieu of degree is required Related experience including: 3+ years of experience in extract, transfer, load, integration role (required) 2+ years of experience with AWS technologies (required) 1+ years of experience in managing large-scale internal, external/client-facing projects (required) 1+ years of experience with one of the following: PERL, Python, or any object-oriented programming languages (required) 1+ years of experience using Postman 
API collections (required) 2+ years of experience with API, HL7-based integrations (preferred) Strong background and previous experience using SQL or other RDBMS type environment (required) Familiarity with utilizing APIs (required) Understanding of GIT Repository functionality (preferred) Familiarity with a Linux Environment (required) Hands-on experience leveraging third-party integration engine tools, including Rhapsody (preferred)
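The extract/transform/load flow this role centers on can be sketched with the stdlib sqlite3 module standing in for a real RDBMS. The table and column names below are invented for illustration, not from any actual onboarding pipeline.

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform by aggregating, load into
# a reporting table. An in-memory SQLite database stands in for the RDBMS.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_visits (patient TEXT, visit_date TEXT)")
conn.executemany("INSERT INTO raw_visits VALUES (?, ?)", [
    ("p1", "2024-01-05"), ("p1", "2024-02-10"), ("p2", "2024-01-07"),
])

# Transform + load: aggregate visits per patient into a reporting table.
conn.execute("CREATE TABLE visit_counts (patient TEXT, visits INTEGER)")
conn.execute("""
    INSERT INTO visit_counts
    SELECT patient, COUNT(*) FROM raw_visits GROUP BY patient
""")

print(conn.execute(
    "SELECT patient, visits FROM visit_counts ORDER BY patient").fetchall())
# [('p1', 2), ('p2', 1)]
```

In a client-facing deployment, the extract step would pull from an HL7 feed or API rather than literal rows, but the shape of the work is the same.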
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have at least 5 years of professional software testing experience, with a focus on test automation, test planning, test case design, and test methodologies. It is essential to have extensive hands-on experience with Selenium WebDriver using Java, Maven, and the TestNG framework. Experience in maintaining testing artefacts and a solid understanding of QA testing tools and environments are also required. Knowledge of JUnit and Cucumber is a plus, as well as experience working with AWS technologies. In this role, you will be responsible for building, testing, and maintaining a test automation framework and test scripts to ensure repeatability, coverage, and reliability. Experience with the Page Object Model (POM) design, JIRA, Git, GitHub, SDLC, and Agile/Scrum methodologies is necessary. Familiarity with integration testing, unit testing, BDD, and functional and regression testing is also expected. Effective communication skills, both verbal and written, are important for this position. Experience in Web Services API testing, both manual using Postman and automated using REST API testing frameworks, would be beneficial. Strong coding skills and a good understanding of the Java programming language are required. Experience in database testing and the ability to understand basic to advanced SQL concepts are essential. To excel in this role, you should possess a strong sense of self-motivation, organization, and attention to detail.
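The Page Object Model design the posting mentions is language-agnostic, so it can be sketched without a browser. This is an illustration only: the posting's stack is Selenium WebDriver in Java, while here a stub driver in Python stands in so the pattern itself (page classes own locators; tests never touch them directly) is runnable.

```python
# POM design sketch with a stub in place of Selenium WebDriver. The locators
# and page class are invented examples of the pattern, not real app code.

class StubDriver:
    """Pretends to be a browser: remembers values 'typed' into elements."""
    def __init__(self):
        self.fields = {}
    def type(self, locator, text):
        self.fields[locator] = text

class LoginPage:
    # Locators live in the page object, so tests stay readable and a UI
    # change only touches this one class.
    USER_FIELD = "#username"
    PASS_FIELD = "#password"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USER_FIELD, user)
        self.driver.type(self.PASS_FIELD, password)

driver = StubDriver()
LoginPage(driver).login("qa_user", "secret")
print(driver.fields)  # {'#username': 'qa_user', '#password': 'secret'}
```

In the Java/TestNG stack the role uses, `StubDriver` would be a real `WebDriver` and the locators `By` objects, but the class layout is the same.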
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Software Engineer - AWS Python at Incedo, you will be responsible for developing and maintaining applications on the Amazon Web Services (AWS) platform. You will be expected to have a strong understanding of Python and AWS technologies, including EC2, S3, RDS, and Lambda. Roles & Responsibilities: Writing high-quality code, participating in code reviews, designing systems of varying complexity and scope, and creating high-quality documents substantiating the architecture. Engaging with clients, understanding their technical requirements, planning and liaising with other team members to develop the technical design and approach to deliver end-to-end solutions. Mentor and guide junior team members, review their code, establish quality gates, build and deploy code using CI/CD pipelines, apply secure coding practices, adopt unit-testing frameworks, provide better coverage, etc. Responsible for the team's growth. Technical Skills: Must Have: Python, FastAPI, Uvicorn, SQLAlchemy, boto3, Lambda serverless, PyMySQL Nice to Have: AWS Lambda, Step Functions, ECR, ECS, S3, SNS, SQS, Docker, CI/CD Proficiency in the Python programming language Experience in developing and deploying applications on AWS Knowledge of serverless computing and AWS Lambda Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks, and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications 4-6 years of work experience in a relevant field B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred
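The serverless/Lambda knowledge the role calls for boils down to the handler pattern: a plain Python function receiving an event dict. The sketch below is a hedged illustration; the event shape mimics the common API Gateway proxy format, but this is not a deployed function and the field contents are invented.

```python
import json

# Sketch of an AWS Lambda handler in the standard (event, context) shape.
# The event mimics an API Gateway proxy request; values are illustrative.

def handler(event, context=None):
    """Return a greeting for the 'name' query parameter (default 'world')."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

resp = handler({"queryStringParameters": {"name": "Incedo"}})
print(resp["statusCode"], resp["body"])  # 200 {"message": "hello, Incedo"}
```

Because the handler is just a function, it unit-tests locally with sample events; FastAPI route handlers in the Must Have list follow a similar request-in/response-out shape.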
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
The Senior Specialist Test Expert in the Data & Analytics Team at Novartis, located in Hyderabad, plays a crucial role in ensuring the quality and reliability of multiple products. Your responsibilities include validating new products, existing product enhancements, and defect fixes, and recommending improvements to enhance the overall user experience. You will collaborate with AWS Analytics Platform team members to define overall test strategies for various products. It will be your responsibility to drive the standard execution of multiple test phases such as system, integration, functional, regression, acceptance, and performance testing. Ensuring all test-related deliverables are of high quality in line with defined GxP and Non-GxP processes is essential. Regular collaboration with eCompliance, Quality Managers, and Security Architects/Managers will be required for reviewing different artifacts for final sign-off. Your role will involve enabling automation with approved tools or in-house developed code-based frameworks as necessary. You will expand the testing scope by including boundary cases, negative cases, edge/corner cases, and integrating testing with the DevOps pipeline using tools like Bitbucket, Jenkins, and Artifactory for seamless test and deployment to higher environments. Additionally, working closely with resources from partner organizations for mentoring, delegation of work, and timely delivery is crucial. To be eligible for this role, you should possess a Bachelor's/Master's degree in Computer Science or a related field and have at least 8 years of experience in quality engineering/testing. Strong expertise in manual and automation testing, particularly with tools like Selenium, Postman, and JMeter, is required. Excellent knowledge of programming languages such as Java and familiarity with Python is preferred. Experience with a broad range of AWS technologies and DevOps tools like Bitbucket, GIT, Jenkins, and Artifactory is necessary.
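Expanding scope to boundary, negative, and edge cases, as described above, can be sketched with stdlib unittest subtests. `clamp_percent` is a hypothetical function under test, invented for this example; the cases deliberately probe the edges of its 0-100 range plus an invalid input type.

```python
import unittest

# Boundary/negative-case sketch using unittest subtests. clamp_percent is
# an illustrative function, not code from any real product.

def clamp_percent(value):
    """Clamp a number into the 0-100 range; reject non-numeric input."""
    if not isinstance(value, (int, float)) or isinstance(value, bool):
        raise TypeError("numeric value required")
    return min(100, max(0, value))

class TestClampBoundaries(unittest.TestCase):
    def test_edges_and_negatives(self):
        # Edge/boundary cases on both sides of each limit.
        cases = [(-1, 0), (0, 0), (100, 100), (101, 100)]
        for given, expected in cases:
            with self.subTest(given=given):
                self.assertEqual(clamp_percent(given), expected)
        # Negative case: wrong input type must be rejected, not clamped.
        with self.assertRaises(TypeError):
            clamp_percent("50")

if __name__ == "__main__":
    unittest.main(argv=["boundary-sketch"], exit=False)
```

Subtests keep each boundary value visible in the failure report, which is what makes this style useful once the case table grows.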
Proficiency in JIRA and Confluence, along with excellent verbal and written communication skills in a global environment, is a must. An open and learning mindset, adaptability to new learning approaches and methods, and the ability to work as part of an Agile development team are also essential. Novartis is dedicated to reimagining medicine to enhance and extend people's lives with a vision to become the most valued and trusted medicines company globally. By joining Novartis, you can be part of a mission-driven organization where associates are empowered to drive ambitions forward. If you are passionate about making a difference in the world of healthcare, consider joining our diverse and inclusive team at Novartis. For more information on Novartis and our commitment to diversity and inclusion, visit https://www.novartis.com/about/strategy/people-and-culture. Additionally, to explore career opportunities and stay connected with Novartis, join the Novartis Network at https://talentnetwork.novartis.com/network. Novartis offers a supportive work environment that values diversity and inclusion, empowering teams to make a positive impact on patients and communities. If you are ready to collaborate, support, and inspire breakthroughs that change lives, consider joining our dedicated team at Novartis. To learn about the benefits and rewards of working at Novartis, read our handbook at https://www.novartis.com/careers/benefits-rewards. If this role does not align with your current career goals but you wish to explore future opportunities at Novartis, sign up for our talent community at https://talentnetwork.novartis.com/network.,
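The boundary, negative, and edge/corner cases the role calls for can be illustrated with a small, self-contained Python sketch (the `validate_age` function is hypothetical, invented only for illustration and not from any Novartis codebase):

```python
# Hypothetical validator used only to illustrate boundary/negative/edge-case testing.
def validate_age(value):
    """Accept integer ages in the inclusive range 0-120; reject everything else."""
    if not isinstance(value, int) or isinstance(value, bool):
        return False
    return 0 <= value <= 120

# Boundary cases: values exactly at the edges of the valid range.
assert validate_age(0) is True
assert validate_age(120) is True

# Negative cases: values just outside the range, or of the wrong type.
assert validate_age(-1) is False
assert validate_age(121) is False
assert validate_age("42") is False

# Edge/corner cases: inputs that often slip through naive checks.
assert validate_age(True) is False   # bool is a subclass of int in Python
assert validate_age(None) is False
```

The same three-way split (boundary, negative, edge) scales up to API and UI tests driven by Selenium or Postman collections; the point is that the valid-range edges and type confusions are enumerated deliberately rather than discovered in production.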
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru, Karnataka
Work from Office
Data Governance: this team will be the central data governance team for Kotak Bank, managing the Metadata, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer: Bachelor's degree in Computer Science, Engineering, or a related field; 3-5 years of experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills.
PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
For managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior work experience executing large-scale data engineering projects.
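The extract-transform-load responsibility above can be sketched in miniature with Python's built-in sqlite3 standing in for a warehouse such as Redshift (the table names and figures are invented for illustration; a real pipeline would reach managed storage through AWS connectors and Glue/EMR jobs):

```python
import sqlite3

# In-memory database standing in for the warehouse; in practice this would be
# Redshift or Glue-managed storage reached via an AWS connector.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_txns (account TEXT, amount REAL)")

# Extract: rows as they might arrive from a source system.
conn.executemany(
    "INSERT INTO raw_txns VALUES (?, ?)",
    [("A001", 250.0), ("A001", -40.0), ("A002", 99.5)],
)

# Transform + Load: aggregate per account into a curated table using SQL.
conn.execute(
    """CREATE TABLE account_totals AS
       SELECT account, SUM(amount) AS total
       FROM raw_txns GROUP BY account"""
)

totals = dict(conn.execute("SELECT account, total FROM account_totals ORDER BY account"))
print(totals)  # {'A001': 210.0, 'A002': 99.5}
```

The shape is the same at scale: extract from source systems, express the transform in SQL (or Spark SQL), and load the curated result where downstream consumers can query it.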
Posted 2 weeks ago
2.0 - 5.0 years
30 - 32 Lacs
Bengaluru
Work from Office
Data Engineer - 2 (Experience: 2-5 years)
What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals. Data Platform: this vertical is responsible for building the data platform, which includes optimized storage for the entire bank; building a centralized data lake, managed compute, and orchestration frameworks, including serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions such as EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: this team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases. Data Governance: this team will be the central data governance team for Kotak Bank, managing the Metadata, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills.
PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
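The "config-based and programmatic" data-model approach the posting mentions can be sketched as a small Python pattern in which each dataset's pipeline is declared as configuration and executed by a generic runner (all dataset names and steps here are hypothetical):

```python
# Each pipeline is declared as configuration; a generic runner executes it.
PIPELINE_CONFIG = {
    "daily_balances": {
        "steps": ["drop_nulls", "uppercase_values"],
    },
}

# Registry of reusable transform steps, shared across all configured pipelines.
STEPS = {
    "drop_nulls": lambda rows: [r for r in rows if None not in r.values()],
    "uppercase_values": lambda rows: [
        {k: str(v).upper() for k, v in r.items()} for r in rows
    ],
}

def run_pipeline(name, rows):
    """Apply the configured steps for `name` to `rows`, in order."""
    for step in PIPELINE_CONFIG[name]["steps"]:
        rows = STEPS[step](rows)
    return rows

raw = [{"acct": "a1", "city": "pune"}, {"acct": None, "city": "delhi"}]
print(run_pipeline("daily_balances", raw))  # [{'acct': 'A1', 'city': 'PUNE'}]
```

Adding a new dataset then means adding a config entry rather than writing a new pipeline from scratch, which is what makes the approach tractable across thousands of datasets.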
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You are a highly skilled Software Engineer with expertise in full-stack development, including Python, Django or FastAPI, database management, API development, JavaScript (React), TypeScript, and HTML. Your role at Nasuni involves developing and maintaining hybrid cloud enterprise software for remote file access and collaboration for global customers.
Responsibilities:
- Design, implement, and test new features
- Conduct performance testing of backend API services to ensure scalability and user experience
- Maintain and enhance existing software components
- Respond to customer incidents, perform root cause analysis, and implement preventative measures
- Work with AWS technologies such as EC2, Aurora, ElastiCache, API Gateway, and Lambda
- Collaborate with UI/UX/Product and system test engineers to deliver goals
- Follow development processes and quality guidelines
Technical Skills Required:
- Strong knowledge of front-end or backend code development
- Proficiency in Python 3, Python FastAPI, JavaScript (React), TypeScript
- Good knowledge of Linux, Git (GitHub), Docker, Jenkins, and Postgres or MySQL databases
- Understanding of CI/CD pipeline building
- Exposure to cloud services, especially AWS
- Problem-solving and troubleshooting skills
- Experience in agile development environments
Experience:
- BE/B.Tech or ME/M.Tech in computer science or related fields
- 3 to 6 years of industry experience, with at least 2+ years in full-stack development
Why Work at Nasuni, Hyderabad: Nasuni offers competitive benefits including:
- Competitive compensation programs
- Flexible time off and leave policies
- Comprehensive health and wellness coverage
- Hybrid and flexible work arrangements
- Employee referral and recognition programs
- Professional development and learning support
- Inclusive, collaborative team culture
- Modern office spaces with team events and perks
- Retirement and statutory benefits as per Indian regulations
Note: Nasuni does not accept agency resumes. Please do not forward resumes to our job boards or employees. Nasuni is not responsible for any fees related to unsolicited resumes.
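The backend performance-testing responsibility above can be illustrated with a stdlib-only Python harness that measures latency percentiles of a callable (the target here is a stand-in function; in practice it would wrap calls to real API endpoints):

```python
import statistics
import time

def call_endpoint():
    """Stand-in for an HTTP call to a backend API; simulates ~1 ms of work."""
    time.sleep(0.001)
    return 200

def measure_latency(fn, runs=50):
    """Invoke `fn` repeatedly and return p50 and p95 latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return {
        "p50_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
    }

result = measure_latency(call_endpoint)
print(result)  # p50/p95 in milliseconds; exact values vary by machine
```

Tracking percentiles rather than averages is the usual choice here, because a scalable API is judged by its tail latency, not by its typical request.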
Posted 2 weeks ago
3.0 - 5.0 years
30 - 32 Lacs
India, Bengaluru
Work from Office
Job Title: Data Engineer (DE) / SDE - Data. Location: Bangalore. Experience range: 3-15 years.
What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals. Data Platform: this vertical is responsible for building the data platform, which includes optimized storage for the entire bank; building a centralized data lake, managed compute, and orchestration frameworks, including serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions such as EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: this team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases. Data Governance: this team will be the central data governance team for Kotak Bank, managing the Metadata, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; 3-5 years of experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills.
PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
For managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior work experience executing large-scale data engineering projects.
Posted 2 weeks ago
12.0 - 17.0 years
40 - 45 Lacs
Noida, India
Work from Office
The AWS Solution Architect is a customer-facing role, partnering with the sales and delivery teams to engage with customers throughout the sales process to close new business. Very good knowledge of cloud application migrations (re-platform, re-factor, and re-write) is required, along with deep knowledge of AWS services, in order to act as a trusted advisor to the customer and the delivery team, win their confidence, and close new business. The role requires a dual combination of technology and solution skills: technology skills to educate the customer on the best practices, risks, challenges, and features of the cloud and win their confidence, and solution skills to understand the customer's requirements, expectations, and business drivers and put together the right solution. This role will also work very closely with the delivery team to train and guide them on the best practices of cloud delivery. Providing governance by tracking key delivery metrics and advising delivery leaders on early indicators of potential issues is also required. The success criteria for this role will be the incremental business generated with existing and new customers by working jointly with the sales and delivery teams, along with competency creation. Essential Duties and Responsibilities: Responsible for delivering cloud-native applications or migrations at scale for large customers. Responsible for working very closely with the delivery team to train and guide them on the best practices of cloud delivery. Providing governance by tracking key delivery metrics and advising delivery leaders on early indicators of potential issues. Handle objections and questions that come up during the sales process. Handle typical challenges in cloud adoption around compliance, regulations, and performance. Understand cloud design patterns and their pros and cons. Candidate Profile: AWS Architect with 12+ years of experience in designing, deploying, and implementing cloud technologies. The candidate should have excellent knowledge of AWS tools used in
migration and development. Hands-on knowledge of at least one programming language, such as Java, .NET, or Python. Good knowledge of AWS tools used in migration and development. Experience with application modernization to the cloud, including at least one full lifecycle. The candidate must be an AWS Certified Architect. Hands-on experience with the cloud design patterns, frameworks, and technologies used. Focused on estimation approaches and methodologies. The candidate must possess deep expertise in value-proposition articulation and messaging. Nice to Have: BFSI domain expertise would be an added advantage. Exposure to working with clients in the North American region would be an added advantage.
Posted 2 weeks ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
The Impact you will have in this role: The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, designing, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems. Your Primary Responsibilities: Lead technical processes and designs considering reliability, data integrity, maintainability, reuse, extensibility, usability, and scalability. Review the development team's code to ensure quality and adherence to best practices and standards. Mentor junior developers to develop their skills and build strong talent. Collaborate with Infrastructure partners to identify and deploy optimal hosting environments. Define scalability and performance criteria for assigned applications. Ensure applications meet performance, privacy, and security requirements. Verify test plans to ensure compliance with performance and security requirements. Support business and technical presentations in relation to technology platforms and business solutions. Mitigate risk by following established procedures and monitoring controls.
Help develop solutions that balance cost and delivery while meeting business requirements. Implement technology-specific best practices that are consistent with corporate standards. Partner with multi-functional teams to ensure the success of product strategy and project deliverables. Manage the software development process. Drive new technical and business process improvements. Estimate total costs of modules/projects, covering both hours and expense. Research and evaluate specific technologies and applications, and contribute to the solution design. Construct application architecture encompassing end-to-end designs. Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior. Qualifications: Minimum of 7+ years of related experience. Bachelor's degree preferred, or equivalent experience. Talents Needed for Success: 7+ years of strong front-end experience with jQuery and JavaScript. 7+ years of active development experience/expertise in Java/J2EE-based applications, with proven ability with Hibernate, Spring, and Spring MVC. Experience in web-based UI development. Experience with CSS, HTML, JavaScript, and similar UI frameworks (jQuery, React). Familiarity with microservices-based architecture and distributed systems. Hands-on experience with AI tools such as Amazon Q is a plus. Ability to develop and work with REST APIs using the Spring Boot framework. Hands-on experience with AWS technologies; Snowflake is a plus. Strong database and PL/SQL skills (Oracle, Postgres preferred). Experience with messaging, ETL, or reporting tools is a plus. Knowledge of Python is a plus. Familiarity with Agile development methodology. Collaborate with multiple stakeholders such as product management, application development, DevOps, and other technical groups.
Posted 2 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role: Data Engineer - 1 (Experience: 0-2 years)
What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies.
The data platform org is divided into 3 key verticals. Data Platform: this vertical is responsible for building the data platform, which includes optimized storage for the entire bank; building a centralized data lake, managed compute, and orchestration frameworks, including serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions such as EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: this team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases. Data Governance: this team will be the central data governance team for Kotak Bank, managing the Metadata, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills.
PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
Posted 2 weeks ago
9.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About our team DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are: software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies.
The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and will power all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
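The "config-based and programmatic" data modelling mentioned above usually means pipeline steps are declared as data and executed by a generic runner, so adding a dataset is a config change rather than new code. A minimal standard-library sketch of that idea (step names and transforms are hypothetical, not Kotak's actual framework):

```python
# Config-driven pipeline sketch: steps are declared as data, and a generic
# runner looks each step up by name and applies it in order.
from typing import Callable, Dict, List

TRANSFORMS: Dict[str, Callable[[List[dict]], List[dict]]] = {
    "drop_nulls": lambda rows: [r for r in rows if all(v is not None for v in r.values())],
    "uppercase_branch": lambda rows: [{**r, "branch": r["branch"].upper()} for r in rows],
}

# The pipeline itself is just configuration, e.g. loaded from JSON/YAML in practice.
PIPELINE_CONFIG = ["drop_nulls", "uppercase_branch"]

def run_pipeline(rows: List[dict], config: List[str]) -> List[dict]:
    for step in config:
        rows = TRANSFORMS[step](rows)  # dispatch by declared step name
    return rows

data = [{"branch": "blr", "amount": 100}, {"branch": "mum", "amount": None}]
result = run_pipeline(data, PIPELINE_CONFIG)
print(result)  # [{'branch': 'BLR', 'amount': 100}]
```

The same pattern scales to Spark: the config names datasets, sources, and transformations, and the runner translates them into DataFrame operations.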
Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian Banking segment and/or Fintech is desired. Experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering and best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
Posted 2 weeks ago
15.0 - 19.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Cloud Architect - AVP, you will be instrumental in defining and executing our AWS cloud strategy to ensure the effective deployment and administration of AWS cloud solutions. Your role will involve leading a team of AWS cloud engineers and architects, collaborating with diverse stakeholders, and utilizing your extensive expertise to promote AWS cloud adoption and innovation throughout the organization. Your primary responsibilities will include: formulating and executing the company's AWS cloud strategy in alignment with business objectives; overseeing the design, architecture, and deployment of AWS cloud solutions with a focus on scalability, security, and reliability; collaborating with various teams to seamlessly integrate AWS services; evaluating and selecting appropriate AWS services and technologies; managing the migration of on-premises applications and infrastructure to AWS; establishing and enforcing AWS cloud governance, security policies, and best practices; providing technical leadership and guidance to the AWS cloud team to promote innovation and continuous enhancement; staying abreast of the latest AWS technologies and industry trends to incorporate relevant advancements into the AWS cloud strategy; and effectively communicating AWS cloud strategy, progress, and challenges to senior leadership and stakeholders. To qualify for this role, you should possess a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with a minimum of 15 years of IT experience, with at least 10 years dedicated to cloud architecture and implementation, particularly with AWS.
Additionally, you should have experience with AWS cloud services; compliance and governance frameworks such as SOC 2, ITIL, PCI-DSS, SSAE 16, ISO 27001, COBIT, and/or HITRUST; cloud-native architectures; leading large-scale AWS cloud transformation projects; AWS cloud security, governance, and compliance; infrastructure as code (IaC) and automation tools such as AWS CloudFormation and Terraform; and networking, storage, databases, and application development in AWS. You should also bring exceptional problem-solving abilities, innovative design skills for AWS cloud solutions, strong leadership and communication capabilities, and a track record of managing and mentoring teams effectively. Preferred qualifications include being an AWS Certified Solutions Architect - Professional, experience with multi-cloud and hybrid cloud environments, familiarity with DevOps practices and tools like AWS CodePipeline and Jenkins, and knowledge of emerging technologies such as AI, ML, and IoT in relation to AWS cloud computing.
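As a concrete illustration of the infrastructure-as-code practice mentioned above, an AWS CloudFormation template declares resources as data so that environments can be versioned and reproduced. The following is a generic minimal sketch (one encrypted, versioned S3 bucket), not a template from this organization:

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Minimal IaC sketch - one encrypted, versioned S3 bucket.
Resources:
  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: AES256
      VersioningConfiguration:
        Status: Enabled
```

The same resource could equivalently be expressed in Terraform; the governance benefit in either tool is that security policies (encryption, versioning) are enforced in reviewable code rather than configured by hand.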
Posted 2 weeks ago
8.0 - 17.0 years
0 Lacs
maharashtra
On-site
As the Head of Engineering at Enago, you will be leading a team of talented web developers to ensure high-quality end-to-end delivery of AI-powered tools and services aimed at boosting the productivity of researchers and professionals. You will work closely with various technical roles within the organization, such as the Director Engineer, Technical Project Manager, Solution Architect, Principal Engineers, and Senior DevOps, to maintain a flexible and innovative product while keeping technical debt at bay. Your primary responsibilities will include reviewing solution architecture, ensuring best practices in the engineering development lifecycle, and evaluating the performance of key technical team members. The ideal candidate for this role should possess a minimum of 8 years of enterprise backend or full-stack web development experience, with expertise in technologies such as VueJS, AngularJS, NodeJS, Java, Python Django, and AWS Serverless. Additionally, a strong background in solution architecture (10+ years) and engineering management (6+ years) is essential. You should excel in understanding business goals, implementing test-driven development practices, and designing optimized scalable solutions. Your ability to break down complex problems into manageable tasks, conduct code reviews, estimate project efforts accurately, and communicate effectively within the team will be crucial for success in this role. Moreover, you should have a proven track record of technical leadership, solution architecting, and robust development experience with a focus on backend technologies, database management, AWS services, and developer tooling. Your familiarity with HTML5, CSS3, CSS processors, and CSS frameworks, along with a deep understanding of testing, monitoring, and observability practices, will be highly valued. Experience with Elasticsearch server cluster optimization and Apache Spark/Ray will be considered an advantage. 
In summary, the role of Head of Engineering at Enago offers a unique opportunity to lead a talented team in revolutionizing research-intensive projects through innovative AI-powered solutions. If you are passionate about leveraging technology to make a positive impact on the world and possess the required technical expertise and leadership skills, we encourage you to apply and be a part of our mission to enhance knowledge discovery, creation, and dissemination through cutting-edge AI technologies. For more information about our products and company, please visit our websites: - Trinka: http://www.trinka.ai - RAx: http://raxter.io - Enago: http://www.enago.com
Posted 2 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -1 (Experience 0-2 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and create one of the best-in-class data lakehouse solutions.
The primary skills this team should encompass are: software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs.
This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and will power all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer/SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian Banking segment and/or Fintech is desired. Experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering and best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
Posted 3 weeks ago
8.0 - 13.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Educational Bachelor of Engineering Service Line Strategic Technology Group Responsibilities Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding. Responsibilities include: end-to-end contribution to technology-oriented development projects; providing solutions with minimum system requirements and in Agile mode; collaborating with Power Programmers, the Open Source community, and tech user groups; and custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies on cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to Open Source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: MERN Stack, React & Node, AWS, designing skills. Preferred Skills: Technology-Cloud Platform-AWS, Database-AWS, Technology-Reactive Programming-NodeJS, Technology-Reactive Programming-React JS, Technology-Full stack-MERN stack
Posted 3 weeks ago
4.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Educational Bachelor of Engineering Service Line Strategic Technology Group Responsibilities Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding. Responsibilities include: end-to-end contribution to technology-oriented development projects; providing solutions with minimum system requirements and in Agile mode; collaborating with Power Programmers, the Open Source community, and tech user groups; and custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies on cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to Open Source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: Python, Lambda, AWS, Airflow. Preferred Skills: Technology-Cloud Platform-AWS, Database-AWS, Technology-OpenSystem-Python - OpenSystem-Python, Technology-Machine Learning-Python
Posted 3 weeks ago
8.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
At Freedom Mortgage Corporation, we are working with leading technologies that include Java/J2EE, Spring/Spring Boot, and AWS technologies such as Elastic Beanstalk, Lambda, S3, and SQS to help transform our business. We are searching for Software Engineers to join our progressive information technology team. This is an opportunity to develop your skills in leading-edge technologies. Freedom Mortgage Corporation is one of America's largest privately held mortgage companies, whose mission is to help enable our customers to realize the American dream of home ownership. If you are a person who enjoys the excitement of the financial services industry, where information IS our business, and you enjoy developing software solutions, then this position is for you. Essential Duties and Responsibilities: Primary responsibilities are focused on the design, development, and testing of Spring/Spring Boot and AWS Elastic Beanstalk/Lambda/S3 based solutions. Utilize your collaborative skills to work with business partners and IT managers/staff to ensure high reliability, availability, and performance of applications. Improve the platform to take on additional load and address recommendations on fixes or backlog. Leverage your analytical skills to create technical solutions for business processes. Required Competencies: Strong written and verbal communication skills. 8+ years of experience with end-to-end design of Java/J2EE applications. 8+ years of experience with Spring/Spring Boot, XML, Hibernate, JSP, and SOAP/REST/microservices. 4+ years of experience with React. 1+ years of experience developing applications using AWS technologies such as S3, SQS, EC2, Kubernetes, Docker, and Lambda functions. Strong work experience with developing and deploying containerized applications (Docker, ECR, ECS, and Fargate).
Experience in HTML, CSS & JS is a plus. Experience with Swagger or similar tools for API import into AWS API Gateway. Experience with CloudWatch is a plus. Experience working with relational databases such as Oracle. Proficiency in SQL. Experience with microservice frameworks and design patterns. Enjoy contributing in a team environment. Possess a positive, can-do attitude and enjoy making a difference in the business through your technical contributions. Extensive technical experience in designing and developing complex applications, and in managing the design life cycle in an Internet enterprise environment. Desired Competencies: CloudTrail experience. Experience with rules-based engines such as JRules or Drools is a plus. Experience with the Mulesoft Platform is a plus. Mortgage Industry experience. Educational Requirements: Position requires a Bachelor of Science or Bachelor of Arts degree in Computer Technology, Informatics, Computer Science, Engineering, or Business, or equivalent experience.
Posted 3 weeks ago