5.0 - 8.0 years
7 - 10 Lacs
Noida
Work from Office
Full-stack developer with 5-8 years of experience in designing and developing robust, scalable, and maintainable applications applying Object-Oriented Design principles. Strong experience in Spring frameworks like Spring Boot, Spring Batch, Spring Data etc. and Hibernate, JPA. Strong experience in microservices architecture and implementation. Strong knowledge of HTML, CSS, JavaScript, and React. Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API. Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS). Good experience with AWS services - S3, Lambda, SQS, SNS, DynamoDB, IAM, API Gateway. Hands-on experience in SQL and PL/SQL, with the ability to write complex queries. Hands-on experience with REST APIs. Experience with version control systems (e.g., Git). Knowledge of web standards and accessibility guidelines. Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, SONAR etc. Must have strong analytical and problem-solving abilities. Good experience in JUnit testing and mocking techniques. Experience in SDLC processes (Waterfall/Agile), Docker, Git, SonarQube. Excellent communication and interpersonal skills; ability to work independently and as part of a team.
Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Programming Language - Java Full Stack - HTML/CSS; Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; Programming Language - Java - Spring Framework; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; DevOps/Configuration Mgmt - Git; Programming Language - Java Full Stack - JavaScript; DevOps/Configuration Mgmt - Docker; Beh - Communication and collaboration; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Database - Oracle - PL/SQL Packages; Database - SQL Server - SQL Packages; Development Tools and Management - CI/CD; User Interface - Other User Interfaces - React; Programming Language - Java Full Stack - Spring Framework; Middleware - Java Middleware - Spring Boot; Middleware - API Middleware - Microservices; Middleware - API Middleware - Web Services (REST, SOAP); Middleware - API Middleware - API (SOAP, REST); Agile - Agile - SCRUM
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You have hands-on experience in AWS Cloud Java development and are an expert in implementing AWS services like EC2, VPC, S3, Lambda, Route53, RDS, Dynamo DB, ELB/ALB/NLB, ECS, SNS, SQS, CloudWatch, API Gateway etc. You also have knowledge on EFS / S3 for File storage and Cognito for authorization. Additionally, you have strong knowledge of Containerization and have worked on AWS ECS/ECR. You are proficient in inter-service communication through REST, gRPC, or messaging (SQS, Kafka). You have knowledge of writing Unit Test cases with JUNIT and strong notions of security best practices. Your expertise extends to AWS CDK and CDK Pipelines for IaC. You are capable of implementing service discovery, load balancing, and circuit breaker patterns. You have an understanding of logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Experience with CI/CD tools, DevOps implementation, and HA/DR setup is part of your skill set. You possess excellent communication and collaboration skills. Your responsibilities include hands-on experience with technologies like Java, Spring Boot, Rest API, JPA, Kubernetes, Messaging Systems, Tomcat/JBoss. You develop and maintain microservices using Java, Spring Boot, and Spring Cloud. You design RESTful APIs with clear contracts and efficient data exchange. Ensuring cloud security and compliance with industry standards is part of your routine. You maintain cloud infrastructure using AWS Cloud Development Kit (CDK) and implement security best practices, including data encryption and adherence to security protocols. Qualifications required for this role include 6+ years of hands-on experience in AWS Cloud Java development, expertise in implementing AWS services, strong knowledge of Containerization, and inter-service communication skills. You must have a solid understanding of security best practices, knowledge on File storage and authorization, and writing Unit Test cases. Familiarity with serverless approaches using AWS Lambda is also essential. Nice to have qualifications include proven expertise in AWS CDK and CDK Pipelines, implementing service discovery, load balancing, and circuit breaker patterns, familiarity with logging and monitoring services, experience with CI/CD tools, DevOps implementation, and HA/DR setup. Excellent communication and collaboration skills are a bonus to work effectively in a team-oriented environment. About Virtusa: Virtusa embodies values such as teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team of 27,000 people who care about your growth. You will have the opportunity to work on exciting projects, utilize state-of-the-art technologies, and advance your career with us. At Virtusa, great minds come together to nurture new ideas and foster excellence in a collaborative team environment.,
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As an Integration Technical Specialist at Nasdaq Technology in Bangalore, India, you will be a key member of the Enterprise Solutions team. Nasdaq is a dynamic organization that constantly adapts to market changes and embraces new technologies to develop innovative solutions, aiming to shape the future of financial markets. In this role, you will be involved in delivering complex technical systems to customers, exploring new technologies in the FinTech industry, and driving central initiatives across Nasdaq's technology portfolio. Your responsibilities will include collaborating with global teams to deliver solutions and services, interacting with internal customers, designing integrations with internal and third-party systems, performing end-to-end testing, participating in the software development process, and ensuring the quality of your work. You will work closely with experienced team members in Bangalore and collaborate with Nasdaq teams in other countries. To be successful in this role, you should have 10 to 13 years of integration development experience, expertise in web services like REST and SOAP API programming, familiarity with Informatica Cloud and ETL processes, a strong understanding of AWS services such as S3, Lambda, and Glue, and a Bachelor's or Master's degree in computer science or a related field. Additionally, proficiency in Workday Integration tools, knowledge of finance organization processes, and experience in multinational companies are desirable. At Nasdaq, you will be part of a vibrant and entrepreneurial environment that encourages initiative, challenges the status quo, and values authenticity. The company promotes a culture of connection, support, and empowerment, with a hybrid work model that prioritizes work-life balance and well-being. Benefits include an annual bonus, stock ownership opportunities, health insurance, a flexible working schedule, a mentorship program, and access to online learning resources. If you are a passionate professional with a drive to deliver top technology solutions and thrive in a collaborative, innovative environment, we encourage you to apply in English at your earliest convenience. Nasdaq is committed to providing reasonable accommodations for individuals with disabilities during the application and interview process. Come as you are and join us in shaping the future of financial markets at Nasdaq.,
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
Seekify Global is looking for an experienced and motivated Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a significant background in designing and implementing metadata and data catalog solutions within AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer at Seekify Global, you will play a crucial role in improving data discoverability, governance, and lineage across our enterprise data assets. Your responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for structured and unstructured data assets, and integrating the data catalog with various AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. You will collaborate closely with data Governance/BPRG/IT projects teams to define metadata standards, data classifications, and stewardship processes. Additionally, you will be responsible for developing automation scripts for catalog ingestion, lineage tracking, and metadata updates using tools like Python, Lambda, Pyspark, or Glue/EMR custom jobs. Working in coordination with data engineers, data architects, and analysts, you will ensure that metadata is accurate, relevant, and up to date. Implementing role-based access controls and ensuring compliance with data privacy and regulatory standards will also be part of your role. Moreover, you will be expected to create detailed documentation and conduct training/workshops for internal stakeholders on effectively utilizing the data catalog. **Key Responsibilities:** - Lead end-to-end implementation of a data cataloging solution within AWS, preferably AWS Glue Data Catalog or third-party tools like Apache Atlas, Alation, Collibra, etc. - Establish and manage metadata frameworks for structured and unstructured data assets in data lake and data warehouse environments. - Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. - Collaborate with data Governance/BPRG/IT projects teams to define metadata standards, data classifications, and stewardship processes. - Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, Pyspark, or Glue/EMR custom jobs. - Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date. - Implement role-based access controls and ensure compliance with data privacy and regulatory standards. **Required Skills and Qualifications:** - 7-8 years of experience in data engineering or metadata management roles. - Proven expertise in implementing and managing data catalog solutions within AWS environments. - Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation. - Hands-on experience with metadata ingestion, data lineage, and classification processes. - Proficiency in Python, SQL, and automation scripting for metadata pipelines. - Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines). - Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus. - Strong communication, problem-solving, and stakeholder management skills. **Preferred Qualifications:** - AWS Certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect). - Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or open-source tools hands-on experience. 
- Exposure to data quality frameworks and stewardship practices. - Knowledge of data migration with data catalog and data-mart is a plus. This is a full-time position with the work location being in person.
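A minimal sketch of the kind of catalog-ingestion automation described above, written in Python with boto3 against the AWS Glue Data Catalog; the region, database name, and classification value are placeholder assumptions rather than details from the posting.

```python
import boto3

# Placeholder region and names for illustration only; real databases and
# classification values would come from the catalog's metadata standards.
glue = boto3.client("glue", region_name="ap-south-1")

def tag_tables_with_classification(database_name: str, classification: str) -> None:
    """Walk every table in a Glue database and stamp a data-classification parameter."""
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database_name):
        for table in page["TableList"]:
            params = dict(table.get("Parameters", {}))
            params["data_classification"] = classification
            glue.update_table(
                DatabaseName=database_name,
                TableInput={
                    "Name": table["Name"],
                    "StorageDescriptor": table.get("StorageDescriptor", {}),
                    "PartitionKeys": table.get("PartitionKeys", []),
                    "TableType": table.get("TableType", "EXTERNAL_TABLE"),
                    "Parameters": params,
                },
            )

if __name__ == "__main__":
    tag_tables_with_classification("sales_lake_db", "internal")
```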
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Data Engineer2 at GoKwik, you will have the opportunity to closely collaborate with product managers, data scientists, business intelligence teams, and SDEs to develop and implement data-driven strategies. Your role will involve identifying, designing, and executing process improvements to enhance data models, architectures, pipelines, and applications. You will play a vital role in continuously optimizing data processes, overseeing data management, governance, security, and analysis to ensure data quality and security across all product verticals. Additionally, you will design, create, and deploy new data models and pipelines as necessary to achieve high performance, operational excellence, accuracy, and reliability in the system. Your responsibilities will include utilizing tools and technologies to establish a data architecture that supports new data initiatives and next-gen products. You will focus on building test-driven products and pipelines that are easily maintainable and reusable. Furthermore, you will design and construct an infrastructure for data extraction, transformation, and loading from various data sources, supporting the marketing and sales team. To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Mathematics, or relevant computer programming training, along with a minimum of 4 years of experience in the Data Engineering field. Proficiency in SQL, relational databases, query authoring, data pipelines, architectures, and working with cross-functional teams in a dynamic environment is essential. Experience with Python, data pipeline tools, and AWS cloud services is also required. We are looking for individuals who are independent, resourceful, analytical, and adept at problem-solving. The ability to adapt to changing environments, excellent communication skills, and a collaborative mindset are crucial for success in this role. If you are passionate about tackling challenging problems at scale and making a significant impact within a dynamic and entrepreneurial setting, we welcome you to join our team at GoKwik.,
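A small illustration of the extract-transform-load pattern this role covers, assuming a hypothetical orders CSV, column names, and S3 bucket; a production pipeline would add orchestration, logging, and incremental loads.

```python
import boto3
import pandas as pd

# Source file, bucket, and column names are illustrative assumptions,
# not taken from the posting. Writing Parquet requires pyarrow.
def run_orders_pipeline(source_csv: str, bucket: str, key: str) -> None:
    df = pd.read_csv(source_csv, parse_dates=["order_date"])

    # Basic data-quality gate: fail fast instead of loading bad data downstream.
    if df["order_id"].isna().any():
        raise ValueError("Found orders with missing order_id; aborting load")

    daily = (
        df.assign(order_day=df["order_date"].dt.date)
          .groupby("order_day", as_index=False)
          .agg(orders=("order_id", "count"), revenue=("amount", "sum"))
    )

    daily.to_parquet("/tmp/daily_orders.parquet", index=False)
    boto3.client("s3").upload_file("/tmp/daily_orders.parquet", bucket, key)

if __name__ == "__main__":
    run_orders_pipeline("orders.csv", "example-analytics-bucket", "curated/daily_orders.parquet")
```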
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Salesforce has immediate opportunities for software developers who want their lines of code to have significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive. As a Senior Backend Software Engineer, your job responsibilities will include: - Build new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency. - Develop high-quality, production-ready code that millions of users of our cloud platform can use. - Design, implement, and tune robust APIs and API framework-related features that perform and scale in a multi-tenant environment. - Work in a Hybrid Engineering model and contribute to all phases of SDLC including design, implementation, code reviews, automation, and testing of the features. - Build efficient components/algorithms on a microservice multi-tenant SaaS cloud environment. - Code review, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level). Required Skills: - Mastery of multiple programming languages and platforms. - 5+ years of backend software development experience including designing and developing distributed systems at scale. - Deep knowledge of object-oriented programming and other scripting languages: Java, Python, Scala C#, Go, Node.JS, and C++. - Strong PostgreSQL/SQL skills and experience with relational and non-relational databases including writing queries. - A deeper understanding of software development best practices and demonstrate leadership skills. - Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.). Preferred Skills: - Experience with developing SAAS products over public cloud infrastructure - AWS/Azure/GCP. - Experience with Big-Data/ML and S3. - Hands-on experience with Streaming technologies like Kafka. - Experience with Elastic Search. - Experience with Terraform, Kubernetes, Docker. - Experience working in a high-paced and rapidly growing multinational organization. Benefits & Perks: - Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more! - World-class enablement and on-demand training with Trailhead.com. - Exposure to executive thought leaders and regular 1:1 coaching with leadership. - Volunteer opportunities and participation in our 1:1:1 model for giving back to the community. For more details, visit https://www.salesforcebenefits.com/,
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Join us as a Data Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable and secure infrastructure, ensuring seamless delivery of our digital solutions. To be successful as a Data Engineer, you should have hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and SparkSQL. You should also have hands-on experience in developing, testing, and maintaining applications on AWS Cloud. A strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena) is essential. Additionally, you should be able to design and implement scalable and efficient data transformation/storage solutions using Snowflake. Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, CSV, etc., is required. Familiarity with using DBT (Data Build Tool) with Snowflake for ELT pipeline development is necessary. Advanced SQL and PL/SQL programming skills are a must. Experience in building reusable components using Snowflake and AWS tools/technology is highly valued. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. Knowledge of orchestration tools such as Apache Airflow or Snowflake Tasks is beneficial, and familiarity with the Ab Initio ETL tool is a plus. Some other highly valued skills may include the ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. A good understanding of infrastructure setup and the ability to provide solutions either individually or working with teams is essential. Knowledge of data marts and data warehousing concepts, along with good analytical and interpersonal skills, is required. Experience implementing a cloud-based enterprise data warehouse across multiple data platforms, combining Snowflake and NoSQL environments to build a data movement strategy, is also important. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. The role is based out of Chennai. Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities: - Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data. - Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. - Develop processing and analysis algorithms fit for the intended data complexity and volumes. - Collaborate with data scientists to build and deploy machine learning models. Analyst Expectations: - Meet the needs of stakeholders/customers through specialist advice and support. - Perform prescribed activities in a timely manner and to a high standard which will impact both the role itself and surrounding roles. - Likely to have responsibility for specific processes within a team. - Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- Demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. - Manage own workload, take responsibility for the implementation of systems and processes within own work area and participate in projects broader than the direct team. - Execute work requirements as identified in processes and procedures, collaborating with and impacting on the work of closely related teams. - Provide specialist advice and support pertaining to own work area. - Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. - Deliver work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. - Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams. - Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise. - Make judgements based on practice and previous experience. - Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures. - Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements. - Build relationships with stakeholders/customers to identify and address their needs. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
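As a rough sketch of the PySpark and SparkSQL skills the posting asks for: the S3 paths, column names, and business rules below are invented for illustration, not Barclays specifics.

```python
from pyspark.sql import SparkSession, functions as F

# Bucket paths and column names are placeholders; a real Glue/EMR job would
# take these from job parameters.
spark = SparkSession.builder.appName("trades-curation").getOrCreate()

trades = spark.read.parquet("s3://example-raw-zone/trades/")

curated = (
    trades
    .filter(F.col("status") == "SETTLED")
    .withColumn("trade_date", F.to_date("trade_timestamp"))
    .groupBy("trade_date", "desk")
    .agg(F.sum("notional").alias("total_notional"), F.count("*").alias("trade_count"))
)

# The same aggregation expressed through SparkSQL, since the role calls for both APIs.
trades.createOrReplaceTempView("trades")
spark.sql("""
    SELECT to_date(trade_timestamp) AS trade_date, desk, SUM(notional) AS total_notional
    FROM trades
    WHERE status = 'SETTLED'
    GROUP BY to_date(trade_timestamp), desk
""")

curated.write.mode("overwrite").partitionBy("trade_date").parquet("s3://example-curated-zone/trades_daily/")
```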
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Kochi, Kerala
On-site
We are looking for a skilled and experienced Full Stack Software Engineer to join our team. The ideal candidate should have a strong background in full stack development, specializing in web applications using technologies such as Node.js, React.js, TypeScript, JavaScript, Cypress, MongoDB, Terraform, and AWS services. As a Software Engineer, you will be responsible for designing, testing, and documenting new software products while adhering to development and security standards. You will collaborate with Senior and Principal Engineers to create innovative solutions to meet business needs. The ideal candidate should have a passion for technology and be eager to expand their knowledge in the field of Software Engineering. You will be required to play a key role in the design, development, and deployment of solutions to support our business objectives. Your responsibilities will include taking technical ownership throughout all stages of software development, contributing to the efficiency of the software development lifecycle, refining product features in collaboration with Product and Design teams, designing software components using appropriate techniques, building and maintaining reliable code, providing technical documentation, troubleshooting production issues, and staying updated with the latest web development trends. Specific skills required for this role include designing and maintaining full stack applications, building efficient front-end systems in React.js, creating server-side logic with Node.js, developing scalable backend services with MongoDB, implementing automated testing using Cypress, and designing technical solutions with Terraform and AWS services. Qualifications & Skills: - Bachelor's or Master's degree in Computer Science, Engineering, or related field. - Minimum of 5 years of experience as a Full Stack Software Engineer. - Hands-on experience with Node.js, React.js, TypeScript, JavaScript, Cypress, MongoDB, Terraform, and AWS. - Proficiency in micro-services architecture, AWS IAM, AWS API Gateway, AWS Lambda, and AWS S3. - Strong understanding of software development principles, design patterns, and RESTful web services. - Experience with cloud platforms, particularly AWS, and Terraform for infrastructure management. - Knowledge of database design, optimization, and testing. - Familiarity with Git, CI/CD pipelines, incident response, and disaster recovery processes. - Working experience in an AGILE environment is a plus. Abilities & Competencies: - Excellent communication and collaboration skills. - Strategic thinking and problem-solving abilities. - Ability to work in a fast-paced environment and meet deadlines. - Leadership skills with a focus on staff development. - Accountability, ownership, and commitment to high-quality service. - Adaptability to changing requirements and effective collaboration with distributed teams.,
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a GenAI Developer at Vipracube Tech Solutions, you will be responsible for developing and optimizing AI models, implementing AI algorithms, collaborating with cross-functional teams, conducting research on emerging AI technologies, and deploying AI solutions. This full-time role requires 5 to 6 years of experience and is based in Pune, with the flexibility of some work from home. Your key responsibilities will include fine-tuning large language models tailored to marketing and operational use cases, building Generative AI solutions using various platforms like OpenAI (GPT, DALLE, Whisper) and Agentic AI platforms such as LangGraph and AWS Bedrock. You will also be building robust pipelines using Python, NumPy, Pandas, applying traditional ML techniques, handling CI/CD & MLOps, using AWS Cloud Services, collaborating using tools like Cursor, and effectively communicating with stakeholders and clients. To excel in this role, you should have 5+ years of relevant AI/ML development experience, a strong portfolio of AI projects in marketing or operations domains, and a proven ability to work independently and meet deadlines. Join our dynamic team and contribute to creating smart, efficient, and future-ready digital products for businesses and startups.,
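An illustrative (not prescriptive) sketch of calling a text-generation model through AWS Bedrock with boto3, one of the platforms named above; the model ID, region, and prompt are placeholder assumptions, and real usage would add error handling and guardrails.

```python
import json
import boto3

# Minimal Bedrock text-generation sketch; the model ID below is a placeholder choice.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def generate_campaign_copy(product_brief: str) -> str:
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 400,
        "messages": [
            {"role": "user", "content": f"Write three short ad headlines for: {product_brief}"}
        ],
    }
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model choice
        body=json.dumps(body),
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]

if __name__ == "__main__":
    print(generate_campaign_copy("a budget-friendly noise-cancelling headphone"))
```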
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for leading the migration of on-premises applications to AWS, optimizing cloud infrastructure, and ensuring seamless transitions. Your key responsibilities will include planning and executing migrations of on-prem applications to AWS, utilizing or developing migration tools for large-scale application migrations, designing and implementing automated application migrations, as well as collaborating with cross-functional teams to troubleshoot and resolve migration issues. To succeed in this role, you should have at least 3 years of AWS cloud migration experience, proficiency in Cloud compute (EC2, EKS) and Storage (S3, EBS, EFS), strong knowledge of AWS cloud services and migration tools, and expertise in Terraform. AWS certification would be a plus.,
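As a hedged example of the scripting that supports such migrations, the snippet below inventories running EC2 instances and their attached EBS volumes after a cutover so they can be checked against the migration plan; the region and tag conventions are assumptions.

```python
import boto3

# Post-cutover verification sketch: list running instances and attached volumes.
ec2 = boto3.client("ec2", region_name="ap-south-1")

def inventory_running_instances():
    paginator = ec2.get_paginator("describe_instances")
    filters = [{"Name": "instance-state-name", "Values": ["running"]}]
    for page in paginator.paginate(Filters=filters):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                name = next(
                    (t["Value"] for t in instance.get("Tags", []) if t["Key"] == "Name"),
                    instance["InstanceId"],
                )
                volumes = [m["Ebs"]["VolumeId"] for m in instance.get("BlockDeviceMappings", []) if "Ebs" in m]
                yield {"name": name, "type": instance["InstanceType"], "volumes": volumes}

if __name__ == "__main__":
    for item in inventory_running_instances():
        print(item)
```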
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Lead Data Engineer at Mastercard, you will be a key player in the Mastercard Services Technology team, responsible for driving the mission to unlock the potential of data assets by innovating, managing big data assets, ensuring accessibility of data, and enforcing standards and principles in the Big Data space. Your role will involve designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. You will mentor and guide other engineers, foster a culture of curiosity and continuous improvement, and create robust ETL/ELT pipelines that integrate with various systems. Your responsibilities will include decomposing complex problems into scalable components aligned with platform goals, championing best practices in data engineering, collaborating across teams, supporting data governance and quality efforts, and optimizing cloud infrastructure components related to data engineering workflows. You will actively participate in architectural discussions, iteration planning, and feature sizing meetings while adhering to Agile processes. To excel in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You must possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, a strong foundation in data modeling, database design, and performance optimization is required. Experience working with cloud platforms like AWS, Azure, or GCP and knowledge of modern data architectures and data lifecycle management are essential. Furthermore, familiarity with CI/CD practices, version control, and automated testing is crucial. You should demonstrate the ability to mentor junior engineers effectively, possess excellent communication and collaboration skills, and hold a Bachelor's degree in computer science, Engineering, or a related field. Comfort with Agile/Scrum development environments, curiosity, adaptability, problem-solving skills, and a drive for continuous improvement are key traits for success in this role. Experience with integrating heterogeneous systems, building resilient data pipelines across cloud environments, orchestration tools, data governance practices, containerization, infrastructure automation, and exposure to machine learning data pipelines or MLOps will be advantageous. Holding a Master's degree, relevant certifications, or contributions to open-source/data engineering communities will be a bonus.,
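A brief sketch of the data-quality gating mentioned above, expressed in PySpark; the dataset path, required columns, and thresholds are illustrative assumptions, not Mastercard specifics.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative quality gate for one pipeline stage: schema check, then row-level metrics.
spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.read.parquet("s3://example-platform/events/date=2024-01-01/")

required_columns = ["event_id", "merchant_id", "amount"]
missing = [c for c in required_columns if c not in df.columns]
if missing:
    raise ValueError(f"Schema check failed, missing columns: {missing}")

metrics = df.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("event_id").isNull().cast("int")).alias("null_event_ids"),
    F.countDistinct("event_id").alias("distinct_event_ids"),
).first()

if metrics["row_count"] == 0 or metrics["null_event_ids"] > 0:
    raise ValueError(f"Data-quality gate failed: {metrics.asDict()}")

print("Quality gate passed:", metrics.asDict())
```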
Posted 2 months ago
1.0 - 5.0 years
0 Lacs
Kochi, Kerala
On-site
As a Java Backend Developer in our team specializing in the IoT domain, your role will involve designing, developing, and deploying scalable microservices utilizing Spring Boot, SQL databases, and AWS services. You will play a pivotal role in guiding the backend development team, implementing DevOps best practices, and optimizing cloud infrastructure to ensure high-performance and secure services. Your key responsibilities will include architecting and implementing high-performance backend services using Java (Spring Boot), developing RESTful APIs and event-driven microservices with a focus on scalability and reliability, designing and optimizing SQL databases (PostgreSQL, MySQL), and deploying applications on AWS utilizing services like ECS, Lambda, RDS, S3, and API Gateway. In addition, you will be tasked with implementing CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar, monitoring and optimizing backend performance, ensuring best practices for security, authentication, and authorization using OAuth, JWT, and IAM roles, and collaborating with the team to maintain high standards of efficiency and quality. The ideal candidate will possess expertise in Java (Spring Boot, Spring Cloud, Spring Security), microservices architecture, API development, SQL (PostgreSQL, MySQL), ORM (JPA, Hibernate), DevOps tools (Docker, Kubernetes, Terraform, CI/CD, GitHub Actions, Jenkins), AWS cloud services (EC2, Lambda, ECS, RDS, S3, IAM, API Gateway, CloudWatch), messaging systems (Kafka, RabbitMQ, SQS, MQTT), testing frameworks (JUnit, Mockito, Integration Testing), and logging & monitoring tools (ELK Stack, Prometheus, Grafana). Preferred skills that would be beneficial for this role include experience in the IoT domain, previous work experience in startups, familiarity with event-driven architecture using Apache Kafka, knowledge of Infrastructure as Code (IaC) with Terraform, and exposure to serverless architectures. In return, we offer a competitive salary with performance-based incentives, the opportunity to lead and mentor a high-performing tech team, hands-on experience with cutting-edge cloud and microservices technologies, and a collaborative, fast-paced work environment where your skills and expertise will be valued and further developed. If you have experience in any IoT domain and are enthusiastic about contributing to a dynamic team focused on innovation and excellence, we invite you to apply for this full-time, on-site/hybrid Java Backend Developer position in Kochi.,
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Senior Lead Engineer specializing in Python and Spark in AWS, you will be responsible for designing, building, and maintaining robust, scalable, and efficient ETL pipelines. Your primary focus will be on ensuring alignment with data lakehouse architecture on AWS and optimizing workflows using services such as Glue, Lambda, and S3. Collaborating with cross-functional teams, you will gather requirements, provide technical insights, and deliver high-quality data solutions. Your role will involve driving the migration of existing data processing workflows to the lakehouse architecture, leveraging Iceberg capabilities, and enforcing best practices for coding standards and system architecture. You will play a key role in implementing data quality and governance frameworks to ensure reliable and consistent data processing across the platform. Monitoring and improving system performance, optimizing data workflows, and ensuring all solutions are secure, compliant, and meet industry standards will be crucial aspects of your responsibilities. Leading technical discussions, mentoring team members, and fostering a culture of continuous learning and innovation are essential for this role. You will also maintain relationships with senior management, architectural groups, development managers, team leads, data engineers, analysts, and agile team members. Key Skills and Experience: - Extensive expertise in Python and Spark for designing and implementing complex data processing workflows. - Strong experience with AWS services such as Glue, Lambda, S3, and EMR, focusing on data lakehouse solutions. - Deep understanding of data quality frameworks, data contracts, and governance processes. - Ability to design and implement scalable, maintainable, and secure architectures using modern data technologies. - Hands-on experience with Apache Iceberg and its integration within data lakehouse environments. - Expertise in problem-solving, performance optimization, and Agile methodologies. - Excellent interpersonal skills with the ability to communicate complex technical solutions effectively. Desired Skills and Experience: - Familiarity with additional programming languages such as Java. - Experience with serverless computing paradigms. - Knowledge of data visualization or reporting tools for stakeholder communication. - Certification in AWS or data engineering (e.g., AWS Certified Data Analytics, Certified Spark Developer). Education and Certifications: - A bachelor's degree in Computer Science, Software Engineering, or a related field is helpful. - Equivalent professional experience or certifications will also be considered. Join us at LSEG, a leading global financial markets infrastructure and data provider, where you will be part of a dynamic organization across 65 countries. We value individuality, encourage new ideas, and are committed to sustainability, driving sustainable economic growth and inclusivity. Experience the critical role we play in re-engineering the financial ecosystem and creating economic opportunities while accelerating the transition to net zero. At LSEG, we offer tailored benefits including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.,
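A skeleton of a Glue PySpark job in the spirit of the stack described above, landing curated data in an Iceberg table; it assumes the job is configured with the Iceberg data-lake format and a Glue catalog alias, that the target table already exists, and the bucket, database, and table names are placeholders.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap; paths and table names are illustrative.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

raw = spark.read.json("s3://example-raw-zone/positions/")
curated = raw.dropDuplicates(["position_id"]).where("quantity is not null")

# DataFrameWriterV2 append into an Iceberg table registered in the Glue catalog
# (assumes the "glue_catalog" alias and the table were created beforehand).
curated.writeTo("glue_catalog.curated_db.positions").append()

job.commit()
```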
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
As an experienced Python Developer, you will be an integral part of our dynamic team. Your expertise in Python, MySQL, and Django will be crucial as you develop and maintain applications. Familiarity with AWS services like EC2, Lambda, and S3 is highly desirable for this role. Your responsibilities will include working on scalable web applications, optimizing database interactions, and collaborating with cross-functional teams to deliver high-quality solutions. You will be responsible for developing, testing, and maintaining robust Python applications using Django. Designing, implementing, and optimizing MySQL databases to support scalable applications will also be a key part of your role. Collaboration with frontend developers, product owners, and other team members to implement new features and functionality is essential. Utilizing AWS services for deployment, storage, and server management when needed is also expected. Your code should be clean, maintainable, and well-documented while following industry best practices. Troubleshooting, debugging, and optimizing performance issues to enhance application responsiveness will be part of your routine. Staying updated with the latest trends and technologies in web development and cloud services is crucial for your success in this role. To excel in this position, you should have proven experience in Python development, especially focusing on backend systems and web frameworks like Django. Proficiency in MySQL with a strong understanding of database design and optimization techniques is required. Experience with AWS services like EC2, Lambda, and S3 will be advantageous. A solid understanding of the software development lifecycle and agile methodologies is necessary. Your ability to work independently as well as part of a team, coupled with strong problem-solving skills and attention to detail, will be key attributes in this role. A Bachelor's degree in Computer Science, Information Technology, or a related field is preferred. Experience in version control systems (e.g., Git) and continuous integration/deployment (CI/CD) pipelines would be a plus. This is a full-time position that requires in-person work. Join us and be part of a team that values your expertise and contribution to delivering high-quality solutions.,
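A minimal sketch of how Django and S3 might meet in this stack: a view that accepts an uploaded file and stores it in a bucket; the bucket name and URL wiring are illustrative assumptions only.

```python
import boto3
from django.http import JsonResponse
from django.views.decorators.http import require_POST

# Placeholder bucket; real code would read this from settings.
BUCKET = "example-reports-bucket"

@require_POST
def upload_report(request):
    uploaded = request.FILES.get("report")
    if uploaded is None:
        return JsonResponse({"error": "no file provided"}, status=400)

    s3 = boto3.client("s3")
    key = f"reports/{uploaded.name}"
    s3.upload_fileobj(uploaded, BUCKET, key)  # Django's UploadedFile is file-like
    return JsonResponse({"stored_at": f"s3://{BUCKET}/{key}"}, status=201)
```

The view would still need to be mapped in urls.py and protected by the project's authentication, which is omitted here.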
Posted 2 months ago
3.0 - 5.0 years
25 - 40 Lacs
Bengaluru
Hybrid
The Modern Data Engineer is responsible for designing, implementing, and maintaining scalable data architectures using cloud technologies, primarily on AWS, to support the next evolutionary stage of the Investment Process. They build robust data pipelines, optimize data storage and access patterns, and ensure data quality while collaborating across engineering teams to deliver high-value data products. Key Responsibilities • Implement and maintain data pipelines for ingestion, transformation, and delivery • Ensure data quality through validation and monitoring processes • Collaborate with senior engineers to design scalable data solutions • Work with business analysts to understand and implement data requirements • Optimize data models and queries for performance and efficiency • Follow engineering best practices and contribute to team standards • Participate in code reviews and knowledge sharing activities • Implement data security controls and access policies • Troubleshoot and resolve data pipeline issues Core Technical Skills Cloud Platforms: Proficient with cloud-based data platforms (Snowflake, data lakehouse architecture) AWS Ecosystem: Strong knowledge of AWS services including Lambda, Glue, and S3 Streaming Architecture: Understanding of event-based or streaming data concepts using Kafka Programming: Strong proficiency in Python and SQL DevOps: Experience with CI/CD pipelines and infrastructure as code (Terraform) Data Security: Knowledge of implementing basic data access controls Database Systems: Experience with RDBMS (Oracle, Postgres, MSSQL) and exposure to NoSQL databases Data Integration: Understanding of data integration patterns and techniques Orchestration: Experience using workflow tools (Airflow, Control-M, etc.) Engineering Practices: Experience with GitHub, code verification, and validation Domain Knowledge: Basic knowledge of investment management industry concepts
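A minimal orchestration sketch in the style of the workflow tools listed above, using Apache Airflow (version 2.4+ assumed); the DAG name, schedule, and task bodies are placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task logic standing in for real ingest and validation steps.
def ingest(**context):
    print("pull files from the source system into the raw S3 zone")

def validate(**context):
    print("run row-count and schema checks on the new partition")

with DAG(
    dag_id="daily_positions_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task
```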
Posted 2 months ago
3.0 - 8.0 years
6 - 12 Lacs
Noida
Work from Office
We are looking for an experienced Java Developer with strong experience in Amazon Web Services (AWS) to join our dynamic development team. As a Java AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications and services. You will work on a variety of AWS cloud technologies and Java frameworks to deliver high-performance, scalable, and secure solutions. Responsibilities: Develop, test, and deploy Java-based applications using AWS cloud services. Collaborate with cross-functional teams to design and implement cloud-native applications, microservices, and solutions using AWS services (EC2, S3, Lambda, RDS, SQS, SNS, etc.). Leverage AWS DevOps tools (such as CodePipeline, CodeDeploy, CloudFormation, etc.) to automate deployments and infrastructure provisioning. Write clean, efficient, and maintainable code using Java and related frameworks (Spring Boot, Hibernate). Troubleshoot, debug, and optimize applications running in AWS environments. Create and maintain detailed technical documentation. Monitor and improve the performance, scalability, and security of AWS-based applications. Follow best practices for cloud architecture and software development processes. Stay up to date with the latest AWS services and Java technologies. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). 3+ years of professional experience in Java development. 3+ years of hands-on experience with AWS services (EC2, S3, Lambda, RDS, etc.). Strong proficiency in the Java programming language and Java frameworks like Spring Boot, Hibernate, etc. Experience with microservices architecture, containerization (Docker, Kubernetes), and RESTful API development. Familiarity with CI/CD pipelines and tools like Jenkins, Git, and AWS CodePipeline. Experience with cloud security practices and ensuring the security of AWS services and applications. Knowledge of database management systems (SQL/NoSQL) and data modeling. Strong problem-solving skills and ability to troubleshoot complex issues in cloud environments. Excellent communication and collaboration skills. Preferred Skills: AWS certifications (AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate). Experience with infrastructure as code (IaC) tools like Terraform or CloudFormation. Familiarity with Agile methodologies and working in Agile teams. Knowledge of monitoring and logging tools like CloudWatch, ELK Stack, or similar.
Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; Fundamental Technical Skills - Programming, Multithreading, Collections; Database - SQL Server - SQL Packages; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Middleware - Java Middleware - Spring Boot; Programming Language - Java - OOPS Concepts
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Noida
Work from Office
Full-stack developer with 6-8 years of experience in designing and developing robust, scalable, and maintainable applications applying Object-Oriented Design principles. Strong experience in Spring frameworks like Spring Boot, Spring Batch, Spring Data etc. and Hibernate, JPA. Strong experience in microservices architecture and implementation. Strong knowledge of HTML, CSS, JavaScript, and Angular. Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API. Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS). Good experience with AWS services - S3, Lambda, SQS, SNS, DynamoDB, IAM, API Gateway. Hands-on experience in SQL and PL/SQL, with the ability to write complex queries. Hands-on experience with REST APIs. Experience with version control systems (e.g., Git). Knowledge of web standards and accessibility guidelines. Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, SONAR etc. Must have strong analytical and problem-solving abilities. Good experience in JUnit testing and mocking techniques. Experience in SDLC processes (Waterfall/Agile), Docker, Git, SonarQube. Excellent communication and interpersonal skills; ability to work independently and as part of a team.
Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Programming Language - Java Full Stack - HTML/CSS; Programming Language - Java - Spring Framework; Programming Language - Java - Hibernate; Programming Language - Java Full Stack - JavaScript; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; DevOps/Configuration Mgmt - Git; Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; DevOps/Configuration Mgmt - Docker; Beh - Communication and collaboration; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Database - Oracle - PL/SQL Packages; Development Tools and Management - CI/CD; Programming Language - Java Full Stack - Angular Material; Programming Language - Java Full Stack - Spring Framework; Middleware - Java Middleware - Spring Boot; Middleware - API Middleware - Microservices; Middleware - API Middleware - Web Services (REST, SOAP); Middleware - API Middleware - API (SOAP, REST); Agile - Agile - SCRUM; Database - SQL Server - SQL Packages
Posted 2 months ago
8.0 - 12.0 years
10 - 15 Lacs
Pune
Work from Office
Key Responsibilities
- Design and develop scalable applications using Python and AWS services
- Debug and resolve production issues across complex distributed systems
- Architect solutions aligned with business strategies and industry standards
- Lead and mentor a team of India-based developers; guide career development
- Ensure technical deliverables meet the highest standards of quality and performance
- Research and integrate emerging technologies and processes into the development strategy
- Document solutions in compliance with SDLC standards using defined templates
- Assemble large, complex datasets based on functional and non-functional requirements
- Handle operational issues and recommend improvements to the technology stack
- Facilitate end-to-end platform integration across enterprise-level applications
Required Skills
- Technical skills: Python, data engineering, debugging and troubleshooting
- Cloud and architecture: AWS (EC2, EKS, Glue, Lambda, S3, EMR, RDS, API Gateway), Step Functions, CloudFront, EventBridge, ARRFlow, Airflow (MWAA), QuickSight, system integration
- Tools and processes: Terraform, CI/CD pipelines, SDLC, documentation templates
Qualifications
- 10+ years of software development experience, preferably in financial/trading applications
- 5+ years of people management and mentoring experience
- Proven track record in technical leadership and architecture planning
- Expertise in developing applications using Python and the AWS stack
- Strong grasp of Terraform and automated CI/CD processes
- Exceptional multitasking and prioritization capabilities
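A short sketch of a Python Lambda handler consistent with the stack above: triggered by an S3 event notification, it summarises the new object and forwards the result to SQS; the queue URL and region are placeholder assumptions.

```python
import json
import boto3

# Placeholder queue URL; the event shape follows the standard S3 notification format.
sqs = boto3.client("sqs")
s3 = boto3.client("s3")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"

def lambda_handler(event, context):
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        head = s3.head_object(Bucket=bucket, Key=key)
        message = {"bucket": bucket, "key": key, "size_bytes": head["ContentLength"]}
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(message))
        results.append(message)
    return {"statusCode": 200, "body": json.dumps(results)}
```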
Posted 2 months ago
8.0 - 13.0 years
0 - 1 Lacs
Chennai
Hybrid
Duties and Responsibilities
Lead the design and implementation of scalable, secure, and high-performance solutions for data-intensive applications. Collaborate with stakeholders, other product development groups, and software vendors to identify and define solutions for complex business and technical requirements. Develop and maintain cloud infrastructure using platforms such as AWS, Azure, or Google Cloud. Articulate technology solutions and explain the competitive advantages of various technology alternatives. Evangelize best practices to analytics teams. Ensure data security, privacy, and compliance with relevant regulations. Optimize cloud resources for cost-efficiency and performance. Lead the migration of on-premises data systems to the cloud. Implement data storage, processing, and analytics solutions using cloud-native services. Monitor and troubleshoot cloud infrastructure and data pipelines. Stay updated with the latest trends and best practices in cloud computing and data management.
Skills
5+ years of hands-on design and development experience in implementing Data Analytics applications using AWS services such as S3, Glue, AWS Step Functions, Kinesis, Lambda, Lake Formation, Athena, Elastic Container Service/Elastic Kubernetes Service, Elasticsearch, and Amazon EMR or Snowflake. Experience with AWS services such as AWS IoT Greengrass, AWS IoT SiteWise, AWS IoT Core, and AWS IoT Events. Strong understanding of cloud architecture principles and best practices. Proficiency in designing network topology, endpoints, application registration, and network pairing. Well versed in access management in Azure or other clouds. Experience with containerization technologies like Docker and Kubernetes. Expertise in CI/CD pipelines and version control systems like Git. Excellent problem-solving skills and attention to detail. Strong communication and leadership skills. Ability to work collaboratively with cross-functional teams and stakeholders. Knowledge of security and compliance standards related to cloud data platforms.
Technical / Functional Skills
At least 3+ years of experience in the implementation of all the Amazon Web Services listed above. At least 3+ years of experience as an SAP BW Developer. At least 3+ years of experience in Snowflake (or Redshift). At least 3+ years of experience as a Data Integration Developer in Fivetran/HVR/DBT, Boomi (or Talend/Informatica). At least 2+ years of experience with Azure OpenAI, Azure AI Services, Microsoft Copilot Studio, Power BI, and Power Automate. Experience in networking and security.
Domain Expertise: Experience with SDLC/Agile/Scrum/Kanban.
Project Experience
Hands-on experience in the end-to-end implementation of Data Analytics applications on AWS. Hands-on experience in the end-to-end implementation of an SAP BW application for FICO, Sales & Distribution, and Materials Management. Hands-on experience with Fivetran/HVR/Boomi in the development of data integration services with data from SAP, Salesforce, Workday, and other SaaS applications. Hands-on experience in the implementation of Gen AI use cases using Azure services. Hands-on experience in the implementation of Advanced Analytics use cases using Python/R.
Certifications
AWS Certified Solutions Architect - Professional
Posted 2 months ago
8.0 - 10.0 years
40 - 45 Lacs
Bengaluru
Hybrid
Position: Senior Data Engineer Location: Bangalore, India About Dodge Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the construction industry forward by transforming data into tangible guidance, driving unparalleled advancement. Dodge is the catalyst for modern construction. https://www.construction.com/ About Symphony Technology Group (STG) STG is a Silicon Valley (California) based private equity firm that has a long and successful track record of transforming high-potential software and software-enabled services companies, as well as insights-oriented companies, into definitive market leaders. The firm brings expertise, flexibility, and resources to build strategic value and unlock the potential of innovative companies. Partnering to build customer-centric, market-winning portfolio companies, STG creates sustainable foundations for growth that bring value to all existing and future stakeholders. The firm is dedicated to transforming and building outstanding technology companies in partnership with world-class management teams. With over $5.0 billion in assets under management, including a recently raised $2.0 billion fund, STG's expansive portfolio has consisted of more than 30 global companies. STG Labs is the incubation center for many of STG's portfolio companies, building their engineering, professional services, and support delivery teams in India. STG Labs offers an entrepreneurial start-up environment for software and AI engineers, data scientists and analysts, and project and product managers, and provides a unique opportunity to work directly for a software or technology company. Based in Bangalore, STG Labs supports hybrid working. https://stg.com
Roles and Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes leveraging AWS services.
- Collaborate closely with data architects, business analysts, and DevOps teams to translate business requirements into technical data solutions.
- Apply SDLC best practices, including planning, coding standards, code reviews, testing, and deployment.
- Automate workflows and optimize data pipelines for efficiency, performance, and reliability.
- Implement monitoring and logging to ensure the health and performance of data systems.
- Ensure data security and compliance through adherence to industry and internal standards.
- Participate actively in agile development processes and contribute to sprint planning, stand-ups, retrospectives, and documentation efforts.
Qualifications
Hands-on working knowledge and experience is required in:
- Data Structures
- Memory Management
- Basic algorithms (search, sort, etc.)
Hands-on working knowledge and experience is preferred in:
- Memory Management
- Algorithms: search, sort, etc.
- AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, Redshift, S3
- Scripting & Programming Languages: Python, Bash, SQL
- Version Control & CI/CD Tools: Git, Jenkins, Bitbucket
- Database Systems & Data Engineering: data modeling, data warehousing principles
- Infrastructure as Code (IaC): Terraform, CloudFormation
- Containerization & Orchestration: Docker, Kubernetes
Certifications Preferred: AWS Certifications (Data Analytics Specialty, Solutions Architect Associate).
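An illustrative helper for the Glue/Athena/S3 part of the stack above, written in Python with boto3; the database, query, and results location are placeholder values, and production code would use backoff and pagination rather than a fixed sleep.

```python
import time
import boto3

# Placeholder region; database, query, and output bucket are invented for illustration.
athena = boto3.client("athena", region_name="us-east-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    start = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = start["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {query_id} ended in state {state}")
    return query_id

if __name__ == "__main__":
    qid = run_athena_query(
        "SELECT project_state, COUNT(*) FROM projects GROUP BY project_state",
        database="construction_analytics",
        output_s3="s3://example-athena-results/",
    )
    print("results written for query", qid)
```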
Posted 2 months ago
6.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
Calfus is a Silicon Valley headquartered software engineering and platforms company that seeks to inspire its team to rise faster, higher, stronger, and work together to build software at speed and scale. The company's core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes while standing for #Equity and #Diversity in its ecosystem and society at large. As a Data Engineer specializing in BI Analytics & DWH at Calfus, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower the organization to make data-driven decisions. Leveraging expertise in Power BI, Tableau, and ETL processes, you will create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels. Key Responsibilities: - BI Architecture & DWH Solution Design: Develop and design scalable BI Analytical & DWH Solution that meets business requirements, leveraging tools such as Power BI and Tableau. - Data Integration: Oversee the ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses. - Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives. - Database Management: Utilize SQL to write complex queries, stored procedures, and manage data transformations using joins and cursors. - Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization. - Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs. - Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability. - Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance. - Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement. - Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions. Qualifications: - Bachelors degree in computer science, Information Systems, Data Science, or a related field. - 6-12 years of experience in BI architecture and development, with a strong focus on Power BI and Tableau. - Proven experience with ETL processes and tools, especially SSIS. Strong proficiency in SQL Server, including advanced query writing and database management. - Exploratory data analysis with Python. - Familiarity with the CRISP-DM model. - Ability to work with different data models. - Familiarity with databases like Snowflake, Postgres, Redshift & Mongo DB. - Experience with visualization tools such as Power BI, QuickSight, Plotly, and/or Dash. - Strong programming foundation with Python for data manipulation and analysis using Pandas, NumPy, PySpark, data serialization & formats like JSON, CSV, Parquet & Pickle, database interaction, data pipeline and ETL tools, cloud services & tools, and code quality and management using version control. - Ability to interact with REST APIs and perform web scraping tasks is a plus. Calfus Inc. is an Equal Opportunity Employer.,
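A small example of the exploratory data analysis with Python mentioned above, using Pandas; the file name, columns, and the 20% threshold are illustrative assumptions.

```python
import pandas as pd

# Illustrative sales extract; column names are assumptions for the sketch.
sales = pd.read_csv("sales_extract.csv", parse_dates=["invoice_date"])

print(sales.describe(include="all"))                                # quick profile of every column
print(sales.isna().mean().sort_values(ascending=False).head(10))    # columns with the most nulls

monthly = (
    sales.assign(month=sales["invoice_date"].dt.to_period("M"))
         .groupby(["month", "region"], as_index=False)["net_amount"].sum()
)

# Flag regions whose latest month dropped more than 20% versus the prior month.
pivot = monthly.pivot(index="month", columns="region", values="net_amount").sort_index()
change = pivot.pct_change().iloc[-1]
print("Regions down >20% month over month:", list(change[change < -0.20].index))
```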
Posted 2 months ago
5.0 - 10.0 years
0 Lacs
delhi
On-site
As a DevOps Engineer at AuditorsDesk, you will play a crucial role in designing, deploying, and maintaining AWS infrastructure, using Terraform for provisioning and configuration management. Your primary responsibility will be to implement and manage EC2 instances, including their lifecycle management, scaling, and optimization. You will also configure and manage application load balancers to ensure efficient traffic distribution and deploy AWS WAF for web application security.

Collaboration with development and operations teams is key to integrating security practices throughout the software development lifecycle. You will implement and maintain CI/CD pipelines to automate testing and deployment, monitor system performance, and apply security best practices to ensure high availability and reliability. Your role will also involve conducting security assessments and participating in incident response and resolution.

To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, or have equivalent experience. Demonstrated experience with AWS services and infrastructure, proficiency in Terraform for infrastructure as code, and hands-on experience with load balancers are essential. Familiarity with containerization technologies, networking concepts, and security protocols will be beneficial, as will strong scripting skills in Python and Bash for automation tasks and the ability to troubleshoot complex issues through to resolution.

Preferred qualifications include AWS certifications such as AWS Certified Solutions Architect or AWS Certified DevOps Engineer, experience with infrastructure monitoring tools, and knowledge of compliance frameworks such as PCI-DSS, HIPAA, and GDPR in AWS environments. Excellent communication skills and the ability to collaborate with cross-functional teams are necessary for success in this role.

This is a permanent on-site position located in Delhi, requiring 5 to 10 years of relevant experience. The compensation offered is competitive and aligned with industry standards. Join AuditorsDesk to contribute to making audit work paperless and to enable efficient collaboration between audit teams and clients.
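As a minimal automation sketch of the Python scripting and load-balancer monitoring skills named above (illustrative, not AuditorsDesk's tooling), the snippet below uses boto3 to report the health of every target behind the account's application load balancers. It assumes standard IAM permissions; the resource names it prints depend entirely on your environment.

import boto3

elbv2 = boto3.client("elbv2")

def report_target_health() -> None:
    # List every target group, then check the health of its registered targets
    for tg in elbv2.describe_target_groups()["TargetGroups"]:
        arn, name = tg["TargetGroupArn"], tg["TargetGroupName"]
        states = elbv2.describe_target_health(TargetGroupArn=arn)["TargetHealthDescriptions"]
        unhealthy = [d["Target"]["Id"] for d in states
                     if d["TargetHealth"]["State"] != "healthy"]
        print(f"{name}: {len(states)} targets, unhealthy: {unhealthy or 'none'}")

if __name__ == "__main__":
    report_target_health()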
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
At OP, we are a people-first, high-touch organization committed to delivering cutting-edge AI solutions with integrity and passion. We are looking for a Senior AI Developer with expertise in AI model development, Python, AWS, and scalable tool-building. In this role, you will design and implement AI-driven solutions, develop AI-powered tools and frameworks, and integrate them into enterprise environments, including mainframe systems.

Your key responsibilities will include developing and deploying AI models using Python and AWS for enterprise applications, building scalable AI-powered tools, designing and optimizing machine learning pipelines, implementing NLP and GenAI models, developing and integrating retrieval-augmented generation (RAG) systems for enterprise knowledge retrieval, maintaining AI frameworks and APIs, architecting cloud-based AI solutions using AWS services, writing high-performance Python code for AI applications, and ensuring the scalability, security, and performance of AI solutions in production.

Required qualifications include 5+ years of experience in AI/ML development with expertise in Python and AWS, a strong background in machine learning and deep learning, experience with LLMs, NLP, and RAG systems, hands-on experience building and deploying AI models in production, proficiency in cloud-based AI solutions, experience developing AI-powered tools and frameworks, knowledge of mainframe integration and enterprise AI applications, and strong coding skills with a focus on software development best practices.

Preferred qualifications include familiarity with MLOps, CI/CD pipelines, and model monitoring, a background in developing AI-based enterprise tools and automation, and experience with vector databases and AI-powered search technologies.

OP offers health insurance, accident insurance, and competitive salaries based on factors including location, education, qualifications, experience, technical skills, and business needs. In addition to the core responsibilities, you will be expected to participate in OP monthly team meetings, contribute to technical discussions and peer reviews, collaborate via the OP Wiki/Knowledge Base, and provide status reports to OP Account Management as requested.

OP is a technology consulting and solutions company offering advisory and managed services, innovative platforms, and staffing solutions across fields including AI, cyber security, and enterprise architecture. Our team consists of dynamic, creative thinkers who are passionate about quality work, and as a member of the OP team you will have access to industry-leading consulting practices, strategies, technologies, training, and education. An ideal OP team member is a technology leader with a proven track record of technical excellence and a strong focus on process and methodology.
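To make the RAG responsibility above concrete, here is a schematic retrieval step, illustrative only: `embed` stands in for whatever embedding model the team uses (for example a SageMaker or Bedrock endpoint) and is not a real API; it must be supplied by the caller.

import numpy as np

def top_k_passages(query_vec: np.ndarray, passage_vecs: np.ndarray, passages: list, k: int = 3) -> list:
    """Rank stored passages by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)
    scores = p @ q                      # cosine similarity against every passage
    best = np.argsort(scores)[::-1][:k] # indices of the k most similar passages
    return [(passages[i], float(scores[i])) for i in best]

# The retrieved passages would then be concatenated into the prompt sent to the LLM,
# grounding its answer in enterprise documents rather than the model's parametric memory.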
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At Lilly, you are part of a global healthcare leader committed to uniting caring with discovery to enhance the lives of people worldwide. Headquartered in Indianapolis, Indiana, our dedicated team of 39,000 employees collaborates to discover and deliver life-changing medicines, improve the understanding and management of disease, and contribute to our communities through philanthropy and volunteerism. Our focus is on making a positive impact on people's lives around the world.

As part of our ongoing efforts, we are developing and internalizing a cutting-edge recommendation engine platform. This platform aims to streamline sales and marketing operations by analyzing diverse data sources, implementing advanced personalization models, and integrating seamlessly with other Lilly operations platforms. The goal is to provide tailored recommendations to our sales and marketing teams at the individual doctor level, enabling informed decision-making and enhancing customer experience.

Responsibilities:
- Utilize deep learning models to optimize omnichannel promotional sequences for sales teams.
- Analyze large datasets to identify trends and information relevant to modeling decisions.
- Translate business problems into statistical problem statements and propose solution approaches.
- Collaborate with stakeholders to communicate analysis findings effectively.
- Familiarity with pharmaceutical datasets and the industry is preferred.
- Refactor code; train, deploy, test, and monitor models for drift.
- Optimize model hyperparameters and adopt new ML techniques for business problem-solving.

Qualifications:
- Bachelor's degree in Computer Science, Statistics, or a related field (preferred).
- 2-6 years of hands-on experience with data analysis, coding, and result interpretation.
- Proficiency in coding languages such as SQL or Python.
- Prior experience with ML techniques for recommendation engine models in healthcare.
- Expertise in feature engineering, feature selection, and model validation on big data.
- Familiarity with cloud technology, particularly AWS, and tools such as Tableau and Power BI.

At Lilly, we are committed to promoting workplace diversity and providing equal opportunities for all individuals, including those with disabilities. If you require accommodation during the application process, please complete the accommodation request form on our website. Join us at Lilly and be part of a team dedicated to making a difference in the lives of people worldwide.
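A generic hyperparameter-tuning sketch of the kind mentioned above, assuming scikit-learn and an already-engineered feature matrix X and label vector y; the estimator and parameter grid are illustrative and not Lilly's actual model.

from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

def tune_model(X, y):
    # Candidate hyperparameter values to search over (placeholders)
    grid = {
        "n_estimators": [100, 300],
        "max_depth": [2, 3, 4],
        "learning_rate": [0.05, 0.1],
    }
    # 5-fold cross-validated grid search, scored by ROC AUC
    search = GridSearchCV(GradientBoostingClassifier(), grid, cv=5, scoring="roc_auc")
    search.fit(X, y)
    print("Best parameters:", search.best_params_)
    return search.best_estimator_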
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have at least 10 years of experience and be proficient in setting up, configuring, and integrating API gateways in AWS. Your expertise should include API frameworks, XML/JSON, REST, and data protection across software design, build, test, and documentation. Experience with AWS services such as Lambda, S3, CloudFront (CDN), SQS, SNS, EventBridge, API Gateway, Glue, and RDS is required, and you should be able to articulate and implement projects using these services effectively. Your role will involve improving business processes through effective integration solutions.

Location: Bangalore, Chennai, Pune, Mumbai, Noida
Notice Period: Immediate joiner

If you meet the requirements above, please apply for this position with your resume and contact details.
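As a hedged sketch of the API Gateway setup work described above (not the employer's actual configuration), the snippet below wires a REST method to a Lambda function with boto3. The account ID, region, function name, and path are placeholders; a real setup would also grant lambda:InvokeFunction permission and create a deployment stage.

import boto3

apigw = boto3.client("apigateway", region_name="ap-south-1")
lambda_arn = "arn:aws:lambda:ap-south-1:123456789012:function:orders-handler"  # placeholder ARN

# Create the API and find its root ("/") resource
api = apigw.create_rest_api(name="orders-api")
root_id = apigw.get_resources(restApiId=api["id"])["items"][0]["id"]

# Add a /orders resource with a GET method proxied to the Lambda function
orders = apigw.create_resource(restApiId=api["id"], parentId=root_id, pathPart="orders")
apigw.put_method(restApiId=api["id"], resourceId=orders["id"],
                 httpMethod="GET", authorizationType="NONE")
apigw.put_integration(
    restApiId=api["id"], resourceId=orders["id"], httpMethod="GET",
    type="AWS_PROXY", integrationHttpMethod="POST",
    uri=f"arn:aws:apigateway:ap-south-1:lambda:path/2015-03-31/functions/{lambda_arn}/invocations",
)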
Posted 2 months ago