15.0 - 20.0 years
13 - 18 Lacs
Coimbatore
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills: AWS Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across the organization, contributing to the overall efficiency and effectiveness of data management practices.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in AWS Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud data storage solutions and architectures.
- Ability to analyze and optimize data workflows for performance.
Additional Information:
- The candidate should have minimum 5 years of experience in AWS Analytics.
- This position is based in Pune.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
10 - 18 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
About the Role: We are seeking a passionate and experienced Subject Matter Expert and Trainer to deliver our comprehensive Data Engineering with AWS program. This role combines deep technical expertise with the ability to coach, mentor, and empower learners to build strong capabilities in data engineering, cloud services, and modern analytics tools. If you have a strong background in data engineering and love to teach, this is your opportunity to create impact by shaping the next generation of cloud data professionals.
Key Responsibilities:
Deliver end-to-end training on the Data Engineering with AWS curriculum, including:
- Oracle SQL and ANSI SQL
- Data Warehousing Concepts, ETL & ELT
- Data Modeling and Data Vault
- Python programming for data engineering
- AWS Fundamentals (EC2, S3, Glue, Redshift, Athena, Kinesis, etc.)
- Apache Spark and Databricks
- Data Ingestion, Processing, and Migration Utilities
- Real-time Analytics and Compute Services (Airflow, Step Functions)
Facilitate engaging sessions, both virtual and in-person, and adapt instructional methods to suit diverse learning styles. Guide learners through hands-on labs, coding exercises, and real-world projects. Assess learner progress through evaluations, assignments, and practical assessments. Provide mentorship, resolve doubts, and inspire confidence in learners. Collaborate with the program management team to continuously improve course delivery and learner experience. Maintain up-to-date knowledge of AWS and data engineering best practices.
Ideal Candidate Profile:
Experience: Minimum 5-8 years in Data Engineering, Big Data, or Cloud Data Solutions. Prior experience delivering technical training or conducting workshops is strongly preferred.
Technical Expertise: Proficiency in SQL, Python, and Spark. Hands-on experience with AWS services: Glue, Redshift, Athena, S3, EC2, Kinesis, and related tools. Familiarity with Databricks, Airflow, Step Functions, and modern data pipelines.
Certifications: AWS certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.
Soft Skills: Excellent communication, facilitation, and interpersonal skills. Ability to break down complex concepts into simple, relatable examples. Strong commitment to learner success and outcomes.
Email your application to: careers@edubridgeindia.in
Posted 2 months ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad
Work from Office
- 8-12 years of cloud software development experience under Agile development life cycle processes and tools
- Experience in microservices architecture
- Experience in NodeJS applications
- Strong knowledge of AWS Cloud Platform services such as AWS Lambda, GraphQL, NoSQL databases, AWS Kinesis, and queues/topics (SNS/SQS)
- Strong in Object Oriented Analysis & Design (OOAD); programming languages used for cloud: Java, C++, Go
- Strong experience with PaaS and IaaS cloud computing
- Good hands-on experience with serverless frameworks, the AWS JS SDK, and AWS services such as Lambda, SNS, SES, SQS, SSM, S3, EC2, IAM, CloudWatch, Kinesis and CloudFormation
- Experience in Agile methodology with tools like JIRA, Git, GitLab, SVN, and Bitbucket as an active scrum member
- Good to have Okta or OAuth2 knowledge
- IoT and Sparkplug-B knowledge/work experience is an added advantage
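To make the serverless pattern named above concrete: below is a minimal, hypothetical sketch of a Lambda-style handler consuming an SQS batch event. The posting names NodeJS; Python is used here purely for illustration, and the event shape and field names (`Records`, `body`, `orderId`) are assumptions mirroring the standard SQS-to-Lambda integration, not this employer's code.

```python
import json

def handler(event: dict, context=None) -> dict:
    """Lambda-style entry point: process each SQS record in the batch."""
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # each SQS message body is assumed JSON
        processed.append(body["orderId"])
    return {"statusCode": 200, "processed": processed}

if __name__ == "__main__":
    # Simulate the SQS event envelope Lambda would deliver.
    event = {"Records": [{"body": json.dumps({"orderId": "A-1"})},
                         {"body": json.dumps({"orderId": "A-2"})}]}
    print(handler(event))  # {'statusCode': 200, 'processed': ['A-1', 'A-2']}
```

In a real deployment the same function would be wired to an SQS queue as an event source; locally it can be exercised by passing a hand-built event, as above.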
Posted 2 months ago
16.0 - 21.0 years
18 - 22 Lacs
Gurugram
Work from Office
About the Role: OSTTRA India - The Role: Enterprise Architect - Integration. The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals, who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute intensive applications, leveraging contemporary microservices, cloud-based architectures. The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets. What's in it for you: The current objective is to identify individuals with 16+ years of experience who have high expertise, to join their existing team of experts who are spread across the world. This is your opportunity to start at the beginning and get the advantages of rapid early growth. This role is based in Gurgaon and is expected to work with different teams and colleagues across multiple regions globally. Responsibilities: The role shall be responsible for establishing, maintaining, socialising and realising the target state integration strategy for the FX & Securities post trade businesses of OSTTRA. This shall encompass the post trade lifecycle of our businesses, including connectivity with clients, the markets ecosystem and OSTTRA's post trade family of networks, platforms and products. The role shall partner with product architects, product managers, delivery heads and teams for refactoring the deliveries towards the target state.
They shall be responsible for the efficiency, optimisation, oversight and troubleshooting of current-day integration solutions, platforms and deliveries as well, in addition to the target state focus. The role shall be expected to produce and maintain an integration architecture blueprint. This shall cover the current state and propose a rationalised view of the target state of end-to-end integration flows and patterns. The role shall also provide for and enable the needed technology platforms/tools and engineering methods to realise the strategy. The role shall enable standardisation of protocols/formats (at least within the OSTTRA world) and tools, and reduce duplication and non-differentiated heavy lifting in systems. The role shall enable the documentation of flows and the capture of standard message models. The integration strategy shall also include a transformation strategy, which is so vital in a multi-lateral / multi-party / multi-system post trade world. The role shall partner with other architects and strategies/programmes and enable the demands of UI, application, and data strategies. What We're Looking For: Rich domain experience of the financial services industry, preferably with financial markets, pre/post trade life cycles and large-scale buy/sell/brokerage organisations. Should have experience of leading integration strategies and delivering the integration design and architecture for complex programmes and financial enterprises catering to key variances of latency / throughput. Experience with API management platforms (like AWS API Gateway, Apigee, Kong, MuleSoft Anypoint) and key management concepts (API lifecycle management, versioning strategies, developer portals, rate limiting, policy enforcement). Should be adept with integration & transformation methods, technologies and tools. Should have experience of domain modelling for messages / events / streams and APIs.
Rich experience of architectural patterns like event-driven architectures, microservices, event streaming, message processing/orchestration, CQRS, event sourcing, etc. Experience of protocols or integration technologies like FIX, Swift, MQ, FTP, API, etc., including knowledge of authentication patterns (OAuth, mTLS, JWT, API keys), authorization mechanisms, data encryption (in transit and at rest), secrets management, and security best practices. Experience of messaging formats and paradigms like XSD, XML, XSLT, JSON, Protobuf, REST, gRPC, GraphQL, etc. Experience of technology like Kafka or AWS Kinesis, Spark streams, Kubernetes / EKS, AWS EMR. Experience of languages like Java, Python and message orchestration frameworks like Apache Camel, Apache NiFi, AWS Step Functions, etc. Experience in designing and implementing traceability/observability strategies for integration systems and familiarity with relevant framework tooling. Experience of engineering methods like CI/CD, build/deploy automation, infra as code and integration testing methods and tools. Should have the appetite to review/code for complex problems and should find interest/energy in doing design discussions and reviews. Experience and strong understanding of multicloud integration patterns. The Location: Gurgaon, India. About Company Statement: About OSTTRA: Candidates should note that OSTTRA is an independent firm, jointly owned by S&P Global and CME Group. As part of the joint venture, S&P Global provides recruitment services to OSTTRA - however, successful candidates will be interviewed and directly employed by OSTTRA, joining our global team of more than 1,200 post-trade experts. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. OSTTRA is a joint venture, owned 50/50 by S&P Global and CME Group.
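The event-sourcing pattern named among the architectural patterns above can be sketched in a few lines. This is a generic, hypothetical illustration (an account ledger whose state is a fold over an append-only event log), not OSTTRA's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Event-sourced aggregate: state is derived, never stored directly."""
    events: list = field(default_factory=list)  # append-only event log

    def record(self, kind: str, amount: int) -> None:
        # Commands append immutable events; nothing is updated in place.
        self.events.append({"kind": kind, "amount": amount})

    def balance(self) -> int:
        # Current state is a pure fold (replay) over the full history.
        total = 0
        for e in self.events:
            total += e["amount"] if e["kind"] == "credit" else -e["amount"]
        return total

if __name__ == "__main__":
    ledger = Ledger()
    ledger.record("credit", 100)
    ledger.record("debit", 30)
    print(ledger.balance())  # 70
```

Because the log is the source of truth, the same replay mechanism supports audit, temporal queries, and rebuilding read models, which is what pairs event sourcing with CQRS in practice.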
Joining the OSTTRA team is a unique opportunity to help build a bold new business with an outstanding heritage in financial technology, playing a central role in supporting global financial markets. Learn more at www.osttra.com. What's In It For You - Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries ----------------------------------------------------------- Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf -----------------------------------------------------------
Posted 2 months ago
9.0 - 13.0 years
32 - 40 Lacs
Ahmedabad
Remote
About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.
Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India
Key Responsibilities:
- Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
- Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
- Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
- Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
- Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
- Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
- Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
- Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
- Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Required Skills:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Minimum 7 to 15 years of experience in data architecture or related roles.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, FiveTran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
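The performance-tuning point above (indexing and query optimization) can be shown with a self-contained sketch. SQLite stands in here for the MySQL/PostgreSQL-class engines the posting names, and the table and index names are made up for illustration; the query planner's plan changes once an index covers the filtered column:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = ?"

# Plan without an index: the last column of each plan row is the description.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# Plan with the index in place.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(before[0][-1])  # typically a full scan, e.g. "SCAN orders"
print(after[0][-1])   # an index search referencing idx_orders_customer
```

The same diagnostic loop (inspect plan, add or adjust an index, re-inspect) is the core of the tuning work described, whatever the engine.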
Posted 3 months ago
4.0 - 7.0 years
25 - 27 Lacs
Bengaluru
Remote
4+ years of experience as a Data Engineer/Scientist, with hands-on experience working on data warehousing, data ingestion, data processing, and data lakes. Must have strong development experience using Python and SQL, and an understanding of data orchestration tools like Airflow. Required Candidate Profile: Experience with data extraction techniques (CDC, batch-based) and tools such as Debezium, Kafka Connect, and AWS DMS; queuing/messaging systems (SQS, RabbitMQ, Kinesis); and AWS data/ML services (AWS Glue, MWAA, Athena, Redshift).
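To illustrate the CDC techniques listed above: a minimal sketch that applies Debezium-style change events to an in-memory replica. The event shape (op codes "c"/"u"/"d" with "before"/"after" row images) follows Debezium's convention; the table layout and key field are hypothetical:

```python
def apply_cdc_event(table: dict, event: dict) -> None:
    """Apply one change event to a replica dict keyed on the row's 'id'."""
    op = event["op"]
    if op in ("c", "u"):              # create or update: upsert the 'after' image
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":                   # delete: remove by the 'before' image's key
        table.pop(event["before"]["id"], None)

if __name__ == "__main__":
    replica: dict = {}
    events = [
        {"op": "c", "after": {"id": 1, "name": "alice"}},
        {"op": "c", "after": {"id": 2, "name": "bob"}},
        {"op": "u", "before": {"id": 1, "name": "alice"},
                    "after": {"id": 1, "name": "alice2"}},
        {"op": "d", "before": {"id": 2, "name": "bob"}},
    ]
    for e in events:
        apply_cdc_event(replica, e)
    print(replica)  # {1: {'id': 1, 'name': 'alice2'}}
```

In production the events would arrive via Kafka Connect or AWS DMS rather than a list, but the consumer-side logic (upsert on c/u, delete on d) is the same idea.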
Posted 3 months ago
5.0 - 6.0 years
7 - 8 Lacs
Hyderabad
Work from Office
Assurance & Quality (AQ) Engineer
Department: Quality Assurance / Engineering
Location: Remote
Role Purpose: To ensure the highest quality standards are met for live streaming and video surveillance solutions, especially in CCTV systems. The AQ Engineer will be responsible for validating compliance with STQC (Standardization Testing and Quality Certification) norms and will perform thorough testing and verification of live video streaming functionalities, encoding/decoding quality, latency, resilience, and compatibility.
Key Responsibilities:
- Conduct functional, performance, and security testing on live streaming and CCTV video systems.
- Ensure systems comply with STQC guidelines, including for government-grade video surveillance.
- Validate end-to-end video streaming workflows – capture, encoding, transmission, and playback.
- Test latency, frame rate consistency, resolution standards, and night vision capabilities.
- Ensure compatibility across various video codecs (H.264, H.265) and streaming protocols (RTSP, RTMP, ONVIF).
- Develop and maintain automated test scripts for regression and stress testing of video systems.
- Collaborate with development, network, and security teams to resolve quality issues.
- Conduct compliance and certification testing to ensure product readiness for deployment.
- Generate quality reports, issue logs, and root cause analysis documentation.
Required Skills & Experience:
- Bachelor's Degree in Electronics, Computer Science, or related discipline.
- STQC Certification or experience working with STQC standards is mandatory.
- 3–6 years of experience in QA for video surveillance, CCTV, or live streaming environments.
- Strong understanding of video streaming protocols and CCTV camera technologies.
- Familiarity with tools like Wireshark, VLC, ONVIF Device Manager, and IP video testers.
- Experience with CCTV integration platforms, video analytics, and storage systems (NVR/DVR).
- Good scripting skills (Python, Shell) for test automation preferred.
- Excellent problem-solving skills and attention to detail.
Preferred Qualifications:
- Knowledge of cybersecurity practices for video surveillance.
- Experience with cloud-based video management systems (AWS Kinesis Video, Azure Media Services).
- Exposure to AI-based video analytics and real-time video alert systems.
Key Performance Indicators (KPIs):
- Number of defects identified pre-production
- Compliance rate with STQC guidelines
- Stream quality consistency (uptime, resolution, frame drop rate)
- Test automation coverage and execution speed
- Turnaround time for issue resolution
Posted 3 months ago
5.0 - 10.0 years
8 - 12 Lacs
Kochi
Work from Office
Job Title: Data Scientist
Location: Kochi, Coimbatore, Trivandrum
Must have skills: Big Data, Python or R
Good to have skills: Scala, SQL
Job Summary: A Data Scientist is expected to be hands-on to deliver end-to-end on projects undertaken in the Analytics space. They must have a proven ability to drive business results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes.
Roles and Responsibilities:
- Identify valuable data sources and collection processes
- Supervise preprocessing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns for the insurance industry
- Build predictive models and machine-learning algorithms
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Collaborate with engineering and product development teams
- Hands-on knowledge of implementing various AI algorithms and best-fit scenarios
- Has worked on Generative AI based implementations
Professional and Technical Skills:
- 3.5-5 years experience in Analytics systems/program delivery; at least 2 Big Data or Advanced Analytics project implementations
- Experience using statistical computer languages (R, Python, SQL, PySpark, etc.) to manipulate data and draw insights from large data sets; familiarity with Scala, Java or C++
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications
- Hands-on experience in Azure/AWS analytics platforms (3+ years)
- Experience using variations of Databricks or similar analytical applications in AWS/Azure
- Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
- Strong mathematical skills (e.g. statistics, algebra)
- Excellent communication and presentation skills
- Deploying data pipelines in production based on Continuous Delivery practices
Additional Information:
- Multi-industry domain experience
- Expert in Python, Scala, SQL
- Knowledge of Tableau/Power BI or similar self-service visualization tools
- Interpersonal and team skills should be top notch
- Nice to have leadership experience in the past
Qualification:
Experience: 3.5-5 years of experience is required
Educational Qualification: Graduation
Posted 3 months ago
2.0 - 4.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Description: Enphase Energy is a global energy technology company and leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, Enphase transformed the solar industry with our revolutionary microinverter technology, which turns sunlight into a safe, reliable, resilient, and scalable source of energy to power our lives. Today, the Enphase Energy System helps people make, use, save, and sell their own power. Enphase is also one of the fastest growing and innovative clean energy companies in the world, with approximately 68 million products installed across more than 145 countries. We are building teams that are designing, developing, and manufacturing next-generation energy technologies and our work environment is fast-paced, fun and full of exciting new projects. If you are passionate about advancing a more sustainable future, this is the perfect time to join Enphase!
About the role: At Enphase, we think big. We're on a mission to bring solar energy to the next level, one where it's ready to meet the energy demands of an entire globe. As we work towards our vision for a solar-powered planet, we need visionary and talented people to join our team as Senior Back-End engineers. The Back-End engineer will develop, maintain, architect, and expand cloud microservices for the EV (Electric Vehicle) Business team. The codebase uses Java, Spring Boot, Mongo, REST APIs, and MySQL. Applications are dockerized and hosted in AWS using a plethora of AWS services.
What you will be doing:
- Programming in Java + Spring Boot
- REST APIs with JSON, XML etc. for data transfer
- Multiple database proficiency including SQL and NoSQL (Cassandra, MongoDB)
- Ability to develop both internal-facing and external-facing APIs using JWT and OAuth2.0
- Familiar with HA/DR, scalability, performance, code optimizations
- Experience working with high-performance and high-throughput systems
- Ability to define, track and deliver items to one's own schedule
- Good organizational skills and the ability to work on more than one project at a time
- Exceptional attention to detail and good communication skills
Who you are and what you bring:
- B.E/B.Tech in Computer Science from a top tier college and >70% marks
- More than 4 years of overall Back-End development experience
- Experience with SQL + NoSQL (preferably MongoDB)
- Experience with Amazon Web Services, JIRA, Confluence, Git, Bitbucket etc.
- Ability to work independently and as part of a project team
- Strong organizational skills, proactive, and accountable
- Excellent critical thinking and analytical problem-solving skills
- Ability to establish priorities and proceed with objectives without supervision
- Ability to communicate effectively and accurately, with clear, concise written project status updates throughout the project lifecycle
- Highly skilled at facilitating and documenting requirements
- Excellent facilitation, collaboration, and presentation skills
- Comfort with ambiguity, frequent change, or unpredictability
- Good practice of writing clean and scalable code
- Good understanding of cloud technologies, such as Docker, Kubernetes, EKS, Kafka, AWS Kinesis etc.
- Knowledge of NoSQL database systems like MongoDB or CouchDB, including graph databases
- Ability to work in a fast-paced environment
- Exposure or knowledge in Renewable Tech companies
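On the JWT-based API authentication mentioned above: a JWT is three base64url segments (header.payload.signature). The sketch below is a hypothetical inspection helper that decodes a token's payload without verifying the signature, which is fine for debugging but never for authentication decisions; the claim names used are made up for illustration:

```python
import base64
import json

def b64url(data: dict) -> str:
    """Encode a dict as an unpadded base64url JSON segment (JWT-style)."""
    return base64.urlsafe_b64encode(json.dumps(data).encode()).decode().rstrip("=")

def decode_jwt_payload(token: str) -> dict:
    """Decode the payload segment only; does NOT check the signature."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

if __name__ == "__main__":
    # Build a toy token; the signature segment is a placeholder.
    token = ".".join([b64url({"alg": "HS256", "typ": "JWT"}),
                      b64url({"sub": "user-1", "scope": "read"}),
                      "sig-placeholder"])
    print(decode_jwt_payload(token))  # {'sub': 'user-1', 'scope': 'read'}
```

In a real service, signature verification (HS256/RS256) and expiry checks would be handled by a vetted JWT library rather than hand-rolled code like this.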
Posted 3 months ago
0.0 - 3.0 years
3 - 6 Lacs
Hyderabad
Work from Office
In this vital role you will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting of in vitro assays and in vivo / pre-clinical studies, as well as those that manage compound inventories / biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).
Roles & Responsibilities:
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements
- Analyze and understand the functional and technical requirements of applications, solutions and systems and translate them into software architecture and design specifications
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software
- Identify and resolve software bugs and performance issues
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time
- Maintain documentation of software designs, code, and development processes
- Customize modules to meet specific business requirements
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently
- Contribute to both front-end and back-end development using cloud technology
- Develop innovative solutions using generative AI technologies
- Identify and resolve technical challenges effectively
- Work closely with the product team, the business team including scientists, and other stakeholders
What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.
Basic Qualifications:
- Bachelor's degree and 0 to 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field OR
- Diploma and 4 to 7 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field
Preferred Qualifications:
- Experience in implementing and supporting biopharma scientific software platforms
Functional Skills:
Must-Have Skills:
- Proficient in a general purpose high level language (e.g. Python, Java, C#.NET)
- Proficient in a JavaScript UI framework (e.g. React, ExtJS)
- Proficient in SQL (e.g. Oracle, Postgres, Databricks)
- Experience with event-based architecture (e.g. Mulesoft, AWS EventBridge, AWS Kinesis, Kafka)
Good-to-Have Skills:
- Strong understanding of software development methodologies, mainly Agile and Scrum
- Hands-on experience with full stack software development
- Strong understanding of cloud platforms (e.g. AWS) and containerization technologies (e.g., Docker, Kubernetes)
- Working experience with DevOps practices and CI/CD pipelines
- Experience with big data technologies (e.g., Spark, Databricks)
- Experience with API integration, serverless, microservices architecture (e.g. Mulesoft, AWS Kafka)
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
- Experience with infrastructure as code (IaC) tools (Terraform, CloudFormation)
- Experience with version control systems like Git
- Experience with automated testing tools and frameworks
- Experience with Benchling
Professional Certifications (please mention if the certification is preferred or mandatory for the role):
- AWS Certified Cloud Practitioner preferred
Soft Skills:
- Excellent problem solving, analytical, and troubleshooting skills
- Strong communication and interpersonal skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to learn quickly & work independently
- Team-oriented, with a focus on achieving team goals
- Ability to manage multiple priorities successfully
- Strong presentation and public speaking skills
Posted 3 months ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Twilio Segment Customer Engagement Platform
Good to have skills: AWS Analytics, Mule Enterprise Service Bus
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing application features, and ensuring that the solutions align with the overall business objectives. You will also participate in testing and debugging processes to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Collaborate with stakeholders to gather and analyze requirements for application development.
- Implement best practices in coding and application design to ensure high-quality deliverables.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Twilio Segment Customer Engagement Platform.
- Good To Have Skills: Experience with AWS Analytics.
- Strong understanding of application development methodologies.
- Experience with API integration and management.
- Familiarity with cloud-based application deployment and management.
Additional Information:
- The candidate should have minimum 3 years of experience in Twilio Segment Customer Engagement Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 3 months ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Twilio Segment Customer Engagement Platform
Good to have skills: AWS Analytics
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Twilio Segment Customer Engagement Platform.
- Good To Have Skills: Experience with AWS Analytics.
- Strong understanding of application development methodologies.
- Experience with API integration and management.
- Familiarity with cloud-based solutions and deployment strategies.
Additional Information:
- The candidate should have minimum 7.5 years of experience in Twilio Segment Customer Engagement Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 3 months ago
10.0 - 20.0 years
35 - 60 Lacs
Mumbai, India
Work from Office
Design Full Stack solutions with cloud infrastructure (IaaS, PaaS, SaaS, on-premise, hybrid cloud).
- Support application and infrastructure design and build as a subject matter expert.
- Implement proofs of concept to demonstrate the value of the solution designed.
- Provide consulting support to ensure delivery teams build scalable, extensible, highly available, low-latency, and highly usable applications.
- Ensure solutions are aligned with requirements from all stakeholders, such as consumers, business, IT, security, and compliance.
- Ensure that all enterprise IT parameters and constraints are considered as part of the design.
- Design an appropriate technical solution to meet business requirements, which may involve hybrid cloud environments including cloud-native architecture, microservices, etc.
- Working knowledge of a high-availability, low-latency end-to-end technology stack is especially important, using both physical and virtual load balancing, caching, and scaling technology.
- Awareness of full-stack web development frameworks such as Angular / React / Vue.
- Awareness of relational and non-relational / NoSQL databases such as MongoDB / MS SQL / Cassandra / Neo4j / DynamoDB.
- Awareness of data streaming platforms such as Apache Kafka / Apache Flink / AWS Kinesis.
- Working experience using AWS Step Functions or Azure Logic Apps with serverless Lambda or Azure Functions.
- Optimizes and incorporates the inputs of specialists in solution design.
- Establishes the validity of a solution and its components, with both short-term and long-term implications.
- Identifies the scalability options and implications on IT strategy and/or related implications of a solution, and includes these in design activities and planning.
- Builds strong professional relationships with key IT and business executives; is a trusted advisor for cross-functional and management teams.
- Partners effectively with other teams to ensure problem resolution.
- Provides solutions and advice, and creates architecture documents and presentations (PPTs).
- Documents and effectively transfers knowledge to internal and external stakeholders.
- Demonstrates knowledge of public cloud technology and solutions; applies a broad understanding of technical innovations and trends in solving business problems.
- Manages special projects and strategic initiatives as assigned by management.
- Implements and assists in developing policies for information security and environmental compliance, ensuring the highest standards are maintained.
- Ensures adherence to SLAs with internal and external customers and compliance with information security policies, including risk assessments and procedure reviews.
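One item above pairs AWS Step Functions with serverless Lambda. A Step Functions workflow is defined in Amazon States Language, a JSON document listing named states and transitions. A minimal two-step sketch (the Lambda ARNs are placeholders, not real resources):

```python
import json

# Skeleton of an AWS Step Functions definition in Amazon States Language.
# Each "Task" state invokes a Lambda function; "Next"/"End" define the flow.
state_machine = {
    "StartAt": "Validate",
    "States": {
        "Validate": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:validate",
            "Next": "Process",
        },
        "Process": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:process",
            "End": True,
        },
    },
}

definition = json.dumps(state_machine)  # this JSON is what gets deployed
```

Real definitions add `Retry`, `Catch`, and `Choice` states for error handling and branching, which is where the orchestration value over a single Lambda lies.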
Posted 3 months ago
4.0 - 8.0 years
10 - 20 Lacs
Nagpur, Pune
Work from Office
Roles & Responsibilities:
- Develop and maintain backend services using Node.js.
- Implement event-driven architectures with Apache Kafka or AWS Kinesis for real-time data processing.
- Deploy and manage containerized applications using Docker.
- Design and manage MongoDB databases for efficient data storage and retrieval.
- Work with AWS services (e.g., load balancers, EC2, S3, Lambda, API Gateway) for scalable cloud solutions.
- Integrate MQTT protocols for IoT and messaging-based applications.
- Configure and maintain Linux-based production environments.
- Read and analyse existing Java code for integration and troubleshooting purposes.
- Implement secure authentication using OpenID Connect (OIDC).
- Collaborate with development teams to improve system reliability and performance.
Must-have Technical Skills:
- Strong proficiency in Node.js, with experience in frameworks like Express.js or NestJS.
- Hands-on experience with Apache Kafka and event-driven systems.
- Experience with AWS services, including compute, storage, and networking solutions.
- Docker and container orchestration experience for scalable deployments.
- Experience with MongoDB, including schema design, indexing, and performance optimization.
- Basic understanding of Java, with the ability to read and analyse code.
Good to have:
- Experience with Kubernetes for managing containers.
- Analysis of Quarkus microservices to ensure best practices and efficiency.
- Understanding of Terraform or CloudFormation for infrastructure setup.
- Knowledge of serverless computing (AWS Lambda, Azure Functions).
- AWS certification is a plus.
Key Competencies:
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.
- Ability to work collaboratively in cross-functional teams.
- Ability to write clean, well-documented, and efficient code.
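The event-driven work described above usually reduces to one pattern: consume messages from a broker and handle each one idempotently, because Kafka and Kinesis both offer at-least-once delivery. The posting targets Node.js; this language-agnostic sketch uses Python with an in-memory queue standing in for a real broker client (event shape and ids are illustrative):

```python
import json
from collections import deque

def process_events(queue, handler):
    """Drain an event queue, deduplicating by event id so that redelivered
    messages (at-least-once semantics) are handled exactly once."""
    seen = set()
    results = []
    while queue:
        event = json.loads(queue.popleft())
        if event["id"] in seen:   # broker redelivery: skip, handling is idempotent
            continue
        seen.add(event["id"])
        results.append(handler(event))
    return results

queue = deque([
    json.dumps({"id": 1, "type": "order.created", "amount": 10}),
    json.dumps({"id": 2, "type": "order.created", "amount": 5}),
    json.dumps({"id": 1, "type": "order.created", "amount": 10}),  # duplicate delivery
])
totals = process_events(queue, lambda e: e["amount"])  # duplicate is skipped
```

With a real Kafka consumer the dedup set would live in a durable store keyed by event id, since the process can restart between deliveries.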
Posted 3 months ago
4.0 - 6.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Specialist Software Engineer - Biological Studies (Translational Sciences)
What you will do:
In this vital role you will be responsible for designing, developing, and maintaining software solutions for research scientists. Additionally, it involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements LIMS platforms that enable the capture, analysis, storage, and reporting of pre-clinical and clinical studies, as well as those that manage biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).
- Take ownership of complex software projects from conception to deployment.
- Manage software delivery scope, risk, and timeline.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Contribute to both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Conduct code reviews to ensure code quality and adherence to best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team (including scientists), and other stakeholders.
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Identify and resolve software bugs and performance issues.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Customize modules to meet specific business requirements.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
What we expect of you:
Master's degree and 4 to 6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR Bachelor's degree and 6 to 8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR Diploma and 10 to 12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.
Basic Qualifications:
- Proficient in Java.
- Proficient in a JavaScript UI framework (e.g., React, ExtJS).
- Proficient in SQL (e.g., Oracle, Postgres, Databricks).
- Experience with event-based architecture (e.g., Mulesoft, AWS EventBridge, AWS Kinesis, Kafka).
Preferred Qualifications:
- 3+ years of experience in implementing and supporting LIMS platforms.
- Strong understanding of software development methodologies, mainly Agile and Scrum.
- Hands-on experience with Full Stack software development.
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes).
- Working experience with DevOps practices and CI/CD pipelines.
- Experience with big data technologies (e.g., Spark, Databricks).
- Experience with API integration, serverless, and microservices architecture (e.g., Mulesoft, AWS Kafka).
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk).
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation).
- Experience with version control systems like Git.
- Experience with automated testing tools and frameworks.
- Experience with STARLIMS, Watson LIMS, LabVantage, or similar LIMS platforms.
Professional Certifications: AWS Certified Cloud Practitioner (preferred).
Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and interpersonal skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to learn quickly and work independently.
- Team-oriented, with a focus on achieving team goals.
- Ability to manage multiple priorities successfully.
- Strong presentation and public speaking skills.
Posted 3 months ago
1.0 - 3.0 years
3 - 5 Lacs
Hyderabad
Work from Office
What you will do:
In this vital role you will be responsible for designing, developing, and maintaining software solutions for research scientists. Additionally, it involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements LIMS platforms that enable the capture, analysis, storage, and reporting of pre-clinical and clinical studies, as well as those that manage biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Identify and resolve software bugs and performance issues.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Customize modules to meet specific business requirements.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Contribute to both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team (including scientists), and other stakeholders.
What we expect of you:
Master's degree and 1 to 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR Diploma and 7 to 9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.
Basic Qualifications:
- Proficient in C#.NET.
- Proficient in a JavaScript UI framework (e.g., React, ExtJS).
- Proficient in SQL (e.g., Oracle, Postgres, Databricks).
- Experience with event-based architecture (e.g., Mulesoft, AWS EventBridge, AWS Kinesis, Kafka).
Preferred Qualifications:
- 1+ years of experience in implementing and supporting LIMS platforms.
- Strong understanding of software development methodologies, mainly Agile and Scrum.
- Hands-on experience with Full Stack software development.
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes).
- Working experience with DevOps practices and CI/CD pipelines.
- Experience with big data technologies (e.g., Spark, Databricks).
- Experience with API integration, serverless, and microservices architecture (e.g., Mulesoft, AWS Kafka).
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk).
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation).
- Experience with version control systems like Git.
- Experience with automated testing tools and frameworks.
- Experience with STARLIMS, Watson, LabVantage, or similar LIMS platforms.
Professional Certifications: AWS Certified Cloud Practitioner (preferred).
Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and interpersonal skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to learn quickly and work independently.
- Team-oriented, with a focus on achieving team goals.
- Ability to manage multiple priorities successfully.
- Strong presentation and public speaking skills.
Posted 3 months ago
12.0 - 17.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS Analytics
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the application development process.
- Ensure effective communication within the team and with stakeholders.
- Provide guidance and support to team members.
Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Analytics.
- Strong understanding of cloud computing principles.
- Experience in designing and implementing scalable applications.
- Knowledge of data analytics and visualization tools.
- Hands-on experience with AWS services.
- Familiarity with DevOps practices.
Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS Analytics.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 3 months ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Snowflake Database Administrator (DBA)
Summary: We are seeking a highly skilled and experienced Snowflake Database Administrator (DBA) to join our team. The ideal candidate will be responsible for the administration, management, and optimization of our Snowflake data platform. The role requires strong expertise in database design, performance tuning, security, and data governance within the Snowflake environment.
Key Responsibilities:
- Administer and manage Snowflake cloud data warehouse environments, including provisioning, configuration, monitoring, and maintenance.
- Implement security policies, compliance, and access controls.
- Manage Snowflake accounts and databases in a multi-tenant environment.
- Monitor the systems and provide proactive solutions to ensure high availability and reliability.
- Monitor and manage Snowflake costs.
- Collaborate with developers, support engineers, and business stakeholders to ensure efficient data integration.
- Automate database management tasks and procedures to improve operational efficiency.
- Stay up to date with the latest Snowflake features, best practices, and industry trends to enhance the overall data architecture.
- Develop and maintain documentation, including database configurations, processes, and standard operating procedures.
- Support disaster recovery and business continuity planning for Snowflake environments.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in Snowflake operations and administration.
- Strong knowledge of SQL, query optimization, and performance tuning techniques.
- Experience in managing security, access controls, and data governance in Snowflake.
- Familiarity with AWS.
- Proficiency in Python or Bash.
- Experience in automating database tasks using Terraform, CloudFormation, or similar tools.
- Understanding of data modeling concepts and experience working with structured and semi-structured data (JSON, Avro, Parquet).
- Strong analytical, problem-solving, and troubleshooting skills.
- Excellent communication and collaboration abilities.
Preferred Qualifications:
- Snowflake certification (e.g., SnowPro Core; SnowPro Advanced: Architect or Administrator).
- Experience with CI/CD pipelines and DevOps practices for database management.
- Knowledge of machine learning and analytics workflows within Snowflake.
- Hands-on experience with data streaming technologies (Kafka, AWS Kinesis, etc.).
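Cost monitoring in Snowflake typically means querying the ACCOUNT_USAGE views, e.g. per-warehouse credit consumption from WAREHOUSE_METERING_HISTORY. A minimal sketch of the query a DBA might schedule (view and column names follow Snowflake's documented ACCOUNT_USAGE schema; the connection and execution step is omitted, so only the query builder is shown):

```python
def warehouse_credit_query(days: int = 7) -> str:
    """Build a SQL query for per-warehouse credit usage over the last `days` days."""
    return (
        "SELECT warehouse_name, SUM(credits_used) AS credits\n"
        "FROM snowflake.account_usage.warehouse_metering_history\n"
        f"WHERE start_time >= DATEADD('day', -{days}, CURRENT_TIMESTAMP())\n"
        "GROUP BY warehouse_name\n"
        "ORDER BY credits DESC"
    )

sql = warehouse_credit_query(30)  # last 30 days of warehouse spend
```

In practice the result would feed an alert when a warehouse's credits exceed a budget threshold; note that ACCOUNT_USAGE views have some ingestion latency, so they suit daily reporting rather than real-time alerting.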
Posted 3 months ago
1.0 - 3.0 years
1 - 3 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Roles & Responsibilities:
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Identify and resolve software bugs and performance issues.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Customize modules to meet specific business requirements.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Contribute to both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team (including scientists), and other stakeholders.
What we expect of you:
We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.
Basic Qualifications:
- Master's degree and 1 to 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR
- Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR
- Diploma and 7 to 9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.
Preferred Qualifications:
- Experience in implementing and supporting biopharma scientific software platforms.
Functional Skills:
Must-Have Skills:
- Proficient in a general-purpose high-level language (e.g., Python, Java, C#.NET).
- Proficient in a JavaScript UI framework (e.g., React, ExtJS).
- Proficient in SQL (e.g., Oracle, Postgres, Databricks).
- Experience with event-based architecture (e.g., Mulesoft, AWS EventBridge, AWS Kinesis, Kafka).
Good-to-Have Skills:
- Strong understanding of software development methodologies, mainly Agile and Scrum.
- Hands-on experience with Full Stack software development.
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes).
- Working experience with DevOps practices and CI/CD pipelines.
- Experience with big data technologies (e.g., Spark, Databricks).
- Experience with API integration, serverless, and microservices architecture (e.g., Mulesoft, AWS Kafka).
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk).
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation).
- Experience with version control systems like Git.
- Experience with automated testing tools and frameworks.
- Experience with the implementation of LIMS/ELN platforms such as Benchling, Revvity, IDBS, STARLIMS, Watson, LabVantage, etc.
- Experience handling GxP data and system validation, and knowledge of regulatory requirements affecting laboratory data (e.g., FDA 21 CFR Part 11, GLP, GCP).
Professional Certifications: AWS Certified Cloud Practitioner (preferred).
Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and interpersonal skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to learn quickly and work independently.
- Team-oriented, with a focus on achieving team goals.
- Ability to manage multiple priorities successfully.
- Strong presentation and public speaking skills.
Posted 3 months ago
6.0 - 8.0 years
6 - 8 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
In this vital role you will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements LIMS platforms that enable the capture, analysis, storage, and reporting of pre-clinical and clinical studies, as well as those that manage biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).
Roles & Responsibilities:
- Take ownership of complex software projects from conception to deployment.
- Manage software delivery scope, risk, and timeline.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Contribute to both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Conduct code reviews to ensure code quality and adherence to best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team (including scientists), and other stakeholders.
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Identify and resolve software bugs and performance issues.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Customize modules to meet specific business requirements.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
What we expect of you:
We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.
Basic Qualifications:
- Doctorate degree; OR
- Master's degree and 4 to 6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR
- Bachelor's degree and 6 to 8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field; OR
- Diploma and 10 to 12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.
Preferred Qualifications:
- Experience in implementing and supporting biopharma scientific software platforms.
Functional Skills:
Must-Have Skills:
- Proficient in a general-purpose high-level language (e.g., Python, Java, C#.NET).
- Proficient in a JavaScript UI framework (e.g., React, ExtJS).
- Proficient in SQL (e.g., Oracle, Postgres, Databricks).
- Experience with event-based architecture (e.g., Mulesoft, AWS EventBridge, AWS Kinesis, Kafka).
Good-to-Have Skills:
- Strong understanding of software development methodologies, mainly Agile and Scrum.
- Hands-on experience with Full Stack software development.
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes).
- Working experience with DevOps practices and CI/CD pipelines.
- Experience with big data technologies (e.g., Spark, Databricks).
- Experience with API integration, serverless, and microservices architecture (e.g., Mulesoft, AWS Kafka).
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk).
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation).
- Experience with version control systems like Git.
- Experience with automated testing tools and frameworks.
- Experience with the implementation of LIMS/ELN platforms such as Benchling, Revvity, IDBS, STARLIMS, Watson, LabVantage, etc.
- Experience handling GxP data and system validation, and knowledge of regulatory requirements affecting laboratory data (e.g., FDA 21 CFR Part 11, GLP, GCP).
Professional Certifications: AWS Certified Cloud Practitioner (preferred).
Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and interpersonal skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to learn quickly and work independently.
- Team-oriented, with a focus on achieving team goals.
- Ability to manage multiple priorities successfully.
- Strong presentation and public speaking skills.
Posted 3 months ago
4.0 - 8.0 years
5 - 9 Lacs
Hyderabad, Bengaluru
Work from Office
What's in it for you?
- Pay above market standards.
- The role is contract-based, with project timelines from 2 to 12 months, or freelancing.
- Be part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office (Hyderabad or Bangalore).
Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.
Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficient in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).
Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.
What are the next steps?
- Register on our Soul AI website.
Posted 3 months ago
4.0 - 8.0 years
13 - 17 Lacs
Hyderabad, Bengaluru
Work from Office
Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.
Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficient in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).
Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.
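The real-time pipeline work listed above often reduces to windowed aggregation over an event stream. Engines like Flink and Spark provide this natively; a minimal pure-Python sketch of a tumbling (fixed, non-overlapping) window count shows the core idea (timestamps, keys, and the 10-second window are illustrative):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Assign (timestamp, key) events to fixed non-overlapping windows
    and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds  # floor to window
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in windows.items()}

events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
result = tumbling_window_counts(events, window_seconds=10)
# window [0, 10) holds two clicks and one view; window [10, 20) holds one click
```

A production version must also handle late-arriving events (watermarks) and persist window state, which is precisely what the streaming frameworks named in the posting manage for you.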
Posted 3 months ago
15.0 - 20.0 years
13 - 18 Lacs
Pune
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: AWS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across the organization, contributing to the overall efficiency and effectiveness of data management practices.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.
Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud data storage solutions and architectures.
- Ability to analyze and optimize data workflows for performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Analytics.
- This position is based in Pune.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 3 months ago
3.0 - 8.0 years
3 - 6 Lacs
Pune, Bengaluru
Work from Office
We are seeking a skilled and experienced Druid Developer to design, develop, and maintain real-time data analytics solutions using Apache Druid. The ideal candidate will have hands-on experience with Druid, a deep understanding of distributed systems, and a passion for processing large-scale datasets. You will play a pivotal role in creating scalable, high-performance systems that enable real-time decision-making.

Technical Skills:
- Strong experience with Apache Druid, including ingestion, query optimization, and cluster management.
- Proficiency in real-time data streaming technologies (e.g., Apache Kafka, AWS Kinesis).
- Experience with data transformation and ETL processes.
- Knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB).
- Hands-on experience with cloud platforms (AWS, GCP, Azure) for deploying Druid clusters.
- Proficiency in programming languages such as Java, Python, or Scala.
- Familiarity with containerization tools like Docker and orchestration tools like Kubernetes.
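For candidates new to "ingestion" in the Druid sense: data enters Druid via a JSON ingestion spec describing the schema, input source, and tuning. The sketch below builds the general shape of a native batch (index_parallel) spec as a Python dict; the dataSource name, column names, and local path are hypothetical placeholders, and real specs vary by Druid version and input source.

```python
import json

# Hypothetical dataSource, columns, and path -- for illustration only.
ingestion_spec = {
    "type": "index_parallel",
    "spec": {
        "dataSchema": {
            "dataSource": "page_views",
            "timestampSpec": {"column": "ts", "format": "iso"},
            "dimensionsSpec": {"dimensions": ["page", "country"]},
            "granularitySpec": {
                "segmentGranularity": "DAY",
                "queryGranularity": "MINUTE",
            },
        },
        "ioConfig": {
            "type": "index_parallel",
            "inputSource": {"type": "local", "baseDir": "/data", "filter": "*.json"},
            "inputFormat": {"type": "json"},
        },
        "tuningConfig": {"type": "index_parallel"},
    },
}

# The spec is plain JSON, submitted to the cluster's task endpoint.
spec_json = json.dumps(ingestion_spec, indent=2)
```

The three sections map directly to the skills the listing names: dataSchema (modeling and rollup granularity), ioConfig (streaming or batch sources), and tuningConfig (cluster and performance management).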
Posted 3 months ago
5.0 - 10.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Job Summary
Join Synechron as a DevSecOps Engineer, a pivotal role designed to enhance our software release lifecycle through robust automation and security practices. You will contribute significantly to our business objectives by ensuring high-performance, secure infrastructure, facilitating seamless software deployment, and driving innovation within cloud environments.

Software Requirements
Required:
- Proficiency in CI/CD tools such as Jenkins, CodePipeline, CodeBuild, CodeCommit
- Hands-on experience with DevSecOps practices
- Automation scripting languages: Shell scripting, Python (or similar)
Preferred:
- Familiarity with streaming technologies: Kafka, AWS Kinesis, AWS Kinesis Data Firehose, Flink

Overall Responsibilities
- Manage the entire software release lifecycle, focusing on build automation and production deployment.
- Maintain and optimize CI/CD pipelines to ensure reliable software releases across environments.
- Engage in platform lifecycle improvement from design to deployment, refining processes for operational excellence.
- Provide pre-go-live support including system design consulting, capacity planning, and launch reviews.
- Implement and enforce best practices to optimize performance, reliability, security, and cost efficiency.
- Enable scalable systems through automation and advocate for changes that enhance reliability and speed.
- Lead priority incident response and conduct blameless postmortems for continuous improvement.
Technical Skills (By Category)
Programming Languages:
- Required: Shell scripting, Python
- Preferred: Other automation scripting languages
Cloud Technologies:
- Required: Experience with cloud design and best practices, particularly AWS
Development Tools and Methodologies:
- Required: CI/CD tools (Jenkins, CodePipeline, CodeBuild, CodeCommit)
- Preferred: Exposure to streaming technologies (Kafka, AWS Kinesis)
Security Protocols:
- Required: DevSecOps practices

Experience Requirements
- Minimum of 7+ years in infrastructure performance and cloud design roles
- Proven experience with architecture and design at scale
- Industry experience in technology or software development environments preferred
- Alternative pathways: demonstrated experience in similar roles across other sectors

Day-to-Day Activities
- Engage in regular collaboration with cross-functional teams to refine deployment strategies
- Conduct regular system health checks and implement monitoring solutions
- Participate in strategic meetings to discuss and implement best practices for system reliability
- Manage deliverables related to software deployment and automation projects
- Exercise decision-making authority in incident management and system improvement discussions

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- Certifications in DevOps, AWS, or related fields preferred
- Commitment to continuous learning and professional development in evolving technologies

Professional Competencies
- Critical thinking and problem-solving capabilities to address complex infrastructure challenges
- Strong leadership and teamwork abilities to foster collaborative environments
- Excellent communication skills for effective stakeholder management and technical guidance
- Adaptability to rapidly changing technology landscapes and a proactive learning orientation
- Innovation mindset to drive improvements and efficiencies within cloud environments
- Effective time and priority management to balance multiple projects and objectives
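A concrete flavor of the "DevSecOps practices" this role enforces is a security gate in the CI/CD pipeline: a check that blocks a build when a pinned dependency matches a known-vulnerable version. The sketch below shows the idea in plain Python; the package names, versions, and vulnerability set are invented for the example, and a real pipeline would pull advisories from a scanner or vulnerability database rather than a hard-coded set.

```python
# Made-up advisory data: (package, version) pairs flagged as vulnerable.
VULNERABLE = {("examplelib", "1.2.0"), ("othertool", "0.9.1")}

def check_dependencies(pinned):
    """Return the sorted subset of pinned (name, version) pairs
    that appear in the vulnerability set."""
    return sorted(set(pinned) & VULNERABLE)

pinned = [("examplelib", "1.2.0"), ("safe-pkg", "2.0.0")]
flagged = check_dependencies(pinned)
if flagged:
    # In CI, a nonzero exit here fails the stage and blocks the release.
    print("build blocked:", flagged)
```

Wiring such a script into a Jenkins or CodeBuild stage is what turns a security policy into an enforced, automated release-lifecycle control.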
Posted 3 months ago