Jobs
Interviews

903 Parsing Jobs - Page 12

Set up a job alert
JobPe aggregates job results for easy access, but you apply directly on the employer's job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference #: 316918BR
Job Type: Full Time

Your role
Are you an analytic thinker? Do you enjoy creating valuable insights with data? Do you want to play a key role in transforming our firm into an agile organization? At UBS, we re-imagine the way we work, the way we connect with each other – our colleagues, clients and partners – and the way we deliver value. Being agile will make us more responsive, more adaptable, and ultimately more innovative.

We're looking for a Data Engineer to:
- transform data into valuable insights that inform business decisions, making use of our internal data platforms and applying appropriate analytical techniques
- design, model, develop, and improve data pipelines and data products
- engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using data platform infrastructure effectively
- develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems
- ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements
- build observability into our solutions, monitor production health, help to resolve incidents, and remediate the root cause of risks and issues
- understand, represent, and advocate for client needs

Your team
In our agile operating model, crews are aligned to larger products and services fulfilling client needs and encompass multiple autonomous pods. You'll be working in the Developer Workspaces Team, focusing on providing compute, development environments, and tooling to developers and business users.
Your expertise
- comprehensive understanding and ability to apply data engineering techniques, from event streaming and real-time analytics to computational grids and graph processing engines
- curiosity to learn new technologies and practices, reuse strategic platforms and standards, evaluate options, and make decisions with long-term sustainability in mind
- strong command of at least one language among Python, Java, and Golang
- understanding of data management and database technologies, including SQL/NoSQL
- understanding of data products, data structures, and data manipulation techniques, including classification, parsing, and pattern matching
- experience with Databricks, ADLS, Delta Lake/Tables, and ETL tools would be an asset
- good understanding of engineering practices and the software development lifecycle
- enthusiastic, self-motivated, and client-focused

About Us
UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process. Learn more

Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That's why collaboration is at the heart of everything we do. Because together, we're more than ourselves.
We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
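As a rough sketch of the sourcing/processing/storing pipeline pattern the UBS role describes, chained Python generators keep each stage small and testable (the record fields below are invented for illustration, not UBS internals):

```python
from typing import Iterable, Iterator

def source(rows: Iterable[dict]) -> Iterator[dict]:
    """Sourcing stage: yield raw records one at a time."""
    for row in rows:
        yield row

def process(rows: Iterator[dict]) -> Iterator[dict]:
    """Processing stage: drop malformed records and derive a field."""
    for row in rows:
        if row.get("price") is None:
            continue  # skip records that fail validation
        row["price_cents"] = int(round(row["price"] * 100))
        yield row

def store(rows: Iterator[dict]) -> list[dict]:
    """Storing stage: collect in memory here; a real pipeline would
    write to a table or object store."""
    return list(rows)

raw = [{"price": 1.5}, {"price": None}, {"price": 2.0}]
stored = store(process(raw))
```

Because each stage is a generator, records stream through one at a time rather than being materialized between stages.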

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

Job Description: SDET (Software Development Engineer in Test)
Notice Period Requirement: Immediate to 2 months (official)
Job Locations: Gurgaon/Delhi
Experience: 5 to 8 years
Skills: SDET, automation, Java programming, Selenium, Cucumber, Rest Assured, API coding (all mandatory)
Job Type: Full-Time

Job Description
We are seeking an experienced and highly skilled SDET (Software Development Engineer in Test) to join our Quality Engineering team. The ideal candidate will possess a strong background in test automation across API, mobile, or web testing, with hands-on experience in creating robust automation frameworks and scripts. This role demands a thorough understanding of quality engineering practices, microservices architecture, and software testing tools.

Key Responsibilities:
- Design and develop scalable and modular automation frameworks using best industry practices such as the Page Object Model.
- Automate testing for distributed, highly scalable systems.
- Create and execute test scripts for GUI-based, API, and mobile applications.
- Perform end-to-end testing for APIs, ensuring thorough validation of request and response schemas, status codes, and exception handling.
- Conduct API testing using tools like Rest Assured, SoapUI, NodeJS, and Postman, and validate data with serialization techniques (e.g., POJO classes).
- Implement and maintain BDD/TDD frameworks using tools like Cucumber, TestNG, or JUnit.
- Write and optimize SQL queries for data validation and backend testing.
- Integrate test suites into test management systems and CI/CD pipelines using tools like Maven, Gradle, and Git.
- Mentor team members and quickly adapt to new technologies and tools.
- Select and implement appropriate test automation tools and strategies based on project needs.
- Apply design patterns, modularization, and user libraries for efficient framework creation.
- Collaborate with cross-functional teams to ensure the quality and scalability of microservices and APIs.

Must-Have Skills:
- Proficiency in designing and developing automation frameworks from scratch.
- Strong programming skills in Java, Groovy, or JavaScript with a solid understanding of OOP concepts.
- Hands-on experience with at least one GUI automation tool (desktop/mobile); experience with multiple tools is an advantage.
- In-depth knowledge of API testing and microservices architecture.
- Experience with BDD and TDD methodologies and associated tools.
- Familiarity with SOAP and REST principles.
- Expertise in parsing and validating complex JSON and XML responses.
- Ability to create and manage test pipelines in CI/CD environments.

Nice-to-Have Skills:
- Experience with multiple test automation tools for GUI or mobile platforms.
- Knowledge of advanced serialization techniques and custom test harness implementation.
- Exposure to various test management tools and automation strategies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years in software quality engineering and test automation.
- Strong analytical and problem-solving skills with attention to detail.
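For illustration, the status-code and response-schema checks a Rest Assured test performs can be sketched framework-agnostically; this minimal Python version uses a hypothetical field-to-type mapping in place of a POJO class:

```python
def validate_response(status_code: int, body: dict, required: dict) -> list[str]:
    """Return a list of validation errors for an API response.

    `required` maps field name -> expected Python type, standing in
    for the response schema a Rest Assured test would assert against.
    An empty list means the response passed validation.
    """
    errors = []
    if status_code != 200:
        errors.append(f"expected status 200, got {status_code}")
    for field, expected_type in required.items():
        if field not in body:
            errors.append(f"missing field: {field}")
        elif not isinstance(body[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors
```

A test would call this against the deserialized response and assert the error list is empty; negative tests assert the specific errors expected.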

Posted 3 weeks ago

Apply

2.0 - 3.0 years

7 - 8 Lacs

Hyderabad

On-site

Position Title: Data Engineer
Location: Hyderabad
Grade: L3-1
Hiring Manager: Sabya DG

About the Job
At Sanofi, we're committed to providing the next-gen healthcare that patients and customers need. It's about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as a Data Engineer and you can help make it happen.

What you will be doing:
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.

The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.

As a Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data & AI Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.

Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization.
This means:
- AI Factory – Versatile Teams Operating in Cross-Functional Pods: utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
- Leading-Edge Tech Stack: experience building products that will be deployed globally on a leading-edge tech stack.
- World-Class Mentorship and Training: working with renowned leaders and academics in machine learning to further develop your skill set.

We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people's lives. We're also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?

Main Responsibilities:
Data Product Engineering:
- Provide input into the engineering feasibility of developing specific R&D Data/AI Products
- Provide input to the Data/AI Product Owner and Scrum Master to support planning, capacity, and resource estimates
- Design, build, and maintain scalable and reusable ETL/ELT pipelines to ingest, transform, clean, and load data from sources into central platforms/repositories
- Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data
- Collaborate with the Data/AI Product Owner and Scrum Master to share progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans
- Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories
- Optimize data workflows to drive high performance and reliability of implemented data products
- Oversee and support junior engineers with Data/AI Product testing requirements and execution

Innovation & Team Collaboration:
- Stay current on industry trends, emerging technologies, and best practices in data product engineering
- Contribute to a team culture of innovation, collaboration, and continuous learning within the product team

About You:
Key Functional Requirements & Qualifications:
- Bachelor's degree in software engineering or a related field, or equivalent work experience
- 2-3 years of experience in data engineering, software engineering, or other related fields
- Experience working in the life science/pharmaceutical industry and understanding of the R&D business preferred
- Excellent communication and collaboration skills
- Working knowledge of, and comfort working with, Agile methodologies

Key Technical Requirements & Qualifications:
- Experience with cloud-based data platforms and the analytics engineering stack (AWS, Snowflake and dbt)
- Experience with job scheduling and orchestration (Airflow is a plus)
- Working knowledge of scripting languages (Python, shell scripting)
- Good knowledge of SQL and relational database technologies/concepts
- Understanding of data structures and algorithms
- Experience working with data models and query tuning

Why Choose Us?
- Bring the miracles of science to life alongside a supportive, future-focused team
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs

Applications received after the official close date will be reviewed on an individual basis.
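The filtering, parsing, normalizing, and joining steps a posting like this lists can be sketched in plain Python; every field name below is invented for illustration, not a Sanofi schema:

```python
def normalize_record(raw: dict) -> dict:
    """Parse and normalize one raw record (illustrative fields only)."""
    return {
        "study_id": raw["study_id"].strip().upper(),  # normalize identifier
        "result": float(raw["result"]),               # parse string measurement
        "site": raw.get("site", "UNKNOWN"),           # fill missing dimension
    }

def run_pipeline(raw_rows: list[dict], site_names: dict) -> list[dict]:
    """Filter bad rows, normalize each, then join in site names --
    the filter/parse/normalize/join steps in miniature."""
    out = []
    for raw in raw_rows:
        if not raw.get("study_id"):
            continue                                   # filtering step
        rec = normalize_record(raw)
        rec["site_name"] = site_names.get(rec["site"], "n/a")  # join step
        out.append(rec)
    return out
```

In a real ELT stack these steps would live in dbt models or Airflow tasks; the decomposition into small, individually testable transforms is the point.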

Posted 3 weeks ago

Apply

3.0 years

2 - 7 Lacs

Chennai

On-site

An Amazing Career Opportunity for an AI/ML Engineer
Location: Chennai, India (Hybrid)
Job ID: 39582

Position Summary
A rewarding career at HID Global beckons you! We are looking for an AI/ML Engineer who is responsible for designing, developing, and deploying advanced AI/ML solutions to solve complex business challenges. This role requires expertise in machine learning, deep learning, MLOps, and AI model optimization, with a focus on building scalable, high-performance AI systems. As an AI/ML Engineer, you will work closely with data engineers, software developers, and business stakeholders to integrate AI-driven insights into real-world applications. You will be responsible for model development, system architecture, cloud deployment, and ensuring responsible AI adoption. HID Global is a leading, trusted source for innovative products, solutions and services that help millions of customers around the globe create, manage and use secure identities.

Who are we?
HID powers the trusted identities of the world's people, places, and things, allowing people to transact safely, work productively and travel freely. We are a high-tech software company headquartered in Austin, TX, with over 4,000 employees worldwide. Check us out: www.hidglobal.com and https://youtu.be/23km5H4K9Eo LinkedIn: www.linkedin.com/company/hidglobal/mycompany/

About HID Global, Chennai
HID Global powers the trusted identities of the world's people, places and things. We make it possible for people to transact safely, work productively and travel freely. Our trusted identity solutions give people secure and convenient access to physical and digital places and connect things that can be accurately identified, verified and tracked digitally. Millions of people around the world use HID products and services to navigate their everyday lives, and over 2 billion things are connected through HID technology.
We work with governments, educational institutions, hospitals, financial institutions, industrial businesses and some of the most innovative companies on the planet. Headquartered in Austin, Texas, HID Global has over 3,000 employees worldwide and operates international offices that support more than 100 countries. HID Global® is an ASSA ABLOY Group brand. For more information, visit www.hidglobal.com. HID Global is the trusted source for secure identity solutions for millions of customers and users around the world. In India, we have two Engineering Centres (Bangalore and Chennai) with over 200 engineering staff. The Global Engineering Team is based in Chennai, and one of the Business Unit Engineering teams is based in Bangalore.

Physical Access Control Solutions (PACS)
HID's Physical Access Control Solutions business area focuses on the growth of new and existing clients, where we leverage the latest card and reader technologies to solve the security challenges of our clients. Other areas of focus include authentication, card subsystems, card encoding, biometrics, location services and all other aspects of a physical access control infrastructure.

Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Roles & Responsibilities:
- Design, develop, and deploy robust, scalable AI/ML models in production environments.
- Collaborate with business stakeholders to identify AI/ML opportunities and define measurable success metrics.
- Design and build Retrieval-Augmented Generation (RAG) pipelines integrating vector stores, semantic search, and document parsing for domain-specific knowledge retrieval.
- Integrate multimodal conversational AI platforms (MCPs) including voice, vision, and text to deliver rich user interactions.
- Drive innovation through PoCs, benchmarking, and experiments with emerging models and architectures.
- Optimize models for performance, latency and scalability.
- Build data pipelines and workflows to support model training and evaluation.
- Conduct research and experimentation on state-of-the-art techniques (DL, NLP, time series, CV).
- Partner with MLOps and DevOps teams to implement best practices in model monitoring, versioning and re-training.
- Lead code reviews and architecture discussions, and mentor junior and peer engineers.
- Architect and implement end-to-end AI/ML pipelines, ensuring scalability and efficiency.
- Deploy models in cloud-based (AWS, Azure, GCP) or on-premises environments using tools like Docker, Kubernetes, TensorFlow Serving, or ONNX.
- Ensure data integrity, quality, and preprocessing best practices for AI/ML model development.
- Ensure compliance with AI ethics guidelines, data privacy laws (GDPR, CCPA), and corporate AI governance.
- Work closely with data engineers, software developers, and domain experts to integrate AI into existing systems.
- Conduct AI/ML training sessions for internal teams to improve AI literacy within the organization.
- Strong analytical and problem-solving mindset.

Technical Requirements:
- Strong expertise in AI/ML engineering and software development.
- Strong experience with RAG architecture and vector databases.
- Proficiency in Python and hands-on experience with ML frameworks (TensorFlow, PyTorch, scikit-learn, XGBoost, etc.).
- Familiarity with MCPs like Google Dialogflow, Rasa, Amazon Lex, or custom-built agents using LLM orchestration.
- Cloud-based AI/ML experience (AWS SageMaker, Azure ML, GCP Vertex AI, etc.).
- Solid understanding of the AI/ML lifecycle: data preprocessing, feature engineering, model selection, training, validation and deployment.
- Experience with production-grade ML systems (model serving, APIs, pipelines).
- Familiarity with data engineering tools (Spark, Kafka, Airflow, etc.).
- Strong knowledge of statistical modeling, NLP, CV, recommendation systems, anomaly detection and time-series forecasting.
- Hands-on software engineering experience with knowledge of version control, testing and CI/CD.
- Hands-on experience deploying ML models in production using Docker, Kubernetes, TensorFlow Serving, ONNX, and MLflow.
- Experience with MLOps and CI/CD for ML pipelines, including monitoring, retraining, and model drift detection.
- Proficiency in scaling AI solutions in cloud environments (AWS, Azure and GCP).
- Experience in data preprocessing, feature engineering, and dimensionality reduction.
- Exposure to data privacy, compliance and secure ML practices.

Education and/or Experience:
- Graduate or master's degree in computer science, information technology, or AI/ML/data science.
- 3+ years of hands-on experience in AI/ML development, deployment and optimization.
- Experience leading AI/ML teams and mentoring junior engineers.

Why apply?
- Empowerment: You'll work as part of a global team in a flexible work environment, learning and enhancing your expertise. We welcome an opportunity to meet you and learn about your unique talents, skills, and experiences. You don't need to check all the boxes. If you have most of the skills and experience, we want you to apply.
- Innovation: You embrace challenges and want to drive change. We are open to ideas, including flexible work arrangements, job sharing or part-time job seekers.
- Integrity: You are results-oriented, reliable, and straightforward, and value being treated accordingly. We want all our employees to be themselves, to feel appreciated and accepted.

This opportunity may be open to flexible working arrangements. HID is an Equal Opportunity/Affirmative Action Employer – Minority/Female/Disability/Veteran/Gender Identity/Sexual Orientation.
We make it easier for people to get where they want to go! On an average day, think of how many times you tap, twist, tag, push or swipe to get access, find information, connect with others or track something. HID technology is behind billions of interactions, in more than 100 countries. We help you create a verified, trusted identity that can get you where you need to go – without having to think about it. When you join our HID team, you’ll also be part of the ASSA ABLOY Group, the global leader in access solutions. You’ll have 63,000 colleagues in more than 70 different countries. We empower our people to build their career around their aspirations and our ambitions – supporting them with regular feedback, training, and development opportunities. Our colleagues think broadly about where they can make the most impact, and we encourage them to grow their role locally, regionally, or even internationally. As we welcome new people on board, it’s important to us to have diverse, inclusive teams, and we value different perspectives and experiences. #LI-HIDGlobal
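The retrieval step of the RAG pipelines this posting describes reduces to ranking stored embeddings by similarity to a query embedding. A toy sketch with cosine similarity over an in-memory store (the document IDs and 2-d vectors are invented; production systems use a real vector database and learned embeddings):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec: list[float], store: list[tuple], top_k: int = 2) -> list[str]:
    """Rank stored (doc_id, embedding) pairs by similarity to the query;
    the top hits would be passed to the LLM as grounding context."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

# Hypothetical corpus: document IDs with pre-computed embeddings.
store = [("badge-spec", [1.0, 0.0]), ("reader-faq", [0.7, 0.7]), ("hr-policy", [0.0, 1.0])]
```

Semantic search libraries and vector databases implement exactly this ranking, plus indexing structures that avoid the linear scan.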

Posted 3 weeks ago

Apply

0 years

4 - 5 Lacs

Chennai

On-site

- Excellent knowledge of Node.js
- Experience with Express.js
- Knowledge of ORMs like Drizzle, TypeORM or Prisma
- Knowledge of MySQL or PostgreSQL
- Extensive experience with REST APIs
- Knowledge of XML parsing and construction
- SAP exposure is good to have

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
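The XML parsing and construction the role asks for is a straightforward round trip. The job's stack is Node.js, but for brevity the sketch below uses Python's stdlib `xml.etree.ElementTree`; the order schema is hypothetical (e.g. a SAP-bound feed might look nothing like this):

```python
import xml.etree.ElementTree as ET

def order_to_xml(order: dict) -> str:
    """Construct an XML document from an order dict."""
    root = ET.Element("order", id=str(order["id"]))
    for name, qty in order["items"]:
        item = ET.SubElement(root, "item", qty=str(qty))
        item.text = name
    return ET.tostring(root, encoding="unicode")

def xml_to_order(payload: str) -> dict:
    """Parse the same document back into a dict."""
    root = ET.fromstring(payload)
    return {
        "id": int(root.get("id")),
        "items": [(item.text, int(item.get("qty"))) for item in root.findall("item")],
    }
```

In Node.js the equivalent would typically use a library such as fast-xml-parser, but the construct/parse symmetry is the same.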

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

India

Remote

GitLab is an open core software company that develops the most comprehensive AI-powered DevSecOps Platform, used by more than 100,000 organizations. Our mission is to enable everyone to contribute to and co-create the software that powers our world. When everyone can contribute, consumers become contributors, significantly accelerating the rate of human progress. This mission is integral to our culture, influencing how we hire, build products, and lead our industry. We make this possible at GitLab by running our operations on our product and staying aligned with our values. Learn more about Life at GitLab. Thanks to products like Duo Enterprise, and Duo Workflow, customers get the benefit of AI at every stage of the SDLC. The same principles built into our products are reflected in how our team works: we embrace AI as a core productivity multiplier. All team members are encouraged and expected to incorporate AI into their daily workflows to drive efficiency, innovation, and impact across our global organization. An Overview Of This Role As a member of the Secret Detection team, you'll be at the forefront of protecting sensitive data by creating specialized tools that prevent, detect, and remediate leaked secrets in code. Our team focuses on the complete secret management lifecycle - from push protection to pipeline-based scanning, providing automated remediation workflows and audit trails when necessary. We’re passionate about embedding security into the development process seamlessly, allowing developers to focus on innovation while we handle security concerns proactively. You'll help developers safeguard their credentials, API keys, and other sensitive information by building sophisticated detection patterns, reducing false positives, and creating seamless remediation paths when secrets are discovered. Your work will enable organizations to quickly identify exposed secrets, understand their impact, and efficiently revoke and rotate compromised credentials. 
Your impact will be significant and far-reaching, as our solutions protect both GitLab's ecosystem and the sensitive data of thousands of organizations worldwide, preventing costly data breaches before they happen.

Some Examples Of Our Projects
- Prevent secret leaks in source code with GitLab Secret Push Protection
- Verify validity of secret detection findings

What You'll Do
- Lead the design and implementation of fullstack features for our Secret Detection offering, contributing to both the frontend (Vue.js) and backend (Ruby on Rails, GraphQL).
- Write clean, well-tested code that meets our internal standards for style, maintainability, and best practices for a high-scale web environment.
- Mentor and support fellow engineers, especially those looking to grow into fullstack contributors.
- Collaborate with Product Management and other stakeholders within Engineering (Frontend, UX, etc.) to maintain a high bar for quality in a fast-paced, iterative environment.
- Apply experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent these problems.
- Contribute to code reviews, RFCs, and proofs of concept that shape the technical direction of the product.
- Recognize impediments to our efficiency as a team ("technical debt"), and propose and implement solutions.
- Work async-first with a globally distributed team, while also participating in necessary sync meetings like high-level planning, engineering brainstorming sessions and pairing sessions.

What You'll Bring
- 3+ years of professional experience with Vue.js, GraphQL, and Ruby on Rails.
- Proven ability to mentor engineers, lead technical initiatives, and drive frontend and fullstack best practices.
- Knowledge of security concepts, vulnerabilities, mitigation techniques, and secure coding practices is preferred.
- Background in developing or using security tools or products.
- Hands-on experience with reverse engineering tools such as Ghidra, Binary Ninja, or diffoscope for analyzing, unpacking, and extracting data from compiled binaries and executable files.
- Experience with the Go programming language, or strong motivation to learn.
- Ability to work across the stack to deliver end-to-end solutions.
- A strong product mindset and ability to collaborate closely with cross-functional teams including Product, Design and Technical Writing.
- Demonstrated ability to work closely with other parts of the organization.
- Excellent written and verbal communication skills, especially in async-first, remote environments.
- A proactive, self-managing approach to work with a bias for action and ownership.

About The Team
GitLab's Secret Detection team is responsible for the Secret Detection feature category. We want to help developers write better code and worry less about common security mistakes. We do this by helping developers easily identify common security issues as code is being contributed, and mitigate these issues proactively. We work closely with the larger GitLab security product suite while maintaining our specialized focus on the unique challenges of secret detection. Our technical stack spans Rails and Go backends, Vue.js frontends, and custom parsing engines that enable efficient and accurate secret identification. We're committed to making sophisticated security tooling accessible to developers of all skill levels. We'd like to continue to expand our capabilities across these workflows, while also continuously improving the result quality across all types of findings our security tools are responsible for detecting. We balance security best practices with practical developer experience to ensure protection doesn't come at the cost of productivity. Thanks to our Transparency value, you can learn more about us on our Team page.
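At its core, the pipeline-based secret scanning described above runs tuned patterns over file blobs. A deliberately simplified sketch with two example patterns (real rulesets are far larger, tuned against false positives, and paired with verification and remediation logic; this is not GitLab's implementation):

```python
import re

# Illustrative patterns only. AKIA-prefixed AWS access key IDs and
# glpat-prefixed GitLab personal access tokens have recognizable shapes,
# which is what makes prefix-based detection (and push protection) feasible.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "gitlab_pat": re.compile(r"\bglpat-[0-9A-Za-z_\-]{20}\b"),
}

def scan(blob: str) -> list[tuple[str, str]]:
    """Return (rule_name, matched_text) findings for one file blob."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(blob):
            findings.append((name, match.group(0)))
    return findings
```

This is also why well-designed token formats carry a distinctive prefix: detectors can match them with near-zero false positives and then verify validity against the issuer.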
How GitLab Will Support You
- Benefits to support your health, finances, and well-being
- All-remote, asynchronous work environment
- Flexible Paid Time Off
- Team Member Resource Groups
- Equity Compensation & Employee Stock Purchase Plan
- Growth and Development Fund
- Parental leave
- Home office support

Please note that we welcome interest from candidates with varying levels of experience; many successful candidates do not meet every single requirement. Additionally, studies have shown that people from underrepresented groups are less likely to apply to a job unless they meet every single qualification. If you're excited about this role, please apply and allow our recruiters to assess your application.

Remote-Global
The base salary range for this role's listed level is currently for residents of listed locations only. Grade level and salary ranges are determined through interviews and a review of education, experience, knowledge, skills, abilities of the applicant, equity with other team members, and alignment with market data. See more information on our benefits and equity. Sales roles are also eligible for incentive pay targeted at up to 100% of the offered base salary.

California/Colorado/Hawaii/New Jersey/New York/Washington/DC/Illinois/Minnesota pay range: $117,600–$252,000 USD

Country Hiring Guidelines: GitLab hires new team members in countries around the world. All of our roles are remote; however, some roles may carry specific location-based eligibility requirements. Our Talent Acquisition team can help answer any questions about location after starting the recruiting process.

Privacy Policy: Please review our Recruitment Privacy Policy. Your privacy is important to us.

GitLab is proud to be an equal opportunity workplace and is an affirmative action employer.
GitLab’s policies and practices relating to recruitment, employment, career development and advancement, promotion, and retirement are based solely on merit, regardless of race, color, religion, ancestry, sex (including pregnancy, lactation, sexual orientation, gender identity, or gender expression), national origin, age, citizenship, marital status, mental or physical disability, genetic information (including family medical history), discharge status from the military, protected veteran status (which includes disabled veterans, recently separated veterans, active duty wartime or campaign badge veterans, and Armed Forces service medal veterans), or any other basis protected by law. GitLab will not tolerate discrimination or harassment based on any of these characteristics. See also GitLab’s EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know during the recruiting process.

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Title: Python Developer – Web Scraping & Automation
Company: Actowiz Solutions
Location: Ahmedabad
Job Type: Full-time

About Us
Actowiz Solutions is a leading provider of data extraction, web scraping, and automation solutions. We empower businesses with actionable insights by delivering clean, structured, and scalable data through cutting-edge technology.

Role Overview
We are looking for a highly skilled Python Developer with expertise in web scraping, automation tools, and related frameworks.

Key Responsibilities
- Design, develop, and maintain scalable web scraping scripts and frameworks.
- Lead a team of Python developers in project planning, task allocation, and code reviews.
- Work with tools and libraries such as Scrapy, BeautifulSoup, Selenium, Playwright, Requests, etc.
- Implement robust error handling, data parsing, and storage mechanisms (JSON, CSV, databases, etc.).
- Optimize scraping performance and ensure compliance with legal and ethical scraping practices.
- Research new tools and techniques to improve scraping efficiency and scalability.

Requirements
- 2+ years of experience in Python development with strong expertise in web scraping.
- Proficiency in scraping frameworks like Scrapy, Playwright, or Selenium.
- Deep understanding of HTTP, proxies, user agents, browser automation, and anti-bot measures.
- Experience with REST APIs, asynchronous programming, and multithreading.
- Familiarity with databases (SQL/NoSQL) and cloud-based data pipelines.

Preferred Qualifications
- Knowledge of DevOps tools (Docker, CI/CD) is a plus.
- Experience with big data platforms or ETL pipelines is advantageous.

Contact Us
Mobile: 841366964
Email: komal.actowiz@gmail.com
Website: https://www.actowizsolutions.com/career.php
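A minimal sketch of the parsing layer behind such scrapers, using only the stdlib `html.parser` (the `<span class="name">`/`<span class="price">` markup shape is an invented assumption; production code would use Scrapy or BeautifulSoup selectors as listed above):

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect (name, price) pairs from span tags classed 'name'/'price'."""

    def __init__(self):
        super().__init__()
        self._field = None      # which field the next text node belongs to
        self._current = {}      # partially assembled row
        self.rows = []          # completed (name, price) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "span":
            self._field = dict(attrs).get("class")

    def handle_data(self, data):
        if self._field in ("name", "price"):
            self._current[self._field] = data.strip()
            if len(self._current) == 2:
                self.rows.append((self._current["name"], self._current["price"]))
                self._current = {}
            self._field = None

html = '<span class="name">Widget</span><span class="price">199</span>'
parser = PriceParser()
parser.feed(html)
```

Scrapy and BeautifulSoup wrap this same event/tree parsing behind CSS and XPath selectors, which is why they scale better for anything beyond trivial markup.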

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Client: Our client is a multinational IT services and consulting company headquartered in the USA, with revenues of 19.7 billion USD, a global workforce of 350,000, and a NASDAQ listing. It is one of the leading IT services firms globally, known for its work in digital transformation, technology consulting, and business process outsourcing. Its business focuses on digital engineering, cloud services, AI and data analytics, enterprise applications (SAP, Oracle, Salesforce), IT infrastructure, and business process outsourcing. Major delivery centers in India include Chennai, Pune, Hyderabad, and Bengaluru, with offices in over 35 countries; India is a major operational hub.

Job Title: Tosca Automation
Location: Hyderabad
Experience: 3 to 8 years
Job Type: Contract
Notice Period: Immediate joiners

Key skills: In-depth knowledge of Tosca's functionality, architecture, and how it integrates with other tools. Ability to analyze manual tasks/processes and identify opportunities for automation. Familiarity with scripting languages such as Python, JavaScript, or VBScript (depending on the automation framework used) for custom scripting within Tosca workflows. Designing and building automated workflows within Tosca. Understanding of triggers, conditions, and actions. Using REST or SOAP APIs to integrate Tosca with other systems (e.g., CRMs, ERPs, databases). Using tools like Excel, SQL, or XML/JSON for data parsing and transformation. Documenting automation logic, workflows, and changes clearly for team collaboration and future maintenance. Troubleshooting automation failures and optimizing performance.

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Hiring: Part-Time Developer – Web Scraping Expertise | Remote | Immediate Joiner Company: PearlThoughts Role: Part-Time Developer – Web Scraping Location: Remote Engagement: Project-Based Start: Immediate PearlThoughts is seeking a skilled Part-Time Developer with strong expertise in web scraping to join our team on a project-based engagement. This is a remote position ideal for individuals who are available to start immediately and can contribute a few hours daily based on project needs. Key Responsibilities Develop and maintain web scraping scripts and automation workflows Extract and structure data from dynamic websites using modern scraping techniques Monitor and update scrapers as websites evolve Ensure scraped data is clean, accurate, and usable Required Skills Proficiency in Python with experience in libraries such as BeautifulSoup, Selenium, Scrapy Understanding of anti-scraping mechanisms and ability to bypass them ethically Experience with handling APIs, data parsing, and storage formats (JSON, CSV) Strong attention to detail, problem-solving skills, and code optimization Ability to deliver results independently and meet timelines Additional Information Work Mode: 100% Remote Compensation: Project-Based (to be discussed during the interview) Availability: Immediate Joiner Preferred We are looking for someone who is dependable, efficient, and passionate about working with real-world data extraction challenges. If this sounds like you, we’d love to connect. Let’s build something valuable together.
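The API handling, data parsing, and storage formats (JSON, CSV) this role mentions often come down to flattening nested API records into tabular rows. A minimal standard-library sketch; the payload shape and field names are invented:

```python
import csv
import io
import json

# Hypothetical API payload illustrating the JSON -> CSV step; real responses
# would come from an HTTP client and vary per project.
payload = json.loads("""
{"results": [
  {"id": 1, "site": {"name": "example.com"}, "stats": {"pages": 120, "errors": 3}},
  {"id": 2, "site": {"name": "sample.org"},  "stats": {"pages": 45,  "errors": 0}}
]}
""")

def flatten(record, parent="", sep="."):
    """Flatten nested dicts so each record becomes one flat CSV row."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name, sep))
        else:
            flat[name] = value
    return flat

rows = [flatten(r) for r in payload["results"]]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```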

Posted 3 weeks ago

Apply

2.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Title: Data Engineer
Location: Hyderabad
Grade: L3-1
Hiring Manager: Sabya DG

About The Job
At Sanofi, we’re committed to providing the next-gen healthcare that patients and customers need. It’s about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as a Data Engineer and you can help make it happen.

What You Will Be Doing
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives. The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities. As a Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data & AI Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.

Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi’s Digital Transformation through becoming an AI-first organization.
This means: AI Factory - Versatile Teams Operating in Cross-Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment. Leading-Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack. World-Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillset. We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started? Main Responsibilities Data Product Engineering: Provide input into the engineering feasibility of developing specific R&D Data/AI Products. Provide input to the Data/AI Product Owner and Scrum Master to support planning, capacity, and resource estimates. Design, build, and maintain scalable and reusable ETL/ELT pipelines to ingest, transform, clean, and load data from sources into central platforms/repositories. Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data. Collaborate with the Data/AI Product Owner and Scrum Master to share progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans. Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories. Optimize data workflows to drive high performance and reliability of implemented data products. Oversee and support junior engineers with Data/AI Product testing requirements and execution. Innovation & Team Collaboration: Stay current on industry trends, emerging
technologies, and best practices in data product engineering. Contribute to a team culture of innovation, collaboration, and continuous learning within the product team. About You Key Functional Requirements & Qualifications: Bachelor’s degree in software engineering or a related field, or equivalent work experience. 2-3 years of experience in data engineering, software engineering, or other related fields. Experience working in the life science/pharmaceutical industry and understanding of the R&D business preferred. Excellent communication and collaboration skills. Working knowledge of and comfort working with Agile methodologies. Key Technical Requirements & Qualifications: Experience in cloud-based data platforms and the analytics engineering stack (AWS, Snowflake, and dbt). Experience with job scheduling and orchestration (Airflow is a plus). Working knowledge of scripting languages (Python, shell scripting). Good knowledge of SQL and relational database technologies/concepts. Understanding of data structures and algorithms. Experience working with data models and query tuning. Why Choose Us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs. Applications received after the official close date will be reviewed on an individual basis. Pursue Progress. Discover Extraordinary. Join Sanofi and step into a new era of science - where your growth can be just as transformative as the work we do. We invest in you to reach further, think faster, and do what’s never been done before.
You’ll help push boundaries, challenge convention, and build smarter solutions that reach the communities we serve. Ready to chase the miracles of science and improve people’s lives? Let’s Pursue Progress and Discover Extraordinary – together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, protected veteran status or other characteristics protected by law.
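The pipeline responsibilities in the posting above (filtering, tagging, joining, parsing, and normalizing data) reduce to a dependency-free sketch like the one below. The sample records, field names, and units are invented; a production pipeline would use the posting's stack (Snowflake, dbt, Airflow) rather than raw Python.

```python
# Minimal ingest -> clean -> join -> normalize step with invented lab data.
raw_assays = [
    {"sample_id": "S1", "result": "12.5", "unit": "mg/mL"},
    {"sample_id": "S2", "result": "", "unit": "mg/mL"},      # incomplete row
    {"sample_id": "S3", "result": "9800", "unit": "ug/mL"},
]
sample_metadata = {"S1": "study-A", "S2": "study-A", "S3": "study-B"}

def transform(records, metadata):
    clean = []
    for rec in records:
        if not rec["result"]:                 # filter out incomplete rows
            continue
        value = float(rec["result"])          # parse
        if rec["unit"] == "ug/mL":            # normalize units to mg/mL
            value /= 1000.0
        clean.append({
            "sample_id": rec["sample_id"],
            "result_mg_per_ml": value,
            "study": metadata.get(rec["sample_id"], "unknown"),  # join/tag
        })
    return clean

curated = transform(raw_assays, sample_metadata)
print(curated)
```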

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Has excellent knowledge of Node.js. Has worked with Express.js. Knowledge of ORMs like Drizzle, TypeORM, or Prisma. Has knowledge of MySQL or PostgreSQL. Has worked extensively with REST APIs. Has knowledge of XML parsing and construction. SAP exposure is good to have.
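The XML parsing and construction this stack calls for can be illustrated with a short sketch (in Python for brevity; the role itself is Node.js, where packages such as xml2js serve the same purpose). The element and attribute names are invented:

```python
import xml.etree.ElementTree as ET

# Parse an (invented) order document...
doc = ET.fromstring("""
<order id="A-100">
  <item sku="X1" qty="2"/>
  <item sku="X2" qty="1"/>
</order>
""")
skus = [item.get("sku") for item in doc.findall("item")]
total_qty = sum(int(item.get("qty")) for item in doc.findall("item"))

# ...and construct a small acknowledgement document in reply.
ack = ET.Element("ack", {"order": doc.get("id")})
ET.SubElement(ack, "items").text = str(total_qty)
xml_out = ET.tostring(ack, encoding="unicode")
print(xml_out)
```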

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

An Amazing Career Opportunity for AI/ML Engineer Location: Chennai, India (Hybrid) Job ID: 39582 Position Summary A rewarding career at HID Global beckons you! We are looking for an AI/ML Engineer, who is responsible for designing, developing, and deploying advanced AI/ML solutions to solve complex business challenges. This role requires expertise in machine learning, deep learning, MLOps, and AI model optimization, with a focus on building scalable, high-performance AI systems. As an AI/ML Engineer, you will work closely with data engineers, software developers, and business stakeholders to integrate AI-driven insights into real-world applications. You will be responsible for model development, system architecture, cloud deployment, and ensuring responsible AI adoption. HID Global is a leading, trusted source for innovative products, solutions and services that help millions of customers around the globe create, manage and use secure identities. Who are we? HID powers the trusted identities of the world’s people, places, and things, allowing people to transact safely, work productively and travel freely. We are a high-tech software company headquartered in Austin, TX, with over 4,000 worldwide employees. Check us out: www.hidglobal.com and https://youtu.be/23km5H4K9Eo LinkedIn: www.linkedin.com/company/hidglobal/mycompany/ About HID Global, Chennai HID Global powers the trusted identities of the world’s people, places and things. We make it possible for people to transact safely, work productively and travel freely. Our trusted identity solutions give people secure and convenient access to physical and digital places and connect things that can be accurately identified, verified and tracked digitally. Millions of people around the world use HID products and services to navigate their everyday lives, and over 2 billion things are connected through HID technology. 
We work with governments, educational institutions, hospitals, financial institutions, industrial businesses and some of the most innovative companies on the planet. Headquartered in Austin, Texas, HID Global has over 3,000 employees worldwide and operates international offices that support more than 100 countries. HID Global® is an ASSA ABLOY Group brand. For more information, visit www.hidglobal.com. HID Global is the trusted source for secure identity solutions for millions of customers and users around the world. In India, we have two Engineering Centres (Bangalore and Chennai) with over 200 engineering staff. The Global Engineering team is based in Chennai, and one of the Business Unit Engineering teams is based in Bangalore. Physical Access Control Solutions (PACS) HID's Physical Access Control Solutions Business Area: The HID PACS Business Unit focuses on the growth of new clients and existing clients where we leverage the latest card and reader technologies to solve the security challenges of our clients. Other areas of focus include authentication, card subsystems, card encoding, biometrics, location services and all other aspects of a physical access control infrastructure. Qualifications: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. Roles & Responsibilities: Design, develop, and deploy robust, scalable AI/ML models in production environments. Collaborate with business stakeholders to identify AI/ML opportunities and define measurable success metrics. Design and build Retrieval-Augmented Generation (RAG) pipelines integrating vector stores, semantic search, and document parsing for domain-specific knowledge retrieval. 
Integrate Multimodal Conversational AI platforms (MCP), including voice, vision, and text, to deliver rich user interactions. Drive innovation through PoCs, benchmarking, and experiments with emerging models and architectures. Optimize models for performance, latency and scalability. Build data pipelines and workflows to support model training and evaluation. Conduct research and experimentation on state-of-the-art techniques (DL, NLP, time series, CV). Partner with MLOps and DevOps teams to implement best practices in model monitoring, versioning, and re-training. Lead code reviews and architecture discussions, and mentor junior and peer engineers. Architect and implement end-to-end AI/ML pipelines, ensuring scalability and efficiency. Deploy models in cloud-based (AWS, Azure, GCP) or on-premises environments using tools like Docker, Kubernetes, TensorFlow Serving, or ONNX. Ensure data integrity, quality, and preprocessing best practices for AI/ML model development. Ensure compliance with AI ethics guidelines, data privacy laws (GDPR, CCPA), and corporate AI governance. Work closely with data engineers, software developers, and domain experts to integrate AI into existing systems. Conduct AI/ML training sessions for internal teams to improve AI literacy within the organization. Strong analytical and problem-solving mindset. Technical Requirements: Strong expertise in AI/ML engineering and software development. Strong experience with RAG architecture and vector databases. Proficiency in Python and hands-on experience using ML frameworks (TensorFlow, PyTorch, scikit-learn, XGBoost, etc.). Familiarity with MCPs like Google Dialogflow, Rasa, Amazon Lex, or custom-built agents using LLM orchestration. Cloud-based AI/ML experience (AWS SageMaker, Azure ML, GCP Vertex AI, etc.). Solid understanding of the AI/ML life cycle: data preprocessing, feature engineering, model selection, training, validation and deployment. 
Experience in production-grade ML systems (model serving, APIs, pipelines). Familiarity with data engineering tools (Spark, Kafka, Airflow, etc.). Strong knowledge of statistical modeling, NLP, CV, recommendation systems, anomaly detection and time series forecasting. Hands-on software engineering with knowledge of version control, testing and CI/CD. Hands-on experience in deploying ML models in production using Docker, Kubernetes, TensorFlow Serving, ONNX, and MLflow. Experience in MLOps and CI/CD for ML pipelines, including monitoring, retraining, and model drift detection. Proficiency in scaling AI solutions in cloud environments (AWS, Azure and GCP). Experience in data preprocessing, feature engineering, and dimensionality reduction. Exposure to data privacy, compliance and secure ML practices. Education and/or Experience: Bachelor’s or master’s degree in computer science, information technology, or AI/ML/data science. 3+ years of hands-on experience in AI/ML development, deployment and optimization. Experience in leading AI/ML teams and mentoring junior engineers. Why apply? Empowerment: You’ll work as part of a global team in a flexible work environment, learning and enhancing your expertise. We welcome an opportunity to meet you and learn about your unique talents, skills, and experiences. You don’t need to check all the boxes. If you have most of the skills and experience, we want you to apply. Innovation: You embrace challenges and want to drive change. We are open to ideas, including flexible work arrangements, job sharing or part-time job seekers. Integrity: You are results-oriented, reliable, and straightforward and value being treated accordingly. We want all our employees to be themselves, to feel appreciated and accepted. This opportunity may be open to flexible working arrangements. HID is an Equal Opportunity/Affirmative Action Employer – Minority/Female/Disability/Veteran/Gender Identity/Sexual Orientation. 
We make it easier for people to get where they want to go! On an average day, think of how many times you tap, twist, tag, push or swipe to get access, find information, connect with others or track something. HID technology is behind billions of interactions, in more than 100 countries. We help you create a verified, trusted identity that can get you where you need to go – without having to think about it. When you join our HID team, you’ll also be part of the ASSA ABLOY Group, the global leader in access solutions. You’ll have 63,000 colleagues in more than 70 different countries. We empower our people to build their career around their aspirations and our ambitions – supporting them with regular feedback, training, and development opportunities. Our colleagues think broadly about where they can make the most impact, and we encourage them to grow their role locally, regionally, or even internationally. As we welcome new people on board, it’s important to us to have diverse, inclusive teams, and we value different perspectives and experiences.
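The RAG retrieval step named in the requirements above comes down to embedding documents and queries and ranking by vector similarity. Below is a toy, dependency-free sketch: the "embeddings" are plain word counts and the snippets are invented, whereas a production pipeline would use a learned embedding model and a vector database.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words counts. Real RAG pipelines use learned
    embedding models; this only illustrates the retrieval mechanics."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented knowledge snippets standing in for a vector store.
docs = [
    "badge readers authenticate users at the door",
    "model drift detection triggers retraining pipelines",
    "docker images package the model server for deployment",
]
index = [(d, embed(d)) for d in docs]

def retrieve(query, k=1):
    """Rank indexed documents by cosine similarity to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

context = retrieve("how does drift detection retrain the model")
print(context)
```

The retrieved context would then be prepended to the LLM prompt; chunking, re-ranking, and freshness handling are the parts that make production RAG harder than this sketch.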

Posted 3 weeks ago

Apply

0 years

0 Lacs

Malappuram

On-site

Flutter Developer Intern Company: Cookee Apps LLP Location: On-site (Kozhikode, Kerala, India) Job Type: Internship (Full-time, 6 Months) Schedule: Day shift About Us Cookee Apps LLP is a fast-growing software company that builds innovative web and mobile solutions. We’re passionate about mentoring fresh talent through real-world, hands-on training and support. Position Overview We are seeking a proactive and enthusiastic Flutter Developer Intern for a 6-month, full-time internship. You will gain practical experience building cross‑platform mobile applications using Flutter and Dart, working alongside our front-end, back-end, and UI/UX teams. Key Responsibilities Contribute to the development of mobile apps using Flutter and Dart. Collaborate with design and backend teams to implement responsive UI/UX and integrate RESTful APIs. Write clean, maintainable, and efficient code. Participate in code reviews, troubleshooting, and bug-fixing to improve app stability and performance. Assist in writing unit tests and contribute to documentation. Stay updated with emerging mobile technologies and Flutter best practices. Required Skills Strong fundamentals in Dart and Flutter development. Basic understanding of mobile development concepts (UI frameworks, state management, navigation). Familiarity with RESTful API integration and JSON parsing. Proficiency with Git version control. Solid problem-solving abilities and attention to detail. Strong communication skills and collaborative mindset. Preferred Qualifications Pursuing or completed a degree/certification in Computer Science, Software Engineering, or a related field. Portfolio or GitHub showcasing Flutter/Dart projects (academic, personal, or hackathon). Experience using state management solutions (e.g., Provider, BLoC, GetX). Exposure to unit testing in Flutter, CI/CD pipelines, or Firebase integration. 
What We Offer Internship Certificate upon successful completion. Letter of Recommendation for outstanding performers. Real-time exposure to industry-level codebases and agile development processes. Mentorship from senior developers and the possibility of a full-time role post-internship. Duration & Schedule 6 months full-time commitment Day shift , On-site at Kozhikode, Kerala How to Apply Submit your resume , GitHub portfolio , and a brief statement of interest to career@cookee.io Job Type: Internship Schedule: Day shift Work Location: In person

Posted 3 weeks ago

Apply

5.0 years

4 - 8 Lacs

Mohali

On-site

Company Introduction: A dynamic company headquartered in Australia. A multi-award winner, recognized for excellence in the telecommunications industry. Financial Times Fastest-Growing Company APAC 2023. AFR (Australian Financial Review) Fast 100 Company 2022. Great promotion opportunities that acknowledge and reward your hard work. A young, energetic and innovative team with a caring and supportive work environment. About You: We are seeking an experienced and highly skilled Data Warehouse Engineer with an energetic 'can do' attitude to join our data and analytics team. The ideal candidate will have over 5 years of hands-on experience in designing, building, and maintaining scalable data pipelines and reporting infrastructure. You will be responsible for managing our data warehouse, automating ETL workflows, building dashboards, and enabling data-driven decision-making across the organization. Your responsibilities will include but are not limited to: Design, implement, and maintain robust, scalable data pipelines using Apache NiFi, Airflow, or similar ETL tools. Develop and manage efficient data ingestion and transformation workflows, including web data crawling using Python. Create, optimize, and maintain complex SQL queries to support business reporting needs. Build and manage interactive dashboards and visualizations using Apache Superset (preferred), Power BI, or Tableau. Collaborate with business stakeholders and analysts to gather requirements, define KPIs, and deliver meaningful data insights. Ensure data accuracy, completeness, and consistency through rigorous quality assurance processes. Maintain and optimize the performance of the data warehouse, supporting high availability and fast query response times. Document technical processes and data workflows for maintainability and scalability. 
To be successful in this role you will ideally possess: 5+ years of experience in data engineering, business intelligence, or a similar role. Strong proficiency in Python, particularly for data crawling, parsing, and automation tasks. Expert in SQL (including complex joins, CTEs, window functions) for reporting and analytics. Hands-on experience with Apache Superset (preferred), or equivalent BI tools like Power BI or Tableau. Proficient with ETL tools such as Apache NiFi, Airflow, or similar data pipeline frameworks. Experience working with cloud-based data warehouse platforms (e.g., Amazon Redshift, Snowflake, BigQuery, or PostgreSQL). Strong understanding of data modeling, warehousing concepts, and performance optimization. Ability to work independently and collaboratively in a fast-paced environment. Preferred Qualifications: Experience with version control (e.g., Git) and CI/CD processes for data workflows. Familiarity with REST APIs and web scraping best practices. Knowledge of data governance, privacy, and security best practices. Background in the telecommunications or ISP industry is a plus. Job Types: Full-time, Permanent Pay: ₹40,000.00 - ₹70,000.00 per month Benefits: Leave encashment Paid sick time Provident Fund Schedule: Day shift Monday to Friday Supplemental Pay: Overtime pay Performance bonus Work Location: In person
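The SQL reporting skills listed above (CTEs, window functions) can be sketched with Python's bundled sqlite3 module, which supports both since SQLite 3.25. The table, columns, and figures are invented for illustration:

```python
import sqlite3

# Invented per-customer usage table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE daily_usage (customer TEXT, day TEXT, gb REAL);
INSERT INTO daily_usage VALUES
  ('acme', '2024-01-01', 1.0), ('acme', '2024-01-02', 3.0),
  ('zeta', '2024-01-01', 5.0), ('zeta', '2024-01-02', 2.0);
""")

# CTE + window function: running total of usage per customer.
rows = conn.execute("""
WITH ordered AS (
  SELECT customer, day, gb,
         SUM(gb) OVER (PARTITION BY customer ORDER BY day) AS running_gb
  FROM daily_usage
)
SELECT customer, day, running_gb FROM ordered ORDER BY customer, day
""").fetchall()
print(rows)
```

The same query shape carries over to Redshift, Snowflake, or BigQuery; only the connection layer changes.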

Posted 3 weeks ago

Apply

0 years

0 Lacs

Ahmedabad

On-site

Key Responsibilities: Develop and maintain mobile applications using React Native. Build pixel-perfect, smooth UIs across both mobile platforms. Leverage native APIs for deep integrations with iOS and Android. Diagnose and fix bugs and performance bottlenecks. Maintain code and write automated tests to ensure the product is of the highest quality. Collaborate with Product, UI/UX, and Backend teams to design new features. Stay updated with the latest trends and best practices in mobile app development and the React Native ecosystem. Participate in code reviews and contribute to team knowledge sharing. Required Skills and Qualifications: Professional experience in React Native development. Experience with third-party library and API integration. Solid understanding of mobile app architecture and design patterns. Familiarity with native build tools such as Xcode, Android Studio, and Gradle. Experience with state management libraries like Redux, MobX, or Context API. Knowledge of RESTful APIs, JSON parsing, and integrating backend services. Good understanding of app deployment on the App Store and Play Store. Familiarity with testing frameworks like Jest, Detox, or similar. Understanding of code versioning tools such as Git.

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Location: Remote Duration: 2 Months Stipend: Performance-based (client conversion) About ATZ CRM ATZ CRM is an AI-powered ATS and CRM solution tailored for recruitment and staffing firms. It offers resume parsing, GPT-driven job descriptions, and smooth integrations via Zapier and Open API. Responsibilities Identify and reach out to potential clients Schedule product demos via email, LinkedIn, and calls Qualify leads and update CRM records Collaborate on outreach strategies and track performance Requirements Clear communication skills Basic familiarity with CRM and outreach tools Proactive, organized, and self-driven Benefits Commission-based earnings for each successful client conversion Internship certificate and recommendation letter Remote flexibility Skill-building in B2B lead generation

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

India

On-site

About Codewalla At Codewalla, we don’t just experiment with AI—we ship AI-native products that scale. Born in New York and growing across India, we partner with ambitious startups to build the next generation of AI-accelerated software. Our work is fast-paced, deeply technical, and relentlessly user-focused. From LLM-powered copilots to vector search-backed dashboards, we bring AI from concept to production—without losing sight of cost, latency, or reliability. We don’t just believe in smart code. We believe in delivering smart experiences. About the Role We’re hiring an AI Applications Developer with 6–8 years of engineering experience —including at least 1 year building and shipping LLM-powered features in production. Your mission: translate raw model capability into lean, reliable, and user-ready features. You’ll work on Model Context Protocol (MCP) servers, build agentic clients, architect RAG pipelines, and automate LLM evaluations to ensure every release delivers measurable value. If you thrive on rapid iteration, prompt experimentation, and seeing your code make it into users’ hands—we’d love to hear from you. 
What You’ll Work On Build MCP servers and agentic clients that handle user intent parsing, tool orchestration, and structured response generation Architect efficient RAG pipelines with chunk decay, latency budgeting, and cost-aware vector search Automate evaluation pipelines that test LLM outputs for relevance, accuracy, and coherence Work closely with DevOps to codify and deploy infrastructure using CDK or Terraform Set up observability dashboards for prompt performance, latency, and failure traceability Continuously refine prompts, embeddings, and model behavior based on user feedback and regression tests What Makes You a Great Fit 6–8 years of full-stack or backend development experience, with at least 1 year building AI-powered or LLM-based applications AI-native mindset: test fast, trace deeply, pause to reframe when needed Strong Python and TypeScript skills Experience with either AWS or GCP stacks, such as: AWS: Lambda, Bedrock, DynamoDB, OpenSearch Vector Search GCP: Cloud Functions, Vertex AI, Firestore, BigQuery, Vector Search Familiarity with LangChain, Bedrock SDK, and vector database schema design Understanding of prompt design, embeddings, and agentic workflows CI/CD fluency—GitHub Actions, containerized deployment, test-first habits Experience with LLM evaluation tools like Promptfoo, LangSmith, or Guardrails Bonus: Experience with MLflow, LaunchDarkly, Inferentia/GPU tuning Tools & Tech We Work With Languages: Python, TypeScript Frameworks: LangChain, FastAPI, Next.js Cloud: AWS (Bedrock, Lambda, DynamoDB, OpenSearch Vector Search) or GCP (Vertex AI, Cloud Functions, Firestore, BigQuery, Vector Search) Dev Tools: GitHub Copilot, Cursor Evaluation & Safety: Promptfoo, LangSmith, Guardrails DevOps: GitHub Actions, CDK or Terraform, Docker, Prometheus, Grafana Why Join Codewalla? 
Work at the forefront of AI-native product development Ship features that go from prototype to production, not just to playgrounds Collaborate with world-class teams building real-world tools for global users Influence everything—from prompt strategy to model integration and deployment pipelines Your code will shape actual user experience—not just a research slide deck Inclusion Matters We’re an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all team members. Ready to build the future with LLMs—without waiting for the future to catch up? Apply now and let’s build together.
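One concrete piece of the RAG pipeline work described above is splitting documents into overlapping chunks before embedding, so retrieval does not lose context at chunk boundaries. A minimal sketch; sizes here are in words for clarity, while production code would usually count tokens and tune size and overlap against the latency and cost budgets mentioned in the posting:

```python
def chunk_words(text, size=8, overlap=2):
    """Split text into word windows of `size` words, each sharing
    `overlap` words with the previous window."""
    words = text.split()
    step = size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + size]))
        if start + size >= len(words):
            break
    return chunks

doc = ("model context protocol servers expose tools to agents "
       "while the client parses user intent and orchestrates calls")
for c in chunk_words(doc):
    print(c)
```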

Posted 3 weeks ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Help Build the AI Engine for a Greener Future Position: AI & Data Intern – Programming, Automation & Sustainability 📍 Location: On-site (Delhi NCR) 🕒 Duration: Minimum 3 months (extendable to 6 months) 💰 Stipend Provided | 📜 Certificate | 🚀 Pre-Placement Offer Opportunity About Novasensa Novasensa is a deep-tech sustainability company working at the cutting edge of lithium-ion battery and e-waste recycling. Our in-house technologies extract valuable metals from used electronics—supporting clean energy, circular economy, and India’s mineral self-reliance. We’re now building a custom AI Assistant to automate experiments, analyze lab data, and support decision-making across our operations. You’ll join our R&D team to help shape this tool and apply your programming skills to real sustainability challenges. Why This Internship Use your Python and data skills on a real-world AI system Learn how AI can accelerate clean-tech innovation Get mentorship from a startup team working at the intersection of sustainability, engineering, and software Gain experience in automation, data visualization, and semantic search Your work will contribute to India’s green growth and circular economy mission What You’ll Work On As an AI & Data Intern , you’ll support development and testing of our internal AI tools. You don’t need to be an expert—we’ll help you learn as you build. 
Programming & Automation Write Python scripts to clean, organize, and process experimental data Automate repetitive R&D tasks like documentation or calculations Help build backend tools for experiment tracking and analysis Data Analysis & Visualization Analyze lab results and generate visual summaries Use Excel, Python (Pandas, Matplotlib), or Streamlit to build dashboards Track patterns and insights from recycling experiments AI Assistant Development (with guidance) Help with document parsing and semantic search Learn how LangChain, OpenAI, and vector databases (like Pinecone) work Test and improve the performance of our AI assistant using real lab data Who We’re Looking For We’re looking for curious, motivated learners who want to apply their coding and data skills to real sustainability work. Must-Have Basic to intermediate Python programming skills Understanding of data structures, functions, and file handling Familiarity with data visualization (Matplotlib, Seaborn, Excel, etc.) Interest in AI and willingness to learn new tools Good to Have Exposure to AI tools like OpenAI, LangChain, or vector search (even at beginner level) Experience with data cleaning and automation Interest in clean energy, recycling, or sustainability Education Background Candidates should be pursuing or have recently completed any of the following degrees: B.Tech / B.E. / M.Tech in: Computer Science Data Science / AI / ML B.Sc / M.Sc in: Computer Science Statistics Mathematics What Past Interns Say: “Novasensa is one of the few startups in India where you’re building deep-tech from first principles—with a real mission. I got to code, test, and deploy modules that are now part of the AI system used daily by the team.
It changed how I think about software’s role in sustainability.”— Karthik M., BITS Pilani, Summer Intern 2024 How to Apply Email hr@novasensa.com with: • Your resume (PDF) • A short note (max 300 words) telling us: • Why you want to intern at Novasensa • What you’re hoping to learn and contribute 🗓 Applications are reviewed on a rolling basis. Early applicants are preferred. 🌍 Learn. Build. Impact. Join us to use your code for a cleaner planet.
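The data-cleaning and summarization work described in the posting above can be sketched in a few lines. A minimal standard-library example (the team's stack reportedly includes Pandas, Matplotlib, and Streamlit; all sample values and field names here are invented):

```python
import statistics

# Hypothetical lab readings: (experiment_id, metal, recovery_pct).
# Values and names are illustrative, not real Novasensa data.
raw = [
    ("EXP-01", "Li", 91.0),
    ("EXP-01", "Li", None),    # sensor dropout -> discard
    ("EXP-02", "Co", 88.5),
    ("EXP-02", "Co", 187.0),   # >100% recovery is implausible -> discard
    ("EXP-03", "Li", 93.0),
]

def clean(rows):
    """Keep rows with a numeric recovery in the plausible 0-100% range."""
    return [r for r in rows if r[2] is not None and 0 <= r[2] <= 100]

def summarize(rows):
    """Mean recovery per metal -- the kind of figure a dashboard would chart."""
    by_metal = {}
    for _, metal, pct in rows:
        by_metal.setdefault(metal, []).append(pct)
    return {m: round(statistics.mean(v), 1) for m, v in by_metal.items()}

print(summarize(clean(raw)))   # -> {'Li': 92.0, 'Co': 88.5}
```

In the real tool, the same clean/summarize split carries over naturally to Pandas (`dropna`, `groupby().mean()`) with Matplotlib or Streamlit on top for the dashboard.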

Posted 3 weeks ago

Apply

7.0 - 8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

.NET Lead requirement for ITC Infotech - Kolkata. Job Location - Kolkata. 5 days work from office. Mode of interview - 1st round through MS Teams and 2nd round face-to-face discussion. Required: candidates who can join within 30 days. Experience : 7 to 8 Years Required Skills Programming skills: .NET Core (C#), Angular, React, JavaScript, and SQL Server. Data formats: JSON and XML data formats, including parsing, serialization, and deserialization. Integration experience: Experience with integrating manufacturing systems (MES, SCADA, etc.) with enterprise applications (SAP, ERP, etc.) is a plus but not mandatory. ETL tools: Experience with ETL tools such as Microsoft SSIS, Talend, or Apache NiFi. SAP integration: Experience integrating with SAP systems using SAP PI, SAP PO, or other integration technologies. Database skills: Strong understanding of database design, data modelling, and SQL Server. Deployment and debugging: Experience with deploying and debugging bespoke applications. Key Responsibilities Design and develop integrations: Between manufacturing systems (MES, SCADA, etc.) and enterprise applications (SAP, ERP, etc.) using .NET & .NET Core, JavaScript, JSON, XML, and SQL Server. Bespoke development: Design, develop, test, deploy & debug custom solutions using .NET Core to meet specific business requirements. Deployment: Deploy bespoke applications to production environments, ensuring smooth transition and minimal downtime. Debugging: Troubleshoot and debug to identify and resolve issues in applications. SAP integration: Experience integrating with SAP systems using SAP PI, SAP PO, or other integration technologies. Integrate with ETL tools: Experience with ETL tools such as Microsoft SSIS, Apache NiFi, or Talend. Data modelling and database design: Design and implement data models and database schemas to support integration requirements. Troubleshooting and support: Provide technical support and troubleshooting for integration issues.
Collaborate with stakeholders: Work with manufacturing teams, IT teams, and external partners to ensure successful project delivery. Nice to Have Skills MES and SCADA systems: Knowledge of Manufacturing Execution Systems (MES) and Supervisory Control and Data Acquisition (SCADA) systems. OPC applications: Experience with OPC (Open Platform Communications) applications and protocols. Cloud computing platforms: Experience with cloud computing platforms such as AWS and Azure. Industrial IoT: Knowledge of Industrial IoT (IIoT) concepts and technologies.
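The data-formats requirement above (JSON and XML parsing, serialization, and deserialization) is language-agnostic. A compact sketch, written in Python for brevity (the role itself uses .NET Core/C#, and the payload fields are invented): the same record arriving as JSON from one system and as XML from another, parsed into a common shape.

```python
import json
import xml.etree.ElementTree as ET

# Illustrative integration payloads (field names invented): one
# production-order record in two wire formats.
json_payload = '{"order": "PO-1001", "qty": 250}'
xml_payload = "<Order><Id>PO-1001</Id><Qty>250</Qty></Order>"

def from_json(text):
    d = json.loads(text)                       # JSON deserialization
    return d["order"], int(d["qty"])

def from_xml(text):
    root = ET.fromstring(text)                 # XML parsing
    return root.findtext("Id"), int(root.findtext("Qty"))

assert from_json(json_payload) == from_xml(xml_payload)   # both -> ('PO-1001', 250)
print(json.dumps({"order": "PO-1001", "qty": 250}))       # JSON serialization
```

In C# the equivalent pieces would be `System.Text.Json` and `System.Xml`; the normalization step in the middle is what makes MES-to-ERP integration tractable.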

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Malappuram, Kerala

On-site

Flutter Developer Intern Company: Cookee Apps LLP Location: On-site (Kozhikode, Kerala, India) Job Type: Internship (Full-time, 6 Months) Schedule: Day shift About Us Cookee Apps LLP is a fast-growing software company that builds innovative web and mobile solutions. We’re passionate about mentoring fresh talent through real-world, hands-on training and support. Position Overview We are seeking a proactive and enthusiastic Flutter Developer Intern for a 6-month, full-time internship. You will gain practical experience building cross-platform mobile applications using Flutter and Dart, working alongside our front-end, back-end, and UI/UX teams. Key Responsibilities Contribute to the development of mobile apps using Flutter and Dart. Collaborate with design and backend teams to implement responsive UI/UX and integrate RESTful APIs. Write clean, maintainable, and efficient code. Participate in code reviews, troubleshooting, and bug-fixing to improve app stability and performance. Assist in writing unit tests and contribute to documentation. Stay updated with emerging mobile technologies and Flutter best practices. Required Skills Strong fundamentals in Dart and Flutter development. Basic understanding of mobile development concepts (UI frameworks, state management, navigation). Familiarity with RESTful API integration and JSON parsing. Proficiency with Git version control. Solid problem-solving abilities and attention to detail. Strong communication skills and collaborative mindset. Preferred Qualifications Pursuing or completed a degree/certification in Computer Science, Software Engineering, or related field. Portfolio or GitHub showcasing Flutter/Dart projects (academic, personal, or hackathon). Experience using state management solutions (e.g., Provider, BLoC, GetX). Exposure to unit testing in Flutter, CI/CD pipelines, or Firebase integration.
What We Offer Internship Certificate upon successful completion. Letter of Recommendation for outstanding performers. Real-time exposure to industry-level codebases and agile development processes. Mentorship from senior developers and the possibility of a full-time role post-internship. Duration & Schedule 6 months full-time commitment Day shift , On-site at Kozhikode, Kerala How to Apply Submit your resume , GitHub portfolio , and a brief statement of interest to career@cookee.io Job Type: Internship Schedule: Day shift Work Location: In person

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Title : Sr. Manager Function : Lab Informatics - IT Location : Hyderabad (preferable) or Bangalore At VIATRIS, we see healthcare not as it is but as it should be. We act courageously and are uniquely positioned to be a source of stability in a world of evolving healthcare needs. Viatris empowers people worldwide to live healthier at every stage of life. We do so via: Access – Providing high quality trusted medicines regardless of geography or circumstance; Leadership – Advancing sustainable operations and innovative solutions to improve patient health; and Partnership – Leveraging our collective expertise to connect people to products and services. Every day, we rise to the challenge to make a difference and here’s how the Lab Informatics role will make an impact: Role Purpose To implement, configure and enhance LabWare LIMS usage in Viatris and support users who are using LIMS in their labs across Viatris. Key Responsibilities Installs/configures LabWare LIMS modules and templates related to analysis, testing, product spec/characteristics, etc. in the system, and incorporates required lab processes and controls in LIMS as assigned and approved. Effects changes to database tables and fields, templates, and scripts where necessary through a change management process Independently writes/updates/reviews subroutines/programs, query tags, user dialogs, visual workflows, Crystal Reports, menu routines, LIMS Basic code, etc.
Able to configure application interfaces with SAP, Empower/Chromeleon, and instruments (direct & file-based) via lab station/parsing script functions Implements method execution/experiment template functionality in labs, with the ability to write macros Able to understand LabTrack items, install bug fixes, and perform necessary testing before deployment to higher instances Handles escalations related to product functionality which could not be solved by designated Lab SMEs or the Run team Plays an assigned technical role effectively during new project implementations, roll-outs, enhancements, etc. Thinks proactively about where improvements are needed in the system, brings them up for discussion with Leads, and follows through until effective Takes care of improvement or support needs w.r.t. instrument interfaces and application interfaces with other software Supports Lab teams during Regulatory audits for any technical help/explanations when necessary Works with Infrastructure teams in ensuring adequate hardware design and provisioning for smooth LIMS software functioning Ensures the system is properly validated and continues to be in a validated condition, with a very good understanding of Computer System Validation and GxP processes, working closely with CSV professionals and functional reviewers Determines Master Data requirements and guides/hand-holds Master Data team members in all aspects Ensures appropriate User Management/Admin rights are provisioned in the system Hand-holds Site SMEs in LIMS operations and implementations through proper training and support Plays the role of a technical member effectively during implementation of projects Supervisory/Management Responsibilities IC role Qualification and Experience BE/B.Tech/ME/M.Sc Min 10 years of experience in LabWare LIMS build or implementation, with validation knowledge as per GAMP
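The file-based instrument interfacing mentioned above typically comes down to a parsing script that maps an instrument's text export onto LIMS result fields. A sketch under an invented export format (a real LabWare parsing script would target the actual file specification and LIMS Basic rather than Python):

```python
# Invented pipe-delimited instrument export: SAMPLE records carry results,
# other record types (COMMENT, etc.) are metadata to be skipped.
instrument_export = """\
SAMPLE|S-2301|ASSAY|99.2
SAMPLE|S-2302|ASSAY|98.7
COMMENT|calibration ok
SAMPLE|S-2303|LOD|0.45
"""

def parse_results(text):
    """Map SAMPLE records onto (sample, test, result) fields for LIMS entry."""
    results = []
    for line in text.splitlines():
        fields = line.split("|")
        if fields[0] != "SAMPLE":      # skip comment/header records
            continue
        _, sample_id, test, value = fields
        results.append({"sample": sample_id, "test": test, "result": float(value)})
    return results

for rec in parse_results(instrument_export):
    print(rec)
```

The same skeleton covers the "direct & file-based" split: a direct interface replaces the file read with a live connection, but the field-mapping step stays.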

Posted 3 weeks ago

Apply

8.0 years

30 - 35 Lacs

India

Remote

Job Title: AI Architect – Discovery Phase Lead Experience: 8+ Years Location: Remote Employment Type: Contract Domain: Financial Services / Enterprise AI Position Overview We are looking for a visionary AI Architect to lead the discovery and technical planning phase of a high-scale, next-generation AI platform designed for the financial services sector. This role involves deep engagement with stakeholders, crafting a modular, future-proof AI architecture, and translating regulatory-driven business requirements into an actionable roadmap. You will play a critical role in validating technical feasibility, cost efficiency, and long-term scalability for a platform aimed at 100K+ users. Key Responsibilities 🔍 Discovery Phase Leadership Engage directly with clients and stakeholders to identify use cases across semantic search, document processing, predictive analytics, and agentic workflows. Translate business and regulatory requirements (e.g., 100% accuracy for financial compliance) into detailed technical specifications. Design modular, vendor-neutral architecture aligned with long-term strategic flexibility. Conduct cost-benefit analysis and provide clear ROI justifications against alternatives like Salesforce. 🏗️ Technical Architecture Design Architect a hybrid AI stack using DSPy, LangGraph, PromptFlow, and Azure AI Services. Ensure scalability and cost-efficiency for 100K+ user base, while mitigating vendor lock-in. Evaluate and recommend AI frameworks, ensuring seamless integration with existing enterprise ecosystems. Lead prototype development and technical feasibility studies to de-risk downstream builds. 📈 Strategic Planning and Roadmapping Deliver a phased implementation roadmap from pilot to full production. Propose a future evolution plan to prevent costly reengineering. Define integration points for CRM, data lakes, compliance systems, and other internal platforms. Collaborate with enterprise architects and DevOps to ensure system readiness. 
Technical Skills & Required Expertise 🧠 AI/ML Technologies Strong expertise in DSPy (prompt optimization, few-shot learning) Hands-on with LangGraph (multi-agent orchestration) Proficiency in Azure AI & PromptFlow Deep understanding of RAG architectures and semantic search systems ☁️ Cloud & Infrastructure Hands-on experience in Azure Cognitive Services, AI Foundry, and Microservices Strong knowledge of REST APIs, cloud cost optimization, and monitoring Experience designing for enterprise scale (50K+ to 100K+ users) 🧾 Domain Knowledge (Preferred) Familiarity with financial services regulations, audit trails, and compliance architecture Experience in parsing legal documents (e.g., LPAs, fund documents) Understanding of investment modeling and sentiment analytics Required Experience 8+ years in AI/ML architecture roles, with enterprise-scale system delivery Proven experience leading discovery phases, building roadmaps, and creating technical artifacts Prior experience designing hybrid AI platforms with open frameworks (e.g., DSPy, LangChain, LangGraph, PromptFlow) Hands-on experience developing proof-of-concepts and prototypes Soft Skills & Leadership Attributes Ability to present technical strategy to C-level stakeholders Expert in requirement gathering, solution mapping, and stakeholder management Strong technical writing skills and documentation ownership Confident in client interactions, including mock discovery and technical assessments Preferred Qualifications MS/PhD in Computer Science, AI/ML, or related field Publications or patents in enterprise AI architecture Speaker at AI industry conferences Contributions to open-source AI tooling Skills: azure ai services,analytics,enterprise,ai/ml technologies,compliance,cloud,architecture,financial services,azure,promptflow,azure cognitive services,ml,dspy,ai foundry,rest apis,langgraph,microservices
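The semantic-search component of such a platform can be illustrated with a toy retrieval step. Here bag-of-words vectors and cosine similarity stand in for the embedding model and vector store (Azure AI Search, Pinecone, etc.) that a production RAG design would use; the documents are invented:

```python
import math
from collections import Counter

# Invented corpus of financial-services document snippets.
docs = [
    "limited partnership agreement distribution waterfall terms",
    "quarterly sentiment analytics for fund managers",
    "audit trail requirements for financial compliance",
]

def vec(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus):
    """Return the corpus document most similar to the query."""
    q = vec(query)
    return max(corpus, key=lambda d: cosine(q, vec(d)))

print(retrieve("compliance audit trail", docs))
# -> audit trail requirements for financial compliance
```

In the architecture the posting describes, swapping `vec` for a learned embedding and `retrieve` for a vector-store query is exactly the vendor-neutrality seam worth designing for.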

Posted 3 weeks ago

Apply

3.0 years

4 - 6 Lacs

Ahmedabad, Gujarat, India

On-site

Role Purpose & Context The Senior Account Executive in invoice management is responsible for managing and overseeing daily operational activities, ensuring accuracy and timeliness of invoice production and reconciliation, handling roaming partners, internal peers, and client interactions, and supporting key operational processes such as invoicing, reconciliation, allocation, agreements, CNDN adjustments, reporting, and other operational support. The role also involves mentoring junior operations staff and contributing towards process efficiencies. Key Responsibilities Managing the end-to-end process of data loading and invoice generation for GSM and SMS data, ensuring completion within deadlines. This includes updating tracker sheets for missing roaming agreements and maintaining the data parsing sheet during invoice preparation Conducting thorough checks on error logs and performing sanity checks on GSM and SMS data received from DCH/clients, including identifying duplicate TAP file billing, hub invoices, and invoices with RAP or discounts Creating and updating issue logs and communicating with relevant team members to address and resolve queries Allocating inward receipts daily and taking appropriate actions in accordance with defined KPIs.
This includes following up on missing payment notifications and missing invoices Uploading and reconciling all types of incoming payable invoices, addressing all associated queries efficiently Escalating complex queries to account managers, clients, or partners for timely resolution Performing all responsibilities of an account handler, such as raising credit note applications, validating inward credit notes, following up on missing roaming agreements, and resolving issues related to unlisted roaming partners or absence of payable data Handling the ticketing system for the relevant process to ensure prompt action as per requirements Providing support in testing activities and preparation of process documentation Collaborating with cross-functional teams to ensure smooth operation of all processes Requirements Very good communication skills Methodical and thorough working style Detailed way of working Strong team worker Creative and innovative way of thinking Good verbal and written communication in English Experience in Accounts Receivable and Payables Management MS Office A university degree is desirable; an accounting background will be an advantage 3 years' prior work experience, ideally gained in finance or administration Benefits Health Insurance Provident Fund, Gratuity 5 days working (Monday-Friday) Quarterly employee engagement activities
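The duplicate TAP file check mentioned in the responsibilities above reduces to counting file identifiers within a billing run. A minimal sketch (the file names are invented; real TAP file naming follows GSMA conventions):

```python
from collections import Counter

# Invented TAP file identifiers for one billing run; the same file appearing
# twice would mean duplicate billing and must be flagged before invoicing.
tap_files = [
    "CDINDA1AAAUS01234",
    "CDINDA1AAAUS01235",
    "CDINDA1AAAUS01234",   # duplicate -> must not be billed twice
]

def find_duplicates(files):
    """Return the sorted list of file identifiers seen more than once."""
    counts = Counter(files)
    return sorted(f for f, n in counts.items() if n > 1)

print(find_duplicates(tap_files))   # -> ['CDINDA1AAAUS01234']
```

In practice this sanity check would run against the data parsing sheet before invoice generation, so duplicates surface as issue-log entries rather than billed charges.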

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

Remote

We’re Hiring | DevSecOps Engineer Location : Remote (India) Urgent Requirement – Quick Closures Expected! We’re on the lookout for a passionate and skilled DevSecOps Engineer / Security Analyst with 4+ years of experience for a leading publishing company. If you have expertise in cloud security, incident response, security automation, and scripting, this role is for you! Security Engineer/DevSecOps/Security Analyst (SOC) Experience : 4+ years Security Automation for a Publishing Company Good understanding of code security and web application security, or of systems/infra security on Windows and Linux. Proven and demonstrated passion for cyber security with at least 5+ years of relevant experience. Good understanding of security operations, network security, threat intelligence, and incident response. SIEM configuration (particularly QRadar). Incident and alarm response procedures, engagement with operations teams to manage incidents. Experience/understanding of cloud-based services (AWS), technologies, and providers (e.g., SaaS, IaaS, PaaS, etc.) Experience with writing queries, parsing, and correlating data. Technical understanding of Palo Alto firewalls, IDS/IPS, and WildFire features The ability to perform analysis of log files from multiple devices and environments, and identify indicators of security threats. Strong understanding of parsing and analyzing web, system, and security logs Strong technical knowledge across a range of server and gateway platforms, including Linux/Unix/Windows/Mac Demonstrable knowledge of scripting/programming tools such as PowerShell, Python Understanding of VPN infrastructure and 2FA like Okta Deep understanding of network protocols and security: TCP/IP, UDP, DHCP, FTP, SFTP, SNMP, SMTP, SSH, SSL, VPN, RDP, HTTP, and HTTPS. Familiar with YARA, STIX, TAXII, and OpenIOC for threat intelligence.
Excellent verbal and written communication skills; ability to articulate technical knowledge to non-technical audiences; production of policy/standards/project documentation Knowledge of data leakage prevention tools (DLP/CASB/web security) is an add-on Having a certification background in any one of GCIH, GCIA, GPEN, OSCP, or other relevant certifications within cyber security is highly advantageous. VM scanning with Qualys is good to have. Experience in handling phishing attacks using Proofpoint, CLEAR, TRAP, and TAP. Experience in EDR solutions, simulating setups like Kali Linux. Experience in web security CDN (Cloudflare/Akamai/CloudFront) or any WAF.
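Log parsing of the kind listed above, pulling failed-login source IPs out of sshd-style auth lines to surface brute-force candidates, can be sketched as follows (the log lines are synthetic samples in the common OpenSSH format):

```python
import re
from collections import Counter

# Synthetic auth-log samples in the usual OpenSSH sshd format.
log_lines = [
    "Jan 10 03:12:01 host sshd[811]: Failed password for root from 203.0.113.7 port 54022 ssh2",
    "Jan 10 03:12:04 host sshd[811]: Failed password for root from 203.0.113.7 port 54023 ssh2",
    "Jan 10 03:15:30 host sshd[902]: Accepted password for deploy from 198.51.100.4 port 51122 ssh2",
    "Jan 10 03:16:11 host sshd[955]: Failed password for admin from 192.0.2.9 port 40100 ssh2",
]

FAILED = re.compile(r"Failed password for \S+ from (\d+\.\d+\.\d+\.\d+)")

def failed_logins_by_ip(lines):
    """Count failed-login attempts per source IP (a brute-force indicator)."""
    counts = Counter()
    for line in lines:
        m = FAILED.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

print(failed_logins_by_ip(log_lines).most_common())
# -> [('203.0.113.7', 2), ('192.0.2.9', 1)]
```

A SIEM such as QRadar does the same extraction declaratively via log source extensions and correlation rules; the scripted form is what the "writing queries, parsing, and correlating data" requirement looks like in ad-hoc triage.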

Posted 3 weeks ago

Apply

2.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title : App Developer Location : Bengaluru, India Company Overview At IAI Solutions (https://www.iaisolution.com/), we create innovative software solutions that transform how businesses operate. We are seeking a skilled Mobile App Developer with a passion for building robust, high-performance cross-platform applications. If you have hands-on experience in Flutter and Dart and enjoy solving complex problems in real-time audio/video scenarios, we want you on our team. Position Summary We are looking for a Mobile App Developer with 2-3 years of professional mobile app development experience - at least 1-2 years of which must be dedicated to Flutter and Dart for cross-platform applications. The ideal candidate will be well-versed in state management using Riverpod (or similar frameworks), comfortable integrating WebRTC for real-time audio/video, and capable of handling end-to-end mobile app lifecycles (from UI design to App Store/Play Store submission). You should enjoy working on streaming/chat-style features, be adept at managing permissions and device integration (camera, microphone), and have a keen eye for responsive UI/UX design in Flutter. Key Responsibilities Develop, maintain, and optimize cross-platform mobile applications using Flutter and Dart, ensuring clean architecture and modular code. Implement and manage complex app state using Riverpod (e.g., StateProvider, FutureProvider, AsyncNotifier) or similar frameworks (Provider, Bloc, GetX). Integrate WebRTC (via flutter_webrtc) for real-time audio/video functionality, including peer-to-peer streaming, signaling (Firebase/WebSocket), and handling STUN/TURN servers. Build responsive, user-friendly interfaces in Flutter, designing video grids, call controls, and other UI components that adapt seamlessly across various screen sizes. Handle real-time UI updates for dynamic states such as mute/unmute, participant joins/leaves, and other call-related events.
Configure and debug iOS builds in Xcode and Android builds in Android Studio, including provisioning profiles, signing certificates, and deployment pipelines. Manage App Store (iOS) and Play Store (Android) submission processes : prepare provisioning profiles, handle versioning, and ensure compliance with store guidelines. Integrate RESTful APIs and/or WebSocket connections for server communication, handling JSON payloads and ensuring efficient data parsing and error handling. Manage device permissions (camera, microphone, network) using packages like permission_handler, ensuring a smooth user experience and handling permission-related edge cases. Write unit tests and widget tests for UI components and business logic; debug cross-platform and WebRTC issues in both Xcode and Android Studio. Implement basic security measures and privacy compliance (e.g., encryption for media streams, GDPR/CCPA requirements) to protect user data and adhere to regulatory standards. Collaborate closely with designers, backend engineers, and QA teams to define requirements, review code, and deliver end-to-end features on schedule. Stay updated with the latest Flutter/Dart releases, WebRTC enhancements, and mobile-app-related best practices; share knowledge and mentor junior developers as needed. Assist in troubleshooting live production issues, perform root-cause analysis, and release timely hotfixes or updates. Qualifications Overall Experience : 2-3 years of professional mobile app development. Flutter & Dart : 1-2 years of dedicated experience building cross-platform apps. State Management : Minimum 1 year using Riverpod (or Provider, Bloc) for complex app states and handling real-time UI updates (e.g., in chat/streaming apps). WebRTC : 6-12 months of hands-on experience integrating flutter_webrtc or equivalent; familiarity with signaling mechanisms (Firebase, WebSocket).
Cross-Platform Development : At least 1 year each working with : iOS : Configuring Xcode, provisioning, signing, and debugging. Android : Configuring Android Studio, managing Gradle, signing, and debugging. App Store/Play Store : Experience in app submission workflows, including provisioning profiles (iOS) and signing/build configurations (Android). Networking & APIs : 1-2 years integrating REST APIs or WebSocket in mobile apps; strong understanding of JSON. UI/UX Design : 1+ years building responsive, user-friendly interfaces in Flutter (e.g., grids, custom controls). Permissions & Device Integration : 1+ years working with device permissions (camera, mic, network) and integrating native plugins. Testing & Debugging : 1+ years writing unit/widget tests; debugging cross-platform issues, especially around WebRTC, in Xcode and Android Studio. Security & Privacy : 6-12 months implementing encryption for media streams and ensuring compliance with data privacy standards (e.g., GDPR, CCPA). Must-Have Skills Proficient in Flutter and Dart for cross-platform UI and business logic Expert in state management with Riverpod and familiar with Provider, Bloc, or GetX Experienced integrating flutter_webrtc for real-time audio/video streaming Skilled in implementing signaling using Firebase or custom WebSocket Proficient with Xcode for iOS build configuration, provisioning, and debugging Proficient with Android Studio for Gradle management, signing, and debugging Knowledgeable about App Store and Play Store submission requirements Experienced in integrating REST (HTTP/JSON) and WebSocket APIs Skilled in designing responsive video conferencing UIs (video grids, call controls) in Flutter Competent in managing camera, microphone, and network permissions using permission_handler Able to write unit and widget tests for UI and logic in Flutter Experienced in debugging WebRTC and platform-specific issues in Xcode and Android Studio Familiar with implementing basic encryption for media
streams Knowledgeable about GDPR and CCPA compliance requirements Good-to-Have Skills Advanced WebRTC optimization (STUN/TURN, SFU/MCU) Backend development (Node.js, Firebase, AWS) Push notifications (FCM, CallKit for iOS, Android call screens) Performance optimization (app size, CPU, platform channels) Advanced UI/UX (virtual backgrounds, screen sharing, animations) Analytics and monitoring (Sentry, Firebase Analytics, WebRTC metrics) CI/CD and DevOps pipelines with GitHub Actions/Bitrise Streamlined App Store and Play Store deployments Accessibility and localization (i18n, RTL, screen reader support) Native development (Swift/Kotlin for platform-specific features) AI/ML integration (noise cancellation, live captions, face detection) Preferred Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. Strong Git workflow experience (feature branches, code reviews, pull requests). Demonstrated ability to mentor junior developers and conduct code reviews. Proven track record of delivering at least one production-grade Flutter app to both App Store and Play Store. Familiarity with Agile/Scrum methodologies and collaborative tools (Jira, etc.). Benefits : Competitive salary with performance-based bonuses. Opportunity to work on cutting-edge real-time audio/video applications. Flexible working hours. Access to the latest development tools and technologies. Professional development budgets for conferences, courses, and certifications. Wellness programs and team-building events.

Posted 3 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies