2.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Why Choose Bottomline? Are you ready to transform the way businesses pay and get paid? Bottomline is a global leader in business payments and cash management, with over 35 years of experience and moving more than $16 trillion in payments annually. We're looking for passionate individuals to join our team and help drive impactful results for our customers. If you're dedicated to delighting customers and promoting growth and innovation - we want you on our team!

We are looking for a Production Support Engineer to innovate, win, and grow with us! Bottomline Technologies' Digital Banking and Business Solutions division serves a variety of top-tier banks and financial institutions. We provide our customers with web-based software for online banking, payments management, cash management, treasury, and accounts payable automation. We are seeking an incredible new team member to provide support and ensure customer satisfaction. We are looking for an expert at troubleshooting complex customer production problems in a variety of environments, including Bottomline-hosted application environments. You should have experience serving as a senior member of a technical production support team, be a master problem solver, and have a propensity for driving problems to resolution with an eye on creating the best outcomes for our customers and our business.

If you answer YES to the questions below, we would love to speak with you!
Do you have strong troubleshooting skills and excel at problem solving?
Do you enjoy learning and digging to know more?
Do you want to work directly with customers, helping to delight and ensure excellent communication?
Do you enjoy working cooperatively across teams to gather issue information and drive to resolution?
Are you comfortable speaking to a crowd?

Do you have the determination and desire to achieve the following:
Perform production support, troubleshooting, and maintenance.
Work with Development and Implementation engineering teams on incidents and defects.
Serve as a resource for operational processes and technology, supporting the platform and customer base.
Manage queries in accordance with a structured case management discipline to achieve the required Service Level Agreements.
Develop and maintain SQL for troubleshooting and monitoring.
Perform log analysis to assist in locating root cause.
Participate in an on-call rotation schedule for out-of-hours support.
Provide occasional “after-hours” high-severity issue support.

What Will Make You Successful
Bachelor's degree, preferably in Computer Science, and 2-3 years of prior experience in a Production Support role.
Demonstrated working knowledge of Java-based web applications.
Previous experience in a customer-facing role, including excellent written and oral communication skills.
Outstanding analytical and triage skills to review application logs and identify root cause.
Ability to prioritize effectively and handle shifting priorities professionally.
Participation in the complete life cycle of an enterprise-wide, web-based application.
Good understanding of networking principles.
Databases and the SQL query language.
Unix/Linux bash command skills.

Advantageous Skills: ELK, Grafana, REST API calls, Postman, Git, GitLab, Bitbucket, PyCharm, Oracle WebLogic, IBM WebSphere, Tomcat, Apache

We welcome talent at all career stages and are dedicated to understanding and supporting additional needs. We're proud to be an equal opportunity employer, committed to creating an inclusive and open environment for everyone.
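For illustration only, here is a minimal Python sketch of the kind of log analysis this role describes, tallying errors to help narrow down a likely root cause. The file name, log line format, and field names are hypothetical, not Bottomline's actual logging conventions.

```python
# Illustrative sketch only: log path and line format are hypothetical.
import re
from collections import Counter
from pathlib import Path

LOG_FILE = Path("app-server.log")  # hypothetical log location
ERROR_PATTERN = re.compile(
    r"ERROR\s+\[(?P<component>[^\]]+)\].*?(?P<exception>\w+Exception)"
)

def summarize_errors(log_path: Path) -> Counter:
    """Count exceptions per component to help narrow down a likely root cause."""
    counts = Counter()
    with log_path.open(errors="ignore") as fh:
        for line in fh:
            match = ERROR_PATTERN.search(line)
            if match:
                counts[(match.group("component"), match.group("exception"))] += 1
    return counts

if __name__ == "__main__":
    for (component, exception), count in summarize_errors(LOG_FILE).most_common(10):
        print(f"{component:30s} {exception:30s} {count}")
```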
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Skills: Hadoop, Spark. We are looking for a senior PySpark developer with 6+ years of hands-on experience to:
Build and manage large-scale data solutions using tools like PySpark, Hadoop, Hive, Python, and SQL.
Create workflows to process data using IBM TWS.
Use PySpark to create different reports and handle large datasets.
Use HQL/SQL/Hive for ad-hoc data queries, generate reports, and store data in HDFS.
Deploy code using Bitbucket, PyCharm, and TeamCity.
Manage team members, communicate with several teams, and explain problems and solutions to the business team in a non-technical manner.
Primary skill: PySpark (Hadoop/Spark).
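For context, a minimal PySpark sketch of the kind of report job described above: read a Hive table with an ad-hoc query, aggregate it, and persist the result to HDFS. The database, table, column, and path names are hypothetical.

```python
# Minimal sketch: database, table, column, and path names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-transaction-report")
    .enableHiveSupport()          # lets us query Hive tables directly
    .getOrCreate()
)

# Ad-hoc HQL query against a Hive table
transactions = spark.sql(
    "SELECT account_id, txn_date, amount FROM finance.transactions"
)

# Aggregate into a daily report
report = (
    transactions
    .groupBy("txn_date")
    .agg(F.count("*").alias("txn_count"), F.sum("amount").alias("total_amount"))
)

# Persist the report back to HDFS as Parquet
report.write.mode("overwrite").parquet("hdfs:///reports/daily_transactions")

spark.stop()
```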
Posted 1 month ago
0 years
3 - 7 Lacs
Hyderābād
On-site
About Harri: Harri is the frontline employee experience platform built for companies who have service at the heart of their business. The solution is built on the notion that the customer experience will never exceed the employee experience. The Harri suite of talent attraction, workforce management and employee engagement technologies enables organizations to attract, manage, engage and retain the best talent for their business. Hospitality is in our DNA, with most of our global team having frontline and management restaurant experience - we are changing the landscape of our industry and frontline worker technology. We need the very best and brightest to join us on this mission to disrupt the market as it stands today. Based in NYC, Harri has global offices in the UK, Palestine and India and has been awarded: Top 50 Startup by LinkedIn, Best Enterprise Solution for HR/Workforce by HR Tech Awards & NYC Best Tech Startup for the Tech in Motion Events Timmy Awards. If you're a builder or problem solver and love the fast pace of a startup, it's time to meet the Harri family.

Who you are: We are seeking an experienced Lead Backend Engineer. In this role, you will lead and mentor a team of engineers, provide technical leadership, and collaborate effectively with cross-functional teams, including product managers, frontend engineers, and QA, to define, develop, and deploy new features and enhancements.

Position description: The Lead Backend Engineer is responsible for designing, implementing, and maintaining scalable and efficient backend systems with a focus on performance, security, and reliability. The role requires expertise in Python, Django, databases (SQL & NoSQL), API development, and cloud services (AWS).

Role and Responsibilities: Duties and responsibilities for a Lead Backend Engineer position in our India team include, but are not limited to:
Write clean, modular, reusable, testable, and well-documented code that adheres to our coding standards and promotes maintainability.
Write unit tests and perform integration testing to ensure code reliability, maintainability, and seamless interaction between components.
Troubleshoot and debug complex production issues, identifying root causes and implementing effective solutions in a timely manner.
Design, implement, and maintain scalable, efficient, and secure backend systems, with a focus on performance and reliability for our global user base.
Maintain alignment with Harri's global team(s) coding and design standards, ensuring consistency and interoperability.
Demonstrate the ability to deliver high-quality work within agreed timelines.
Proactively identify and implement optimizations to enhance system performance, ensuring high availability and responsiveness under varying loads.
Architect and implement robust security structures and design efficient and scalable data storage solutions.
Lead and mentor a team of engineers by providing technical guidance and support, including managing day-to-day activities, task assignments and deliverables.
Take ownership of team deliverables, ensuring high quality and timely execution.
Participate actively in team expansion efforts, including sourcing, screening, and interviewing potential engineer candidates.
Contribute to the onboarding and orientation of new team members.
Provide regular progress updates to your line manager, highlighting achievements against established goals and key performance indicators (KPIs), challenges encountered, and potential roadblocks that may impact timelines or objectives.
Collaborate effectively with cross-functional teams, including product managers, frontend engineers, and QA, to define, develop, and deploy new features and enhancements.
Actively participate in knowledge sharing sessions, code reviews, and other team activities to foster a strong collaborative culture and contribute to the growth of team members.
Stay current with relevant backend technologies, tools, and trends. Propose and drive the adoption of beneficial innovations, including AI.

Qualifications:
Bachelor's or Master's degree in Computer Science or a related field.
Strong knowledge of relational databases and SQL, including proficiency in database design principles and best practices, and a demonstrated ability to write and optimize complex SQL queries and stored procedures.
Experience with NoSQL databases like DynamoDB or MongoDB is highly desirable.
Experience in Python development.
Experience with Python web frameworks, specifically Django.
Experience with Python Object-Relational Mappers (ORMs) such as Django ORM and SQLAlchemy.
Excellent grasp of data structures, algorithms, and Object-Oriented Programming (OOP) principles.
Proven experience in designing and implementing RESTful APIs, GraphQL, gRPC, and sockets.
Proficient with Git for version control and collaborative development workflows, especially GitHub.
Hands-on experience with AI tools, IDEs, prompts, and protocols such as Cursor, Copilot agent mode with VSCode, PyCharm Copilot, MCP servers, and Copilot on GitHub.com.
Exceptional problem-solving and analytical abilities, with a proactive approach to identifying and resolving issues.
Solid understanding of the Software Development Life Cycle (SDLC) and agile methodologies.
Experience working effectively in Agile development environments (e.g., Scrum, Kanban), including SAFe.
Familiarity with JIRA issue tracking and project management tools, including defect lifecycle management.
Knowledge of shell scripting (e.g., bash) is a plus.
Experience with Service-Oriented Architecture (SOA) and microservices architectural patterns and best practices is a significant plus.
Practical experience working with Amazon Web Services (AWS) and its core services.
Experience with Continuous Integration/Continuous Deployment (CI/CD) tools and pipelines.
Demonstrated experience in leading and mentoring software engineers.
Experience in the hiring process, including sourcing and interviewing candidates.
Excellent verbal and written English communication skills, with the ability to articulate technical concepts clearly and concisely.
Strong interpersonal and collaboration skills, with a proven ability to work effectively within a team.
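For context, a minimal Django REST Framework sketch of the kind of backend work described above. It assumes an existing, configured Django project with djangorestframework installed; the Shift model, its fields, and the app label are hypothetical.

```python
# Sketch only: assumes an existing Django project with djangorestframework
# installed; the Shift model, its fields, and the app label are hypothetical.
from django.db import models
from rest_framework import serializers, viewsets

class Shift(models.Model):
    employee_name = models.CharField(max_length=120)
    starts_at = models.DateTimeField()
    ends_at = models.DateTimeField()

    class Meta:
        app_label = "scheduling"  # hypothetical app name

class ShiftSerializer(serializers.ModelSerializer):
    class Meta:
        model = Shift
        fields = ["id", "employee_name", "starts_at", "ends_at"]

class ShiftViewSet(viewsets.ModelViewSet):
    """CRUD REST endpoints for shifts; register with a DRF router in urls.py."""
    queryset = Shift.objects.order_by("starts_at")
    serializer_class = ShiftSerializer
```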
Posted 1 month ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job description
Job Title: AI Engineer
Salary: 4 - 5.4 LPA
Experience: Minimum 2 years
Location: Hinjewadi, Pune
Work Mode: Work from Office
Availability: Immediate Joiner

About Us: Rasta.AI, a product of AI Unika Technologies (P) Ltd, is a pioneering technology company based in Pune. We specialize in road infrastructure monitoring and maintenance using cutting-edge AI, computer vision, and 360-degree imaging. Our platform delivers real-time insights into road conditions to improve safety, efficiency, and sustainability. We collaborate with government agencies, private enterprises, and citizens to enhance road management through innovative tools and solutions.

The Role: This is a full-time, on-site role. As an AI Engineer, you will be responsible for developing innovative AI models and software solutions to address real-world challenges. You will collaborate with cross-functional teams to identify business opportunities and provide customized solutions. You will also work alongside talented engineers, designers, and data scientists to implement and maintain these models and solutions.

Technical Skills
Programming Languages: Python (and other AI-supported languages)
Databases: SQL, Cassandra, MongoDB
Python Libraries: NumPy, Pandas, Scikit-learn
Deep Neural Networks: CNN, RNN, and LLM
Data Analysis Libraries: TensorFlow, Pandas, NumPy, Scikit-learn, Matplotlib, TensorBoard
Frameworks: Django, Flask, Pyramid, and CherryPy
Operating Systems: Ubuntu, Windows
Tools: Jupyter Notebook, PyCharm IDE, Excel, Roboflow
Big Data (Bonus): Hadoop (Hive, Sqoop, Flume), Kafka, Spark
Code Repository Tools: Git, GitHub
DevOps-AWS: Docker, Kubernetes, instance hosting and management

Analytical Skills: Exploratory Data Analysis, Predictive Modeling, Text Mining, Natural Language Processing, Machine Learning, Image Processing, Object Detection, Instance Segmentation, Deep Learning, DevOps, AWS Knowledge

Expertise
Proficiency in the TensorFlow library with RNN and CNN
Familiarity with pre-trained models like VGG-16, ResNet-50, and MobileNet
Knowledge of Spark Core, Spark SQL, Spark Streaming, Cassandra, and Kafka
Designing and architecting Hadoop applications
Experience with chatbot platforms (a bonus)

Responsibilities: The entire lifecycle of model development: data collection and preprocessing, model development, model training, model testing, model validation, deployment and maintenance, and collaboration and communication.

Qualifications
Bachelor's or Master's degree in a relevant field (AI, Data Science, Computer Science, etc.)
Minimum 2 years of experience developing and deploying AI-based software products
Strong programming skills in Python (and potentially C++ or Java)
Experience with machine learning libraries (TensorFlow, PyTorch, Keras, scikit-learn)
Experience with computer vision, natural language processing, or recommendation systems
Experience with cloud computing platforms (Google Cloud, AWS)
Problem-solving skills
Excellent communication and presentation skills
Experience with data infrastructure and tools (SQL, NoSQL, and big data platforms)
Teamwork skills

Join Us! If you are passionate about AI and want to contribute to groundbreaking projects in a dynamic startup environment, we encourage you to apply! Be part of our mission to drive technological advancement in India. Drop your CV: hr@aiunika.com
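As a rough illustration of the deep-learning portion of this role, here is a minimal TensorFlow/Keras sketch of a small CNN image classifier. The input size and class names (pothole / crack / good surface) are hypothetical, not Rasta.AI's actual model.

```python
# Minimal sketch of a CNN classifier in TensorFlow/Keras; input size and
# class names (e.g. "pothole", "crack", "good surface") are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 3  # hypothetical: pothole / crack / good surface

model = tf.keras.Sequential([
    layers.Input(shape=(224, 224, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown
```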
Posted 2 months ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: PySpark Developer
Locations: Hyderabad & Bangalore
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary: We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and working with modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
Proficiency in Python for scripting, automation, and building reusable components.
Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows.
Familiarity with the AWS ecosystem, especially S3 and related file system operations.
Strong understanding of Unix/Linux environments and shell scripting.
Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
Ability to handle CDC (Change Data Capture) operations on large datasets.
Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
Strong knowledge of data modeling, data validation, and writing unit test cases.
Exposure to real-time and batch integration with downstream/upstream systems.
Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
Experience in building or integrating APIs for data provisioning.
Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, SQL, AWS (S3), Airflow, Control-M, Hadoop, Hive, Cloudera, Hortonworks, Unix/Linux, shell scripting, CDC, ETL pipelines, ETL and reporting tools (Informatica, Tableau, Jasper, QlikView), data modeling, data validation, unit test cases, performance tuning, batch and real-time integration, API integration, AI/ML model development, Jupyter Notebook, Zeppelin, PyCharm, Agile methodologies, CI/CD, Jenkins, Git
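For context, a minimal Apache Airflow sketch of the kind of workflow orchestration described above, chaining a spark-submit step and a validation step. It assumes Airflow 2.4+; the DAG id, schedule, and script paths are hypothetical.

```python
# Sketch only: assumes Apache Airflow 2.4+; the DAG id, schedule, and the
# spark-submit command/paths are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",      # run nightly at 02:00
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "/opt/jobs/ingest_to_s3.py --run-date {{ ds }}"
        ),
    )

    validate = BashOperator(
        task_id="validate_counts",
        bash_command="python /opt/jobs/validate_counts.py --run-date {{ ds }}",
    )

    ingest >> validate  # validation runs only after ingestion succeeds
```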
Posted 2 months ago
4.0 years
5 - 10 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Software engineering is the application of engineering to the design, development, implementation, testing and maintenance of software in a systematic method. The roles in this function will cover all primary development activity across all technology functions that ensure we deliver code with high quality for our applications, products and services and to understand customer needs and to develop product roadmaps. These roles include, but are not limited to analysis, design, coding, engineering, testing, debugging, standards, methods, tools analysis, documentation, research and development, maintenance, new development, operations and delivery. With every role in the company, each position has a requirement for building quality into every output. This also includes evaluating new tools, new techniques and strategies; automation of common tasks; building common utilities to drive organizational efficiency with a passion around technology and solutions; and influence of thought and leadership on future capabilities and opportunities to apply technology in new and innovative ways.

Primary Responsibilities:
Basic, structured, standard approach to work
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
Graduate degree or equivalent experience
4+ years of experience in a development role
Hands-on experience in data validation and writing unit test cases
Hands-on project experience with IDEs such as Jupyter Notebook, Zeppelin, or PyCharm
Experience in integrating PySpark with downstream and upstream applications through a batch/real-time interface
Experience in QA / testing concepts and processes
Experience in the cloud; Azure experience is preferable for handling infrastructure scalability
Experience in server management (Kafka, Kubernetes, Docker, Tomcat, etc.)
Experience in fine-tuning processes and troubleshooting performance issues
Good knowledge of Hadoop, Hive, and Cloudera/Hortonworks Data Platform
Knowledge of Big Data and ETL
Knowledge of Linux
Familiarity with CI/CD pipeline tools (Jenkins, Bamboo, Maven/Gradle, SonarQube, Git) for the deployment process
Solid SQL skills, plus Java, Python, and NoSQL
Solid troubleshooting and debugging skills (security, monitoring, server load, networking)
Understand and have operating experience with Agile delivery methodologies
Proven ability to write automation scripts using bash and Python
Proven ability to write synchronous and asynchronous APIs in Java
Proven ability to build and run SQL queries to extract data and compare it across multiple reports for consistency
Proven ability in managing ServiceNow tickets assigned to the team
Proven ability in maintaining and managing Linux servers, including OS and application patching and upgrades
Proven analytical skills

Preferred Qualification: Healthcare domain knowledge

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
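As an illustration of the data-validation and unit-testing skills listed above, here is a minimal pytest sketch using a local SparkSession; the schema and validation rules are hypothetical.

```python
# Minimal sketch of a data-validation unit test with pytest and a local
# SparkSession; the schema and quality rules are hypothetical.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    session = SparkSession.builder.master("local[2]").appName("tests").getOrCreate()
    yield session
    session.stop()

def validate_members(df):
    """Return rows that violate basic quality rules (null id, non-positive age)."""
    return df.filter(df.member_id.isNull() | (df.age <= 0))

def test_validate_members_flags_bad_rows(spark):
    rows = [(1, 34), (None, 29), (3, -5)]
    df = spark.createDataFrame(rows, ["member_id", "age"])
    bad = validate_members(df).collect()
    assert len(bad) == 2
```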
Posted 2 months ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Software engineering is the application of engineering to the design, development, implementation, testing and maintenance of software in a systematic method. The roles in this function will cover all primary development activity across all technology functions that ensure we deliver code with high quality for our applications, products and services and to understand customer needs and to develop product roadmaps. These roles include, but are not limited to analysis, design, coding, engineering, testing, debugging, standards, methods, tools analysis, documentation, research and development, maintenance, new development, operations and delivery. With every role in the company, each position has a requirement for building quality into every output. This also includes evaluating new tools, new techniques and strategies; automation of common tasks; building common utilities to drive organizational efficiency with a passion around technology and solutions; and influence of thought and leadership on future capabilities and opportunities to apply technology in new and innovative ways.

Primary Responsibilities:
Basic, structured, standard approach to work
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
Graduate degree or equivalent experience
4+ years of experience in a development role
Hands-on experience in data validation and writing unit test cases
Hands-on project experience with IDEs such as Jupyter Notebook, Zeppelin, or PyCharm
Experience in integrating PySpark with downstream and upstream applications through a batch/real-time interface
Experience in QA / testing concepts and processes
Experience in the cloud; Azure experience is preferable for handling infrastructure scalability
Experience in server management (Kafka, Kubernetes, Docker, Tomcat, etc.)
Experience in fine-tuning processes and troubleshooting performance issues
Good knowledge of Hadoop, Hive, and Cloudera/Hortonworks Data Platform
Knowledge of Big Data and ETL
Knowledge of Linux
Familiarity with CI/CD pipeline tools (Jenkins, Bamboo, Maven/Gradle, SonarQube, Git) for the deployment process
Solid SQL skills, plus Java, Python, and NoSQL
Solid troubleshooting and debugging skills (security, monitoring, server load, networking)
Understand and have operating experience with Agile delivery methodologies
Proven ability to write automation scripts using bash and Python
Proven ability to write synchronous and asynchronous APIs in Java
Proven ability to build and run SQL queries to extract data and compare it across multiple reports for consistency
Proven ability in managing ServiceNow tickets assigned to the team
Proven ability in maintaining and managing Linux servers, including OS and application patching and upgrades
Proven analytical skills

Preferred Qualification: Healthcare domain knowledge

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 2 months ago
5.0 years
7 - 10 Lacs
Bengaluru
On-site
If you run toward knowledge and problem-solving, join us.

About NetApp: NetApp is the intelligent data infrastructure company, turning a world of disruption into opportunity for every customer. No matter the data type, workload or environment, we help our customers identify and realize new business possibilities. And it all starts with our people. If this sounds like something you want to be part of, NetApp is the place for you. You can help bring new ideas to life, approaching each challenge with fresh eyes. Of course, you won't be doing it alone. At NetApp, we're all about asking for help when we need it, collaborating with others, and partnering across the organization - and beyond.

Data Scientist
Bengaluru, India
Job category: Information Technology
Job ID: 130719-en_US

Job summary: We are looking for a talented Data Scientist to join our team. The ideal candidate will have a strong foundation in data analysis, statistical models, and machine learning algorithms. You will work closely with the team to solve complex problems and drive business decisions using data. This role requires strategic thinking, problem-solving skills, and a passion for data.

Job responsibilities
Analyse large, complex datasets to extract insights and determine appropriate techniques to use.
Build predictive models, machine learning algorithms and conduct A/B tests to assess the effectiveness of models.
Present information using data visualization techniques.
Collaborate with different teams (e.g., product development, marketing) and stakeholders to understand business needs and devise possible solutions.
Stay updated with the latest technology trends in data science.
Develop and implement real-time machine learning models for various projects.
Engage with clients and consultants to gather and understand project requirements and expectations.
Write well-structured, detailed, and compute-efficient code in Python to facilitate data analysis and model development.
Utilize IDEs such as Jupyter Notebook, Spyder, and PyCharm for coding and model development.
Apply agile methodology in project execution, participating in sprints, stand-ups, and retrospectives to enhance team collaboration and efficiency.

Education: IC - Typically requires a minimum of 5 years of related experience. Mgr & Exec - Typically requires a minimum of 3 years of related experience.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better - but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations.
We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.

Submitting an application: To ensure a streamlined and fair hiring process for all candidates, our team only reviews applications submitted through our company website. This practice allows us to track, assess, and respond to applicants efficiently. Emailing our employees, recruiters, or Human Resources personnel directly will not influence your application.
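For context, a minimal scikit-learn sketch of the predictive-modelling workflow described in this listing, using a built-in dataset as a stand-in for real project data.

```python
# Minimal sketch of a predictive-modelling workflow; the built-in
# breast-cancer dataset stands in for real project data.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale features, then fit a simple baseline classifier
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on the hold-out split
probs = model.predict_proba(X_test)[:, 1]
print(f"Hold-out ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```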
Posted 2 months ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: We are looking for a talented Data Scientist to join our team. The ideal candidate will have a strong foundation in data analysis, statistical models, and machine learning algorithms. You will work closely with the team to solve complex problems and drive business decisions using data. This role requires strategic thinking, problem-solving skills, and a passion for data.

Job Responsibilities
Analyse large, complex datasets to extract insights and determine appropriate techniques to use.
Build predictive models, machine learning algorithms and conduct A/B tests to assess the effectiveness of models.
Present information using data visualization techniques.
Collaborate with different teams (e.g., product development, marketing) and stakeholders to understand business needs and devise possible solutions.
Stay updated with the latest technology trends in data science.
Develop and implement real-time machine learning models for various projects.
Engage with clients and consultants to gather and understand project requirements and expectations.
Write well-structured, detailed, and compute-efficient code in Python to facilitate data analysis and model development.
Utilize IDEs such as Jupyter Notebook, Spyder, and PyCharm for coding and model development.
Apply agile methodology in project execution, participating in sprints, stand-ups, and retrospectives to enhance team collaboration and efficiency.

Education: IC - Typically requires a minimum of 5 years of related experience. Mgr & Exec - Typically requires a minimum of 3 years of related experience.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better - but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.
Posted 2 months ago
5.0 - 10.0 years
6 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Core skills: Python programming, PyTest/PyCharm, APIs, and framework development experience.
API integration experience with PyCharm/PyTest.
Experience in working on and developing test automation frameworks, working with the team on framework deployment, writing unit test cases, and giving demos.
Develop and maintain an automation framework using Python and Pytest.
Very good experience with tools/libraries like GitHub, DevOps, Kubernetes, etc.
Knowledge of HIL, Typhoon HIL, dSPACE (optional).
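As a rough illustration of a Python/pytest automation-framework building block, here is a minimal sketch of a parametrized API check; the base URL and endpoints are hypothetical placeholders, not a real service.

```python
# Sketch of a pytest-based API check; the base URL and endpoints are
# hypothetical placeholders, not a real service.
import pytest
import requests

BASE_URL = "https://api.example.internal"  # hypothetical

@pytest.fixture(scope="session")
def http():
    # One shared HTTP session for the whole test run
    session = requests.Session()
    session.headers.update({"Accept": "application/json"})
    yield session
    session.close()

@pytest.mark.parametrize("endpoint", ["/health", "/version"])
def test_endpoint_is_healthy(http, endpoint):
    response = http.get(f"{BASE_URL}{endpoint}", timeout=10)
    assert response.status_code == 200
    assert response.headers["Content-Type"].startswith("application/json")
```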
Posted 2 months ago
5.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
Develop and customize Odoo modules as per business requirements.
Implement and integrate Odoo with third-party applications.
Maintain and customize existing Odoo modules.
Create and customize reports.
Troubleshoot and resolve issues related to Odoo modules and integrations.
Set up, maintain, and monitor Odoo servers.
Test new functions/modifications to existing application modules in accordance with application support.

Requirements
At least 5-8 years of experience in Odoo development.
Strong knowledge of the Python programming language.
Experience with team handling.
Strong understanding of Odoo architecture and framework.
Experience in Python, JavaScript, XML, and SQL.
Experience in using the VSCode and PyCharm IDEs.
Should have Python experience as well as a solid understanding of Object-Oriented Design and programming.
Experience with APIs (REST APIs and SOAP APIs) and integration with Odoo applications.
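For context, a minimal sketch of integrating an external Python script with Odoo through its XML-RPC external API; the instance URL, database name, and credentials are hypothetical placeholders.

```python
# Sketch of calling Odoo's external XML-RPC API from Python; the URL,
# database name, and credentials below are hypothetical placeholders.
import xmlrpc.client

URL = "https://odoo.example.com"   # hypothetical instance
DB, USER, PASSWORD = "mycompany", "integration@example.com", "secret"

# Authenticate against the common endpoint to obtain a user id
common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PASSWORD, {})

# Query a model through the object endpoint
models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")
partners = models.execute_kw(
    DB, uid, PASSWORD,
    "res.partner", "search_read",
    [[["is_company", "=", True]]],
    {"fields": ["name", "email"], "limit": 5},
)
for partner in partners:
    print(partner["name"], partner.get("email"))
```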
Posted 2 months ago
3.0 years
1 - 2 Lacs
Chennai
On-site
Job Title: Consultant - Data Marketplace
Career Level: C2

Introduction to role: We are recruiting for a Software Engineer for our Data Marketplace team. This is a full-time Grade D role based at our GITC site in Chennai. As part of AstraZeneca's Enterprise Data Marketplace (EDM) capability, we are seeking a Software Engineer to take ownership of the hands-on engineering of user interfaces, APIs, data models, and workflows using BPMN, Groovy script, and other tools. You will be accountable for the clarity and accuracy of diagrams, excellence in code and test-driven development, and collaboration with the Business Analyst, Project Manager, Scrum Master, and Tech Lead (Chennai-based principal engineer) to ensure effective day-to-day execution of a metrics-driven SDLC.

Accountabilities: You will need a collaborative delivery approach to be successful. We use Scaled Agile with standard-length program increments (12-14 weeks). You will provide technical leadership throughout our software development lifecycle, from the initial development of a technical design based on a blueprint, right through to hypercare. Do you have a real passion for delivering well-engineered data and analytics solutions that can help improve patients' lives? If you do, this will make you stand out from other applicants.

Essential Skills/Experience
Deep understanding and practical knowledge of IDEs (e.g., Eclipse, PyCharm) or any workflow designer.
Experience in one or more of the following languages: Java, JavaScript, Groovy, Python.
Deep understanding and hands-on experience of CI/CD processes and tooling, e.g., GitHub.
Proven ability in converting a proposed design into an automated solution.
Prior work experience in converting a business workflow into an automated set of actions.
Proven knowledge in scripting and a willingness to learn new languages.
Hands-on experience with database concepts, a fair idea of how data is stored in the backend, and how to connect it to the UI.
Fantastic written and spoken English, interpersonal skills, and a collaborative approach to delivery.
An enthusiasm for great documentation (e.g., high-level designs, low-level designs, coding standards, knowledge base articles).

Desirable Skills/Experience
Engineering degree in IT/Computer Science with a minimum of 3 years of experience.
Knowledge and experience of the Collibra Data Governance platform.
Any exposure to AI models and AI governance is an added advantage.
Knowledge and experience of MuleSoft and SnapLogic.
Excellent Jira skills, including the ability to rapidly generate JQL on the fly and save JQL queries/filters/views for publishing to fellow engineers and senior stakeholders.
Creation of documentation in Confluence.
Experience of Agile practices, preferably having been part of an Agile team for several years.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, our work has a direct impact on patients by transforming our ability to develop life-changing medicines. We empower the business to perform at its peak by combining cutting-edge science with leading digital technology platforms and data.
Join us at a crucial stage of our journey in becoming a digital and data-led enterprise. Here you can innovate, take ownership, and explore new solutions in a dynamic environment that encourages lifelong learning and growth. Ready to make an impact? Apply now!
Posted 2 months ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: We are looking for a talented Data Scientist to join our team. The ideal candidate will have a strong foundation in data analysis, statistical models, and machine learning algorithms. You will work closely with the team to solve complex problems and drive business decisions using data. This role requires strategic thinking, problem-solving skills, and a passion for data.

Job Responsibilities
Analyse large, complex datasets to extract insights and determine appropriate techniques to use.
Build predictive models, machine learning algorithms and conduct A/B tests to assess the effectiveness of models.
Present information using data visualization techniques.
Collaborate with different teams (e.g., product development, marketing) and stakeholders to understand business needs and devise possible solutions.
Stay updated with the latest technology trends in data science.
Develop and implement real-time machine learning models for various projects.
Engage with clients and consultants to gather and understand project requirements and expectations.
Write well-structured, detailed, and compute-efficient code in Python to facilitate data analysis and model development.
Utilize IDEs such as Jupyter Notebook, Spyder, and PyCharm for coding and model development.
Apply agile methodology in project execution, participating in sprints, stand-ups, and retrospectives to enhance team collaboration and efficiency.

Education: IC - Typically requires a minimum of 5 years of related experience. Mgr & Exec - Typically requires a minimum of 3 years of related experience.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better - but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.

Submitting an application: To ensure a streamlined and fair hiring process for all candidates, our team only reviews applications submitted through our company website. This practice allows us to track, assess, and respond to applicants efficiently. Emailing our employees, recruiters, or Human Resources personnel directly will not influence your application.
Posted 2 months ago
1.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Job Title: Human Capital Management (HCM) Strategy - Digital Strategy & Automation (DSA) Analyst / Senior Analyst Reports To: Head of Digital Strategy & Automation of Human Capital Management (HCM) Strategy Summary: We are seeking a motivated and experienced individual to join the HCM Strategy team as an Analyst / Senior Analyst in Digital Strategy & Automation. The HCM Strategy team manages transformational initiatives – enhancing our employee experience, driving quantifiable automation benefits, and promoting resiliency – to serve our stakeholders within HCM and across the firm. Your role as an Analyst / Senior Analyst within this team requires a blend of strategic thinking, technical expertise, and analytical abilities to support business intelligence, automation, and AI initiatives across all of Human Capital Management. This role is pivotal in driving digital transformation and enhancing operational efficiency within Human Capital Management. The ideal candidate will have 1-3 years of experience in business intelligence, automation, data analytics, and AI. They will demonstrate an ability to contribute to impactful solutions and support organizational change through digital strategy and task automation. Responsibilities Business Intelligence and Automation: Assist the development and deployment of business intelligence applications, ensuring alignment with strategic business objectives. Help synthesize complex analysis results into actionable insights and recommendations, influencing strategic business decisions. Identify, analyze, and resolve complex systems and algorithm performance trends or issues, developing mitigation strategies. AI And Data Science Support the development and implementation of AI-driven solutions to enhance business processes and decision-making. Utilize data science methodologies to analyze large datasets and generate predictive models. Collaborate with data scientists and engineers to integrate AI solutions into existing systems. Stay updated on emerging AI and data science trends and technologies, incorporating best practices. Project Management Assist in project planning, execution, and reporting, ensuring adherence to the project lifecycle. Manage risks and dependencies proactively, ensuring successful adoption of automation products. Support and guide other solution experts and advisors, fostering a collaborative environment and promoting knowledge sharing. Innovation And Strategy Contribute to the incubation of new low-code applications, identifying opportunities for innovation and driving adoption. Assist in complex and exploratory data analysis initiatives, ensuring adherence to best practices. Collaborate with Engineering to ensure automation solutions align with the firm's technology architecture strategy. Stakeholder Engagement Support stakeholder engagements, identifying and cultivating new low-code opportunities. Actively seek out and evaluate information and opportunities from internal and external sources, incorporating best practices. Qualifications Basic Qualifications: Bachelor’s degree or equivalent in Science, Technology, Engineering, or Mathematics. 2-5 years of experience in business intelligence, automation, and data analytics. Proficiency in digital strategy, business intelligence, automation, and artificial intelligence. Relevant experience in Consumer, Financial, Social Media, Tech, or FinTech sectors. Strong problem-solving and analytical skills. Excellent written and verbal communication skills. 
Ability to work independently and as part of a team. Knowledge of data-related emerging trends and issues, including financial regulation.

Preferred Qualifications
Solution Delivery
Experience implementing according to solution delivery frameworks such as Agile, Six Sigma, Waterfall, etc.
Able to contextualize analysis in Confluence, JIRA, MS applications, etc.
Business Intelligence
Working knowledge of analytics applications (e.g., Alteryx, Tableau, Qlik, Power BI)
Working knowledge of workflow applications (e.g., MS Power Platform, Appian, Unqork, ServiceNow)
Working knowledge of database tools (e.g., MongoDB, Snowflake, Elastic, MS SQL)
Artificial Intelligence
Working knowledge of artificial intelligence programming languages (e.g., Python, R)
Working knowledge of artificial intelligence computational packages (e.g., PyCharm, scikit-learn)
Working knowledge of artificial intelligence platforms including robotics (e.g., Automation Anywhere, Anaconda, GitHub/GitLab, JupyterHub, UiPath)

About Goldman Sachs: At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html

© The Goldman Sachs Group, Inc., 2023. All rights reserved. Goldman Sachs is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, national origin, age, veterans status, disability, or any other characteristic protected by applicable law.
Posted 2 months ago
5.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Job Title: Human Capital Management (HCM) Strategy - Digital Strategy & Automation (DSA) Associate / Vice President
Reports To: Head of Digital Strategy & Automation of Human Capital Management (HCM) Strategy

Summary: We are seeking a motivated and experienced individual to join the HCM Strategy team as a Vice President in Digital Strategy & Automation. The HCM Strategy team manages transformational initiatives – enhancing our employee experience, driving quantifiable automation benefits, and promoting resiliency – to serve our stakeholders within HCM and across the firm. Your role as a Vice President within this team requires a blend of strategic thinking, technical expertise, analytical abilities, and exceptional leadership qualities to drive business intelligence, automation, and AI initiatives across all functions of Human Capital Management. This role is pivotal in driving digital transformation and enhancing operational efficiency within Human Capital Management. The ideal candidate will have 5-7 years of experience in business intelligence, automation, data analytics, and AI. They will demonstrate a proven record of driving impactful solutions and supporting organizational change through digital strategy and task automation.

Responsibilities
Business Intelligence and Automation: Lead the development and deployment of business intelligence applications, ensuring alignment with strategic business objectives. Synthesize complex analysis results into actionable insights and recommendations, influencing strategic business decisions. Proactively identify, analyze, and resolve complex systems and algorithm performance trends or issues, developing mitigation strategies.
AI and Data Science: Develop and implement AI-driven solutions to enhance business processes and decision-making. Utilize data science methodologies to analyze large datasets and generate predictive models. Collaborate with data scientists and engineers to integrate AI solutions into existing systems. Stay updated on emerging AI and data science trends and technologies, incorporating best practices.
Project Management: Oversee project planning, execution, and reporting, ensuring adherence to the project lifecycle. Manage risks and dependencies proactively, ensuring successful adoption of automation products. Mentor and guide other solution experts and advisors, fostering a collaborative environment and promoting knowledge sharing.
Innovation and Strategy: Champion the incubation of new low-code applications, identifying opportunities for innovation and driving adoption. Lead complex data analysis and exploratory data analysis initiatives, ensuring adherence to best practices. Collaborate strategically with Engineering to ensure automation solutions align with the firm's technology architecture strategy.
Stakeholder Engagement: Lead and manage stakeholder engagements, identifying and cultivating new low-code opportunities. Actively seek out and evaluate information and opportunities from internal and external sources, incorporating best practices.

Qualifications
Basic Qualifications:
Bachelor's degree or equivalent in Science, Technology, Engineering, or Mathematics.
5-7 years of experience in business intelligence, automation, and data analytics.
Proficiency in digital strategy, business intelligence, automation, and artificial intelligence methodologies.
Relevant experience in Consumer, Financial, Social Media, Tech, or FinTech sectors.
Strong problem-solving and analytical skills.
Excellent written and verbal communication skills. Ability to work independently and as part of a team. Knowledge of data-related emerging trends and issues, including financial regulation.

Preferred Qualifications
Solution Delivery
Experience implementing according to solution delivery frameworks such as Agile, Six Sigma, Waterfall, etc.
Able to contextualize analysis in Confluence, JIRA, MS applications, etc.
Business Intelligence
Working knowledge of analytics applications (e.g., Alteryx, Tableau, Qlik, Power BI)
Working knowledge of workflow applications (e.g., MS Power Platform, Appian, Unqork, ServiceNow)
Working knowledge of database tools (e.g., MongoDB, Snowflake, Elastic, MS SQL)
Artificial Intelligence
Working knowledge of artificial intelligence programming languages (e.g., Python, R)
Working knowledge of artificial intelligence computational packages (e.g., PyCharm, scikit-learn)
Working knowledge of artificial intelligence platforms including robotics (e.g., Automation Anywhere, Anaconda, GitHub/GitLab, JupyterHub, UiPath)
Posted 2 months ago
2.0 - 7.0 years
4 - 6 Lacs
Chennai
Work from Office
Greetings! We are looking for a Python Developer in Chennai.

Please note:
This is a WFO (work from office) profile in Chennai.
All interviews are conducted in walk-in mode only, in the respective cities.
A minimum of 2 years of full-time experience after graduation is required.
Trainee and internship experience is not considered.

Job Description
Strong proficiency in the Python programming language (version 3+).
Experience with at least 2 Python frameworks (Django/Flask), knowledge of OOP programming, and good coding skills.
Knowledge of at least 1 Python unit testing framework.
Basic SQL knowledge.

Working Days: 5
Salary: up to 6 LPA

Thanks & Regards,
HR TEAM
KVC CONSULTANTS LTD
NO PLACEMENT CHARGES
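As a small illustration of the framework and unit-testing skills mentioned above, here is a minimal Flask sketch with a test that uses Flask's built-in test client; the route and payload are illustrative only.

```python
# Minimal Flask sketch with a unit test via the built-in test client;
# the route and payload are illustrative only.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/ping")
def ping():
    # Simple health-check style endpoint
    return jsonify(status="ok")

def test_ping():
    # Exercise the route without starting a real server
    client = app.test_client()
    response = client.get("/ping")
    assert response.status_code == 200
    assert response.get_json() == {"status": "ok"}

if __name__ == "__main__":
    app.run(debug=True)
```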
Posted 2 months ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About Position: We are seeking a skilled QA Automation Engineer to join our Mac team, working with a remote team in Israel.

Role: QA Automation Engineer - Mac
Location: Pune
Experience: 5-12+ years
Job Type: Full Time Employment

What You'll Do:
Work with the Mac product.
End-to-end feature testing.
Writing and analyzing automation tests in Python.
In-depth knowledge of all types of testing.
Ability to create a testing plan and run TDR meetings.
Working with Jira, Confluence, Jenkins, Postman, and PyCharm tools.
Perform version releases based on demand.
Conduct nightly automation analysis.
Participate in development cycles to ensure the delivery of high-quality products.
Work effectively to deliver on planned and committed schedules.

Expertise You'll Bring:
Bachelor's degree in Computer Science or a related field.
Deep experience with the Mac operating system.
Proven experience as a QA Automation Engineer in a Cloud/SaaS-based product environment.
Proficiency in writing and analyzing automation tests using Python and Pytest.
5+ years of experience in QA software testing - a must.
2+ years of hands-on experience in writing and maintaining automation tests.
Ability to multitask and work in a fast-paced environment.
Ability to read logs and identify critical information.
Team player with the ability to work independently and collaboratively.
Creative and user-oriented, with a true passion for quality.
Strong analytical skills for troubleshooting and problem-solving.
Experience with AWS (advantage).
Experience working in an agile methodology (advantage).
Excellent written and verbal communication skills in English.
A team player with excellent communication skills.
A self-learner and independent executor.
Competitive and results-oriented.
Transparent in communication and actions.

Benefits:
Competitive salary and benefits package.
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
Opportunity to work with cutting-edge technologies.
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
Annual health check-ups.
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
Our company fosters a values-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry’s best Let’s unleash your full potential at Persistent “Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.” Show more Show less
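Since the QA role above centers on writing and analyzing automation tests in Python with Pytest and on reading logs for critical information, here is a minimal, hedged sketch of that kind of test; the log format, paths, and helper names are hypothetical, not taken from the posting.

```python
# Illustrative pytest sketch: scan an application log for critical entries.
# All paths and names are hypothetical; adapt to the real product under test.
import re
from pathlib import Path

import pytest

CRITICAL_PATTERN = re.compile(r"\b(ERROR|FATAL)\b")

def extract_critical_lines(log_text: str) -> list[str]:
    """Return log lines that contain ERROR or FATAL markers."""
    return [line for line in log_text.splitlines() if CRITICAL_PATTERN.search(line)]

@pytest.fixture
def sample_log(tmp_path: Path) -> Path:
    # In a real nightly run this would be the product's log, not a fixture.
    log = tmp_path / "app.log"
    log.write_text(
        "2024-01-01 INFO service started\n"
        "2024-01-01 ERROR failed to mount volume\n"
        "2024-01-01 INFO retry succeeded\n"
    )
    return log

def test_log_contains_no_unexpected_criticals(sample_log: Path):
    critical = extract_critical_lines(sample_log.read_text())
    # The sample deliberately contains exactly one known ERROR line.
    assert len(critical) == 1
    assert "failed to mount volume" in critical[0]
```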
Posted 2 months ago
4.0 - 9.0 years
12 - 15 Lacs
Bengaluru
Work from Office
4+ years in the storage domain; automation experience is a must. Experience in Perl and Python. Core experience in SAN, RAID & protocols (FC, iSCSI, NVMe). Experience in manual, unit, automation, and regression testing. Experience in disaster recovery solutions such as MetroCluster configuration, SMBC, SMAS, Async CG, SVM DR. Linux, test case design & test case execution.
Posted 2 months ago
5.0 - 7.0 years
8 - 10 Lacs
Chennai, Bengaluru
Work from Office
We are looking for an experienced Software Developer with 5+ years of expertise in Java or Python for data processing and automation. The ideal candidate should have strong proficiency in Java, Spring, Microservices, and REST API development. Experience with cloud platforms, specifically Google Cloud Platform (GCP), is required. The candidate must have hands-on experience in creating IDE plugins for PyCharm, VS Code, IntelliJ, and web consoles. Additionally, proficiency in Python and data pipeline development is mandatory. The role demands a deep understanding of software development principles, automation, and scalable system design.
Posted 2 months ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Emerging Technologies Management Level Senior Associate Job Description & Summary At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In emerging technology at PwC, you will focus on exploring and implementing cutting-edge technologies to drive innovation and transformation for clients. You will work in areas such as artificial intelligence, blockchain, and the internet of things (IoT). Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career in our New Technologies practice, within Application and Emerging Technology services, will provide you with a unique opportunity to help our clients identify and prioritize emerging technologies that can help solve their business problems. We help clients design approaches to integrate new technologies, skills, and processes so they can drive business results and innovation. Our team helps organizations to embrace emerging technologies to remain competitive and improve their business by solving complex questions. Our team focuses on identifying and prioritizing emerging technologies, breaking into new markets, and preparing clients to get the most out of their emerging technology investments. 
Responsibilities: Creation and implementation of enterprise-grade architectures for deploying on-premises ML/LLM models with scalability and efficiency. Working with the SEO team on SEO page generation using LLMs, reducing AI-detection scores and humanizing the AI-generated content. Fine-tuning LLM models and implementing RAG. Interacting with multiple teams to understand new requirements, building PoCs for newly proposed requirements, and proposing solutions for optimization, efficiency, and cost savings. Creation of web pages (blogs) using regularly updated LLMs, applying NLP techniques to reduce AI scores and improve SEO rankings. Implementation of quality-assurance checks for the content generated by LLMs. Implementation of LangChain for requirement-specific use cases. Setting up automated ML pipelines on Azure ML Studio for scheduled training of ML models, with CI/CD setups for automating deployment of trained-model pickle/joblib dumps to the corresponding infrastructure (K8s, Linux servers) using tools like Jenkins and version control tools like Git (see the joblib sketch after this posting). Experience working with frontend teams on image optimization to improve page rendering time and efficiency. Writing shell scripts to monitor the efficiency of LLM model outputs and to estimate the resources required for scalability. Mandatory Skill Sets: Python and open-source technologies (PySpark, PyTorch, TensorFlow, LangChain, Transformers); Azure data services (Azure Synapse Analytics, Databricks, SQL Databases, Azure Machine Learning, Azure Cognitive Services, etc.). Preferred Skill Sets: Proficiency in Python and Azure/AWS DevOps & data services. Proficiency in AI/ML model operations, with experience fine-tuning models and implementing RAG. Experience in creating and implementing enterprise-grade architectures for the use of Azure AI Services, ensuring security, efficiency, and scalability. Experience in building components for data scientists and assisting them in all stages of model development and deployment.
Experience in Azure ML Studio for setting up pipelines for model training and endpoint creation. Experience with IDE/notebook software (Jupyter, VS Code, PyCharm, etc.). Experience in data extraction and analysis using Google Analytics, Azure Synapse and BigQuery. Proficiency in Looker Studio for dynamic dashboard creation and funnel-drop analysis. Writing Dockerfiles, shell scripts and Kubernetes manifests for deployment of customized applications on servers and Kubernetes clusters. Years Of Experience Required 3 to 8 years Education Qualification BE/B.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Python (Programming Language) Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Artificial Intelligence, Business Planning and Simulation (BW-BPS), Communication, Competitive Advantage, Conducting Research, Creativity, Digital Transformation, Embracing Change, Emotional Regulation, Empathy, Implementing Technology, Inclusion, Innovation Processes, Intellectual Curiosity, Internet of Things (IoT), Learning Agility, Optimism, Product Development, Product Testing, Prototyping, Quality Assurance Process Management {+ 9 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date Show more Show less
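The responsibilities above mention CI/CD deployment of trained-model pickle/joblib dumps; the following is a minimal, hedged sketch of producing such an artifact with scikit-learn and joblib. The dataset, model choice, and file path are assumptions for illustration, not details from the posting.

```python
# Illustrative sketch: train a small model and persist it as a joblib dump,
# the kind of artifact a CI/CD pipeline might pick up and deploy.
# Dataset, model choice, and file path are assumptions for illustration.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")

# Persist the trained estimator; a pipeline (e.g. Jenkins) could then ship
# model.joblib to the serving environment (K8s, a Linux server, etc.).
joblib.dump(model, "model.joblib")

# Later, the serving side would reload it with:
restored = joblib.load("model.joblib")
assert restored.predict(X_test[:1]).shape == (1,)
```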
Posted 2 months ago
5.0 years
0 Lacs
Indore, Madhya Pradesh, India
Remote
Skills: Python, FastAPI, Docker, OpenAI API, 11 Labs API, Machine Learning, Git, RESTful API Development, Company Overview Our company is at the forefront of innovative technology solutions, specializing in advanced API integrations and cutting-edge AI models. We pride ourselves on fostering a collaborative work environment that encourages creativity and growth. Job Overview We are seeking an experienced Python developer with expertise in Fast API, ElevenLabs/11 Labs, and OpenAI or Gemini to join our team on a work-from-home, remote-working contract basis. The ideal candidate will have 5+ years of experience in designing, developing, and deploying scalable, secure, and high-performance APIs. Qualifications And Skills Python Experience: 5+ years of experience in Python development, including Python 3.x, Python frameworks (e.g., Flask, Django), and Python libraries (e.g., NumPy, Pandas). Fast API Experience: 2+ years of experience with Fast API, including API development, API documentation, and API testing. ElevenLabs/11 Labs Experience: 1+ year of experience with ElevenLabs/11 Labs voice and video technologies, including text-to-speech, speech-to-text, and video conferencing. OpenAI or Gemini Experience: 1+ year of experience with OpenAI or Gemini AI models, including natural language processing, machine learning, and computer vision. Problem-Solving: Strong problem-solving skills, with the ability to analyze complex technical issues and identify creative solutions. Roles And Responsibilities Fast API Development: Design, develop, and deploy Fast API applications, including API endpoints, data models, and API documentation. ElevenLabs/11 Labs Integration: Integrate ElevenLabs/11 Labs voice and video technologies into Fast API applications, including text-to-speech, speech-to-text, and video conferencing. OpenAI or Gemini Integration: Integrate OpenAI or Gemini AI models into Fast API applications, including natural language processing, machine learning, and computer vision. Python Development: Develop scalable, secure, and high-performance Python applications, including data processing, data analysis, and data visualization. Database Integration: Integrate Fast API applications with databases, including relational databases (e.g., MySQL, PostgreSQL), NoSQL databases (e.g., MongoDB, Cassandra), and cloud-based databases (e.g., AWS Aurora, Google Cloud SQL). Testing and Debugging: Write unit tests and integration tests using Pytest, Unittest, or other testing frameworks, and debug Fast API applications using tools like pdb, ipdb, or PyCharm. Collaboration and Communication: Collaborate with cross-functional teams, including designers, project managers, and QA engineers. Show more Show less
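To illustrate the FastAPI work this posting describes (endpoints, data models, and an integration point for an external voice or AI service), here is a minimal, hedged sketch. The endpoint, model fields, and the stubbed synthesize_speech helper are hypothetical; a real integration would go through the vendor's SDK, whose exact calls are not shown here.

```python
# Minimal FastAPI sketch: one endpoint with a Pydantic data model and a
# stubbed integration point. The helper below is a placeholder -- a real
# ElevenLabs/OpenAI call would use the vendor SDK instead.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="speech-demo")

class SpeechRequest(BaseModel):
    text: str
    voice: str = "default"

class SpeechResponse(BaseModel):
    audio_id: str
    characters: int

def synthesize_speech(text: str, voice: str) -> str:
    # Placeholder: pretend we produced an audio asset and return its id.
    return f"audio-{abs(hash((text, voice))) % 10_000}"

@app.post("/speech", response_model=SpeechResponse)
def create_speech(req: SpeechRequest) -> SpeechResponse:
    audio_id = synthesize_speech(req.text, req.voice)
    return SpeechResponse(audio_id=audio_id, characters=len(req.text))

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```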
Posted 2 months ago
5.0 years
0 Lacs
Bansdih, Uttar Pradesh, India
Remote
Skills: Python, FastAPI, Docker, OpenAI API, 11 Labs API, Machine Learning, Git, RESTful API Development, Company Overview Our company is at the forefront of innovative technology solutions, specializing in advanced API integrations and cutting-edge AI models. We pride ourselves on fostering a collaborative work environment that encourages creativity and growth. Job Overview We are seeking an experienced Python developer with expertise in Fast API, ElevenLabs/11 Labs, and OpenAI or Gemini to join our team on a work-from-home, remote-working contract basis. The ideal candidate will have 5+ years of experience in designing, developing, and deploying scalable, secure, and high-performance APIs. Qualifications And Skills Python Experience: 5+ years of experience in Python development, including Python 3.x, Python frameworks (e.g., Flask, Django), and Python libraries (e.g., NumPy, Pandas). Fast API Experience: 2+ years of experience with Fast API, including API development, API documentation, and API testing. ElevenLabs/11 Labs Experience: 1+ year of experience with ElevenLabs/11 Labs voice and video technologies, including text-to-speech, speech-to-text, and video conferencing. OpenAI or Gemini Experience: 1+ year of experience with OpenAI or Gemini AI models, including natural language processing, machine learning, and computer vision. Problem-Solving: Strong problem-solving skills, with the ability to analyze complex technical issues and identify creative solutions. Roles And Responsibilities Fast API Development: Design, develop, and deploy Fast API applications, including API endpoints, data models, and API documentation. ElevenLabs/11 Labs Integration: Integrate ElevenLabs/11 Labs voice and video technologies into Fast API applications, including text-to-speech, speech-to-text, and video conferencing. OpenAI or Gemini Integration: Integrate OpenAI or Gemini AI models into Fast API applications, including natural language processing, machine learning, and computer vision. Python Development: Develop scalable, secure, and high-performance Python applications, including data processing, data analysis, and data visualization. Database Integration: Integrate Fast API applications with databases, including relational databases (e.g., MySQL, PostgreSQL), NoSQL databases (e.g., MongoDB, Cassandra), and cloud-based databases (e.g., AWS Aurora, Google Cloud SQL). Testing and Debugging: Write unit tests and integration tests using Pytest, Unittest, or other testing frameworks, and debug Fast API applications using tools like pdb, ipdb, or PyCharm. Collaboration and Communication: Collaborate with cross-functional teams, including designers, project managers, and QA engineers. Show more Show less
Posted 2 months ago
8.0 - 12.0 years
12 - 18 Lacs
Bengaluru
Work from Office
As a Data Architect, you are required to: Design & develop technical solutions which combine disparate information to create meaningful insights for business, using Big-data architectures Build and analyze large, structured and unstructured databases based on scalable cloud infrastructures Develop prototypes and proof of concepts using multiple data-sources and big-data technologies Process, manage, extract and cleanse data to apply Data Analytics in a meaningful way Design and develop scalable end-to-end data pipelines for batch and stream processing Regularly scan the Data Analytics landscape to stay up to date with latest technologies, techniques, tools and methods in this field Stay curious and enthusiastic about using related technologies to solve problems and enthuse others to see the benefit in business domain Qualification : Bachelor's or Master's in Computer Science & Engineering, or equivalent. Professional Degree in Data Engineering / Analytics is desirable. Experience level : Minimum 8 years in software development with at least 2 - 3 years hands-on experience in the area of Big-data / Data Engineering. Desired Knowledge & Experience: Data Engineer - Big Data Developer Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming Knowing Spark internals: Catalyst/Tungsten/Photon Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader IDE: IntelliJ/Pycharm, Git, Azure Devops, Github Copilot Test: pytest, Great Expectations CI/CD Yaml Azure Pipelines, Continuous Delivery, Acceptance Testing Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction Languages: Python/Functional Programming (FP) SQL: TSQL/Spark SQL/HiveQL Storage: Data Lake and Big Data Storage Design Additionally it is helpful to know basics of: Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow Languages: Scala, Java NoSQL: Cosmos, Mongo, Cassandra Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model SQL Server: TSQL, Stored Procedures Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka Data Catalog: Azure Purview, Apache Atlas, Informatica Big Data Architect Expert: in technologies, languages and methodologies mentioned in Data Engineer - Big Data Developer Mentor: mentors/educates Developers in technologies, languages and methodologies mentioned in Data Engineer - Big Data Developer Architecture Styles: Lakehouse, Lambda, Kappa, Delta, Data Lake, Data Mesh, Data Fabric, Data Warehouses (e.g. Data Vault) Application Architecture: Microservices, NoSql, Kubernetes, Cloud-native Experience: Many years of experience with all kinds of technology in the evolution of data platforms (Data Warehouse -> Hadoop -> Big Data -> Cloud -> Data Mesh) Certification: Architect certification (e.g. Siemens Certified Software Architect or iSAQB CPSA) Required Soft-skills & Other Capabilities: Excellent communication skills, in order to explain your work to people who don't understand the mechanics behind data analysis Great attention to detail and the ability to solve complex business problems Drive and the resilience to try new ideas, if the first ones don't work Good planning and organizational skills Collaborative approach to sharing ideas and finding solutions Ability to work independently and also in a global team environment.
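As a hedged illustration of the batch pipeline work this role describes (Spark DataFrames, Parquet, partitioning), here is a minimal PySpark sketch; the paths, column names, and partitioning key are assumptions for illustration only.

```python
# Minimal PySpark batch step: read Parquet, apply a transformation, and
# write partitioned output. Paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-batch-demo").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")  # hypothetical input path

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"))
)

# Partitioning by date keeps downstream reads selective (cf. the posting's
# notes on partitioning, distribution, and data skew).
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/daily_revenue"))

spark.stop()
```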
Posted 2 months ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Harri: Harri is the frontline employee experience platform built for companies that have service at the heart of their business. The solution is built on the notion that the customer experience will never exceed the employee experience. The Harri suite of talent attraction, workforce management and employee engagement technologies enables organizations to attract, manage, engage and retain the best talent for their business. Hospitality is in our DNA, with most of our global team having front line and management restaurant experience - we are changing the landscape of our industry and frontline worker technology. We need the very best and brightest to join us on this mission to disrupt the market as it stands today. Based in NYC, Harri has global offices in the UK, Palestine and India and has been awarded: Top 50 Startup by LinkedIn, Best Enterprise Solution for HR/Workforce by HR Tech Awards & NYC Best Tech Startup for the Tech in Motion Events Timmy Awards. If you're a builder or problem solver and love the fast pace of a startup, it's time to meet the Harri family. Who you are: We are seeking an experienced Lead Backend Engineer. In this role, you will lead and mentor a team of engineers, provide technical leadership, and collaborate effectively with cross-functional teams, including product managers, frontend engineers, and QA, to define, develop, and deploy new features and enhancements. Position description: The Lead Backend Engineer is responsible for designing, implementing, and maintaining scalable and efficient backend systems with a focus on performance, security, and reliability. The role requires expertise in Python, Django, databases (SQL & NoSQL), API development, and cloud services (AWS). Role and Responsibilities: Duties and responsibilities for a Lead Backend Engineer position in our India team include, but are not limited to: Write clean, modular, reusable, testable, and well-documented code that adheres to our coding standards and promotes maintainability. Write unit tests and perform integration testing to ensure code reliability, maintainability, and seamless interaction between components. Troubleshoot and debug complex production issues, identifying root causes and implementing effective solutions in a timely manner. Design, implement, and maintain scalable, efficient, and secure backend systems, with a focus on performance and reliability for our global user base. Maintain alignment with HARRI's global team(s) coding and design standards, ensuring consistency and interoperability. Demonstrate the ability to deliver high-quality work within agreed timelines. Proactively identify and implement optimizations to enhance system performance, ensuring high availability and responsiveness under varying loads. Architect and implement robust security structures and design efficient and scalable data storage solutions. Lead and mentor a team of engineers by providing technical guidance and support, including managing day-to-day activities, task assignments and deliverables. Take ownership of team deliverables, ensuring high quality and timely execution. Participate actively in team expansion efforts, including sourcing, screening, and interviewing potential engineer candidates. Contribute to the onboarding and orientation of new team members.
Provide regular progress updates to your line manager, highlighting achievements against established goals and key performance indicators (KPIs), challenges encountered, and potential roadblocks that may impact timelines or objectives. Collaborate effectively with cross-functional teams, including product managers, frontend engineers, and QA, to define, develop, and deploy new features and enhancements. Actively participate in knowledge sharing sessions, code reviews, and other team activities to foster a strong collaborative culture and contribute to the growth of team members. Stay current with relevant backend technologies, tools, and trends. Propose and drive the adoption of beneficial innovations, including AI. Qualifications: Bachelor's or Master's degree in Computer Science or a related field. Strong knowledge of relational databases and SQL, including: Proficiency in database design principles and best practices. Demonstrated ability to write and optimize complex SQL queries and stored procedures. Experience with NoSQL databases like DynamoDB or MongoDB is highly desirable. Experience in Python development Experience with Python web frameworks, specifically Django. Experience with Python Object-Relational Mappers (ORMs) such as Django ORM and SQLAlchemy. Excellent grasp of data structures, algorithms, and Object-Oriented Programming (OOP) principles. Proven experience in designing and implementing RESTful APIs, Graph, gRPC and Sockets. Proficient with Git for version control and collaborative development workflows especially GitHub. Hands-on experience with AI tools, IDEs, prompts, and protocols such as Cursor, Copilot agent mode with VSCode, Pycharm Copilot, MCP servers, and Copilot on Github.com Exceptional problem-solving and analytical abilities, with a proactive approach to identifying and resolving issues. Solid understanding of the Software Development Life Cycle (SDLC) and agile methodologies. Experience working effectively in Agile development environments (e.g., Scrum, Kanban), including SAFe. Familiarity with JIRA tracking and project management tools, including defect lifecycle management. Knowledge of shell scripting (e.g., bash) is a plus. Experience with Service-Oriented Architecture (SOA) and microservices architectural patterns and best practices is a significant plus. Practical experience working with Amazon Web Services (AWS) and its core services. Experience with Continuous Integration/Continuous Deployment (CI/CD) tools and pipelines. Demonstrated experience in leading and mentoring software engineers. Experience in the hiring process, including sourcing and interviewing candidates. Excellent verbal and written English communication skills, with the ability to articulate technical concepts clearly and concisely. Strong interpersonal and collaboration skills, with a proven ability to work effectively within a team. Skills Django ORM problem solving communication skills Python Django SQLAlchemy Show more Show less
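Given the ORM expectations listed above (Django ORM or SQLAlchemy), here is a minimal, hedged SQLAlchemy sketch against an in-memory SQLite database; the Employee model and its fields are invented for illustration and are not part of Harri's stack.

```python
# Minimal SQLAlchemy ORM sketch: declare a model, create the schema in an
# in-memory SQLite database, insert a row, and query it back.
# The Employee model and its fields are illustrative assumptions only.
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Employee(Base):
    __tablename__ = "employees"

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    role = Column(String, nullable=False)

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Employee(name="Asha", role="backend"))
    session.commit()

    stmt = select(Employee).where(Employee.role == "backend")
    for emp in session.scalars(stmt):
        print(emp.id, emp.name, emp.role)
```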
Posted 2 months ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a part of BCG X A&A team, you will work closely with consulting teams on a diverse range of advanced analytics topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domain) through providing analytics subject matter expertise, and accelerated execution support. You will collaborate with case teams to gather requirements, specify, design, develop, deliver and support analytic solutions serving client needs. You will provide technical support through deeper understanding of relevant data analytics solutions and processes to build high quality and efficient analytic solutions. YOU'RE GOOD AT Working with case (and proposal) teams Acquiring deep expertise in at least one analytics topic & understanding of all analytics capabilities Defining and explaining expected analytics outcome; defining approach selection Delivering original analysis and insights to BCG teams, typically owning all or part of an analytics module and integrating with case teams Establishing credibility by thought partnering with case teams on analytics topics; drawing conclusions on a range of external and internal issues related to their module Communicating analytical insights through sophisticated synthesis and packaging of results (including PowerPoint presentation, Documents, dashboard and charts) with consultants, collects, synthesizes, analyses case team learning & inputs into new best practices and methodologies Build collateral of documents for enhancing core capabilities and supporting reference for internal documents; sanitizing confidential documents and maintaining a repository Able to lead workstreams and modules independently or with minimal supervision Ability to support business development activities (proposals, vignettes etc.) and build sales collateral to generate leads Team requirements: Guides juniors on analytical methodologies and platforms, and helps in quality checks Contributes to team's content & IP development Imparts technical trainings to team members and consulting cohort Technical Skills: Strong proficiency in statistics (concepts & methodologies like hypothesis testing, sampling, etc.) and its application & interpretation Hands-on data mining and predictive modeling experience (Linear Regression, Clustering (K-means, DBSCAN, etc.), Classification (Logistic regression, Decision trees/Random Forest/Boosted Trees), Timeseries (SARIMAX/Prophet)etc. 
Strong experience in at least one of the prominent cloud providers (Azure, AWS, GCP) and working knowledge of AutoML solutions (SageMaker, Azure ML, etc.). At least one tool in each category: Programming language - Python (must have), (R or SAS or PySpark), SQL (must have); Data Visualization (Tableau, QlikView, Power BI, Streamlit); Data management (using Alteryx, MS Access, or any RDBMS); ML deployment tools (Airflow, MLflow, Luigi, Docker, etc.); Big data technologies (Hadoop ecosystem, Spark); Data warehouse solutions (Teradata, Azure SQL DW/Synapse, Redshift, BigQuery, etc.); Version control (Git/GitHub/GitLab); MS Office (Excel, PowerPoint, Word); Coding IDE (VS Code/PyCharm); GenAI tools (OpenAI, Google PaLM/BERT, Hugging Face, etc.). Functional Skills: Expertise in building analytical solutions and delivering tangible business value for clients (similar to the use cases below): price optimization, promotion effectiveness, product assortment optimization and sales force effectiveness, personalization/loyalty programs, labor optimization, CLM and revenue enhancement (segmentation, cross-sell/up-sell, next product to buy, offer recommendation, loyalty, LTV maximization and churn prevention). Communicating with confidence and ease: You will be a clear and confident communicator, able to deliver messages in a concise manner with strong and effective written and verbal communication. What You'll Bring: Bachelor's/Master's degree in a field linked to business analytics, statistics or economics, operations research, applied mathematics, computer science, engineering, or a related field required; advanced degree preferred. At least 2-4 years of relevant industry work experience providing analytics solutions in a commercial setting. Prior work experience in a global organization, preferably in a professional services organization, in a data analytics role. Demonstrated depth in one or more industries, including but not limited to Retail, CPG, Healthcare and Telco. #BCGXjob Who You'll Work With: Our data analytics and artificial intelligence professionals mix deep domain expertise with advanced analytical methods and techniques to develop innovative solutions that help our clients tackle their most pressing issues. We design algorithms and build complex models out of large amounts of data. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify. Show more Show less
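The technical skills above list clustering (K-means, DBSCAN) among the expected modeling techniques; a minimal, hedged scikit-learn sketch of that workflow, with synthetic data standing in for real client data, could look like this.

```python
# Illustrative K-means workflow on synthetic data: scale features, fit,
# and inspect cluster quality. Data and parameter choices are assumptions.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

X, _ = make_blobs(n_samples=500, centers=4, n_features=5, random_state=0)
X_scaled = StandardScaler().fit_transform(X)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(X_scaled)

# Silhouette score (closer to 1 is better) is one simple way to sanity-check
# the segmentation before packaging results for a case team.
print(f"silhouette: {silhouette_score(X_scaled, labels):.3f}")
print("cluster sizes:", [int((labels == k).sum()) for k in range(4)])
```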
Posted 2 months ago