1.0 years
0 - 0 Lacs
Ahmedabad, Gujarat
On-site
About Us: Red & White Education Pvt. Ltd., established in 2008, is Gujarat's top NSDC- and ISO-certified institute focused on skill-based education and global employability. Role Overview: We're hiring a full-time, on-site AI, Machine Learning, and Data Science Faculty/Trainer with strong communication skills and a passion for teaching. Key Responsibilities: Deliver high-quality lectures on AI, Machine Learning, and Data Science. Design and update course materials, assignments, and projects. Guide students on hands-on projects, real-world applications, and research work. Provide mentorship and support for student learning and career development. Stay updated with the latest trends and advancements in AI/ML and Data Science. Conduct assessments, evaluate student progress, and provide feedback. Participate in curriculum development and improvements. Skills & Tools: Core Skills: ML, Deep Learning, NLP, Computer Vision, Business Intelligence, AI Model Development, Business Analysis. Programming: Python, SQL (must), Pandas, NumPy, Excel. ML & AI Tools: Scikit-learn (must), XGBoost, LightGBM, TensorFlow, PyTorch (must), Keras, Hugging Face. Data Visualization: Tableau, Power BI (must), Matplotlib, Seaborn, Plotly. NLP & CV: Transformers, BERT, GPT, OpenCV, YOLO, Detectron2. Advanced AI: Transfer Learning, Generative AI, Business Case Studies. Education & Experience Requirements: Bachelor's/Master's/Ph.D. in Computer Science, AI, Data Science, or a related field. At least 1 year of teaching or industry experience in AI/ML and Data Science. Hands-on experience with Python, SQL, TensorFlow, PyTorch, and other AI/ML tools. Practical exposure to real-world AI applications, model deployment, and business analytics. For further information, contact us at 7862813693 or via email at career@rnwmultimedia.edu.in. Job Types: Full-time, Permanent. Pay: ₹30,000.00 - ₹35,000.00 per month. Benefits: Flexible schedule, Leave encashment, Paid sick time, Paid time off, Provident Fund. Schedule: Day shift. Supplemental Pay: Performance bonus, Yearly bonus. Work Location: In person.
Posted 1 month ago
0.0 - 2.0 years
6 - 8 Lacs
Jntu Kukat Pally, Hyderabad, Telangana
On-site
Python + Frontend Trainer (Full-Time) Location: Manjeera Trinity, JNTU, Hyderabad. Joining: Immediate joiners preferred. Company: Greatcoder Trainings LLP. About Us: Greatcoder Trainings LLP is a fast-growing tech training institute known for delivering real-time, project-based training and offering 100% placement support. We are currently hiring a passionate and skilled Python + Frontend Trainer to join our expert team. Key Responsibilities: Conduct in-person training sessions on Python and frontend technologies. Prepare and deliver course materials, assignments, and real-time projects. Resolve student queries and provide hands-on guidance. Track student performance and offer constructive feedback. Assist in mock interviews and assessments. Required Skills: Core Python: variables, data types, loops, functions, modules. Object-oriented programming in Python. File handling and exception handling. Python libraries: NumPy, Pandas, Matplotlib (preferred). Frameworks: Django or Flask (any one is a must). Basic understanding of REST APIs. Frontend: HTML, CSS, JavaScript. ReactJS: practical and teaching experience. Strong communication and presentation abilities. Preferred Qualifications: Prior experience in teaching/training (offline or online). Bachelor's degree in Computer Science or a related field (preferred but not mandatory). Benefits: Flexible work modes (full-time), Health insurance, Real-time project exposure, Friendly and growth-focused work culture, Competitive salary. Job Details: Job Type: Full-time. Pay: ₹6,00,000 – ₹8,00,000 per year. Schedule: Day shift. Languages: English (preferred), Telugu (preferred). Work Location: In person (Manjeera Trinity, JNTU, Hyderabad). Ability to commute/relocate: JNTU Kukatpally, Hyderabad, Telangana – reliably commute or plan to relocate before starting work (preferred). Experience: Python: 2 years (preferred). Apply Now: Send your resume to bhupesh@thegreatcoder.com. Contact: 9032190326.
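For context on the kind of material a trainer in this role might walk students through, here is a minimal Flask sketch of a REST endpoint (Flask is only one of the two acceptable frameworks named above). The route names and sample data are illustrative assumptions, not part of the job description.

```python
# Minimal Flask REST API sketch a trainer might use to introduce routing and JSON responses.
# The endpoint names and sample data are illustrative, not part of this posting.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory "database" used purely for demonstration
courses = [{"id": 1, "name": "Core Python"}, {"id": 2, "name": "ReactJS"}]

@app.route("/courses", methods=["GET"])
def list_courses():
    # Return all courses as JSON
    return jsonify(courses)

@app.route("/courses", methods=["POST"])
def add_course():
    # Append a new course taken from the JSON request body
    payload = request.get_json()
    course = {"id": len(courses) + 1, "name": payload["name"]}
    courses.append(course)
    return jsonify(course), 201

if __name__ == "__main__":
    app.run(debug=True)
```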
Posted 1 month ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position Overview In Scope of Position-based Promotions (INTERNAL only) Job Title: Capital & Liquidity Management Analyst Location: Mumbai, India Corporate Title: Analyst Role Description Group Capital Management plays a central role in the execution of DB's strategy. While Group Capital Management manages DB Group's solvency ratios (CET 1, T1, total capital ratio, leverage ratio, MREL/TLAC ratios, ECA ratio) together with business divisions and other infrastructure functions, EMEA Treasury manages, in addition to the solvency ratios of DB's EMEA entities, the liquidity ratios and Treasury Pool activities. Thereby, EMEA Treasury links into DB Group's strategy and manages execution at a local level. Treasury: Treasury at Deutsche Bank is responsible for the sourcing, management and optimization of liquidity and capital to deliver high-value risk management decisions. This is underpinned by a best-in-class, integrated and consistent Treasury risk framework, which enables Treasury to clearly identify the Bank's resource demands, transparently set incentives by allocating resource costs to businesses, and manage to evolving regulation. Treasury's fiduciary mandate, which encompasses the Bank's funding pools, asset and liability management (ALM) and fiduciary buffer management, supports businesses in delivering their strategic targets at global and local level. Further, Treasury manages the optimization of all financial resources through all lenses to implement the group's strategic objective and maximize long-term return on average tangible shareholders' equity (RoTE). The current role is part of the Treasury Office in DBC Mumbai. The role requires interactions with all key hubs, i.e. London, New York, Frankfurt and Singapore. What We'll Offer You: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leave. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for ages 35 and above. Your Key Responsibilities: The core deliverables for this role are: Write code and implement solutions based on specifications. Update, design and implement changes to existing software architecture. Build complex enhancements and resolve bugs. Build and execute unit tests and unit test plans. Implementation tasks are varied and complex, needing independent judgment. Build a technology solution that is sustainable, repeatable and agile. Align with the business and gain an understanding of different treasury functions. Your Skills And Experience: Must have strong development experience in Python and Oracle-based applications. Strong in algorithms, data structures and SQL. Some experience with integration/build/testing tools. Good to have: working knowledge of visualization libraries like Plotly, Matplotlib, Seaborn, etc.
Exposure to web service and web server/application server-based development would be an added advantage but is not mandatory. A basic understanding of balance sheet and Treasury concepts is desirable but not mandatory. Effective organizational and interpersonal skills. Self-starting – willingness to get things done. A highly motivated team player with a strong technical background and good communication skills. Urgency – prioritize based on the need of the hour. An aptitude to learn new tools and technologies. Engineering graduate / BS or MS degree, or equivalent experience relevant to the functional area. 3+ years of software engineering or related experience is a must. How We'll Support You: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About Us And Our Teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
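As a rough illustration of the "good to have" visualization skills listed above, a minimal pandas plus seaborn/matplotlib sketch plotting a ratio time series might look like the following. The ratio values, dates, and column names are synthetic assumptions invented for the example, not data from the role.

```python
# Minimal visualization sketch with pandas + seaborn/matplotlib.
# The ratio values and dates are synthetic and purely illustrative.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Hypothetical monthly solvency-ratio observations
df = pd.DataFrame({
    "month": pd.to_datetime([
        "2024-01-31", "2024-02-29", "2024-03-31",
        "2024-04-30", "2024-05-31", "2024-06-30",
    ]),
    "cet1_ratio": [13.2, 13.4, 13.1, 13.6, 13.8, 13.7],
})

sns.set_theme(style="whitegrid")
ax = sns.lineplot(data=df, x="month", y="cet1_ratio", marker="o")
ax.set_title("CET1 ratio over time (synthetic data)")
ax.set_ylabel("CET1 ratio (%)")
plt.tight_layout()
plt.savefig("cet1_ratio.png")  # write the chart to disk rather than displaying it
```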
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are currently seeking a Senior Python Developer to join our team for an exciting project that involves designing and building RESTful APIs for seamless communication between different components. In this role, you will be responsible for developing and maintaining microservices architecture using containerization tools such as Docker, AWS ECS, and ECR. Additionally, you will be required to demonstrate solutions to cross-functional teams and take ownership of the scope of work for successful project delivery. Responsibilities Develop and maintain microservices architecture using containerization tools such as Docker, AWS ECS, and ECR Design and build RESTful APIs for seamless communication between different components Present and organize demo sessions to demonstrate solutions to cross-functional teams Collaborate with cross-functional teams for successful project delivery Take ownership of the scope of work for successful project delivery Ensure consistency and scalability of applications and dependencies into containers Requirements 5-8 years of experience in software development using Python Proficient in AWS services such as Lambda, DynamoDB, CloudFormation, and IAM Strong experience in designing and building RESTful APIs Expertise in microservices architecture and containerization using Docker, AWS ECS, and ECR Ability to present and organize demo sessions to demonstrate solutions Excellent communication skills and ability to collaborate with cross-functional teams Strong sense of responsibility and ownership over the scope of work Nice to have Experience in DevOps tools such as Jenkins and GitLab for continuous integration and deployment Familiarity with NoSQL databases such as MongoDB and Cassandra Experience in data analysis and visualization using Python libraries such as Pandas and Matplotlib
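To illustrate the AWS skills this posting lists (Lambda alongside DynamoDB), here is a minimal, hedged sketch of a Lambda handler that stores an item in a DynamoDB table via boto3. The table name and payload fields are assumptions for the example, not details from the posting.

```python
# Minimal AWS Lambda handler sketch using boto3 to store an item in DynamoDB.
# "orders" (the table name) and the payload fields are hypothetical examples.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")

def lambda_handler(event, context):
    # Assumes an API Gateway proxy integration that passes the JSON body as a string
    body = json.loads(event.get("body", "{}"))
    item = {"order_id": body["order_id"], "status": body.get("status", "NEW")}
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "body": json.dumps({"stored": item}),
    }
```

In practice such a handler would sit behind a REST route and be packaged into a container image for deployment, in line with the Docker/ECS tooling named above.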
Posted 1 month ago
6.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Agentic AI Engineer | 6+ Years | Remote | Work Timings: 2:00 PM to 11PM Job Description: AI/ML Model Development & Deployment – Focused on marketing and operations use cases Deep Learning – CNNs, RNNs, Transformers, Attention Mechanisms Generative AI – Experience with OpenAI (GPT, DALL·E, Whisper) and Anthropic (Claude) Agentic AI Platforms – AutoGen, CrewAI, AWS Bedrock Multimodal AI – Building agents using text, voice, and other inputs for automation Python Programming – Proficient with NumPy, Pandas, Matplotlib, TensorFlow, PyTorch Traditional ML Techniques – Supervised/Unsupervised Learning, PCA, Feature Engineering, Model Evaluation, Hyperparameter Tuning Data Analytics – Predictive analysis, clustering, A/B testing, KPI monitoring MLOps & CI/CD – Model versioning, deployment pipelines, monitoring Cloud Services – AWS (S3, Lambda, EC2, SageMaker, Bedrock), Serverless architectures Development Tools – Proficient in using Cursor for design, development, and code reviews Communication & Collaboration – Strong communication skills and client engagement experience Domain Expertise – Marketing-focused AI solutions (big plus)
Posted 1 month ago
1.0 - 4.0 years
6 - 11 Lacs
Hyderabad
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. We are seeking a Deep Learning Engineer with at least 3 years of experience to work on building state-of-the-art deep learning models for applications like computer vision, NLP, and recommendation systems. Key Responsibilities: Develop and train deep learning models for various use cases. Optimize model performance and ensure scalability. Collaborate with the data science and engineering teams to integrate deep learning models into production systems. Required Qualifications: 3+ years of experience in deep learning and machine learning. Expertise in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong programming skills in Python and experience with GPU computing. Why Join Us: Competitive pay (₹1200/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and is typically up to Rs. 1200 per hour (if you work an average of 3 hours every day, that could be as high as Rs. …). Shape the future of AI with Soul AI!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders. What's in it for you: Pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: Remote; on-site at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore). Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required Skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to Have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be a part of the community: We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps: Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with the available projects. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
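To illustrate the kind of Python-based automated reporting referenced in this posting, here is a minimal pandas plus Plotly sketch that aggregates a small dataset and exports an interactive bar chart. The dataset, column names, and output file are invented for the example.

```python
# Minimal reporting sketch: aggregate with pandas, visualize with Plotly Express.
# The revenue figures and column names are synthetic, for illustration only.
import pandas as pd
import plotly.express as px

raw = pd.DataFrame({
    "region": ["North", "North", "South", "South", "West"],
    "revenue": [120, 90, 150, 110, 80],
})

# Aggregate revenue per region, as a scheduled report step might
summary = raw.groupby("region", as_index=False)["revenue"].sum()

fig = px.bar(summary, x="region", y="revenue",
             title="Revenue by region (synthetic data)")
fig.write_html("revenue_by_region.html")  # export an interactive report artifact
```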
Posted 1 month ago
1.0 - 4.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. We are seeking a Deep Learning Engineer with at least 3 years of experience to work on building state-of-the-art deep learning models for applications like computer vision, NLP, and recommendation systems. Key Responsibilities: Develop and train deep learning models for various use cases. Optimize model performance and ensure scalability. Collaborate with the data science and engineering teams to integrate deep learning models into production systems. Required Qualifications: 3+ years of experience in deep learning and machine learning. Expertise in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong programming skills in Python and experience with GPU computing. Why Join Us: Competitive pay (₹1200/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and is typically up to Rs. 1200 per hour (if you work an average of 3 hours every day, that could be as high as Rs. …). Shape the future of AI with Soul AI!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders. What's in it for you: Pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: Remote; on-site at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore). Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required Skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to Have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be a part of the community: We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps: Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with the available projects. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
1.0 - 4.0 years
6 - 11 Lacs
Mumbai
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. We are seeking a Deep Learning Engineer with at least 3 years of experience to work on building state-of-the-art deep learning models for applications like computer vision, NLP, and recommendation systems. Key Responsibilities: Develop and train deep learning models for various use cases. Optimize model performance and ensure scalability. Collaborate with the data science and engineering teams to integrate deep learning models into production systems. Required Qualifications: 3+ years of experience in deep learning and machine learning. Expertise in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong programming skills in Python and experience with GPU computing. Why Join Us: Competitive pay (₹1200/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and is typically up to Rs. 1200 per hour (if you work an average of 3 hours every day, that could be as high as Rs. …). Shape the future of AI with Soul AI!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders. What's in it for you: Pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: Remote; on-site at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore). Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required Skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to Have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be a part of the community: We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps: Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with the available projects. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Kolkata
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders. What's in it for you: Pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: Remote; on-site at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore). Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required Skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to Have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be a part of the community: We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps: Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with the available projects. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
1.0 - 4.0 years
6 - 11 Lacs
Kolkata
Work from Office
Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. We are seeking a Deep Learning Engineer with at least 3 years of experience to work on building state-of-the-art deep learning models for applications like computer vision, NLP, and recommendation systems. Key Responsibilities: Develop and train deep learning models for various use cases. Optimize model performance and ensure scalability. Collaborate with the data science and engineering teams to integrate deep learning models into production systems. Required Qualifications: 3+ years of experience in deep learning and machine learning. Expertise in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong programming skills in Python and experience with GPU computing. Why Join Us: Competitive pay (₹1200/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and is typically up to Rs. 1200 per hour (if you work an average of 3 hours every day, that could be as high as Rs. …). Shape the future of AI with Soul AI!
Posted 1 month ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description You are a strategic thinker passionate about driving solutions. You have found the right team. As an Associate within the VCG team, your primary responsibility will be to work on automation and redesign of existing implementations using Python. Alteryx skills are considered a plus. Job Responsibilities Automate Excel tasks by developing Python scripts with openpyxl, pandas, and xlrd, focusing on data extraction, transformation, and generating reports with charts and pivot tables. Design and deploy interactive web applications using Streamlit, enabling real-time data interaction and integrating advanced analytics. Use Matplotlib and Seaborn to create charts and graphs, adding interactive features for dynamic data exploration tailored to specific business needs. Design intuitive user interfaces with PyQt or Flask, integrating data visualizations and ensuring secure access through authentication mechanisms. Perform data manipulation and exploratory analysis using Pandas and NumPy, and develop data pipelines to maintain data quality and support analytics. Write scripts to connect to external APIs, process data in JSON and XML formats, and ensure reliable data retrieval with robust error handling. Collaborate with cross-functional teams to gather requirements, provide technical guidance, and ensure alignment on project goals, fostering open communication. Demonstrate excellent problem-solving skills and the ability to troubleshoot and resolve technical issues. Adhere to the control, governance, and development standards for intelligent solutions. Demonstrate strong communication skills and the ability to work collaboratively with different teams. Required Qualifications, Capabilities, And Skills Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience in Python programming and automation. Experience with Python libraries such as Pandas, NumPy, PyQt, Streamlit, Matplotlib, Seaborn, openpyxl, xlrd, Flask, PyPDF2, pdfplumber, and SQLite. Analytical and quantitative aptitude, and attention to detail. Strong verbal and written communication skills. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing.
Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success. Global Finance & Business Management works to strategically manage capital, drive growth and efficiencies, maintain financial reporting and proactively manage risk. By providing information, analysis and recommendations to improve results and drive decisions, teams ensure the company can navigate all types of market conditions while protecting our fortress balance sheet.
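As a rough sketch of the Excel-automation responsibility described in this posting (reading data with pandas and writing a summarized workbook via openpyxl), the snippet below is illustrative only; the file names, sheet layout, and column names are assumptions, not details from the role.

```python
# Minimal Excel automation sketch with pandas + openpyxl.
# "input.xlsx" / "summary.xlsx" and the column names are hypothetical.
import pandas as pd

# Read raw data from an existing workbook (openpyxl is used as the engine)
raw = pd.read_excel("input.xlsx", engine="openpyxl")

# Build a simple pivot-style summary of a hypothetical "pnl" column by "desk"
summary = raw.pivot_table(index="desk", values="pnl", aggfunc="sum").reset_index()

# Write both the raw data and the summary to a new workbook
with pd.ExcelWriter("summary.xlsx", engine="openpyxl") as writer:
    raw.to_excel(writer, sheet_name="raw", index=False)
    summary.to_excel(writer, sheet_name="summary", index=False)
```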
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Trainee – Intern Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month About the Role We are looking for a passionate and self-motivated Machine Learning Trainee (Intern) to join our team and gain hands-on experience in building and deploying machine learning models. As a trainee, you’ll work alongside experienced data scientists and engineers on real-world datasets and contribute to projects that have a meaningful impact. Key Responsibilities Assist in data preprocessing, cleaning, and feature engineering for various ML tasks. Support the development, training, testing, and evaluation of machine learning models. Participate in model deployment and performance monitoring. Conduct literature reviews and research to identify appropriate ML algorithms. Visualize and interpret results using tools like Matplotlib, Seaborn, or Power BI. Document workflows, experiments, and outcomes clearly and concisely. Requirements Pursuing or recently completed a degree in Computer Science, Data Science, Engineering, or a related field. Strong knowledge of Python and popular ML libraries (scikit-learn, pandas, NumPy, etc.). Basic understanding of machine learning concepts such as supervised and unsupervised learning, model evaluation, and overfitting. Familiarity with Jupyter Notebooks, version control (Git), and data handling. Good problem-solving and analytical skills. Preferred Qualifications Exposure to deep learning frameworks (TensorFlow, PyTorch) is a plus. Understanding of cloud platforms (AWS, GCP, or Azure) is a bonus. Prior hands-on experience through academic or personal ML projects. What You’ll Gain Hands-on experience with real-world ML problems and datasets. Mentorship from industry professionals. Certificate of Internship upon successful completion. A strong foundation to pursue advanced roles in AI and ML.
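For a sense of the day-to-day workflow described above, a minimal scikit-learn sketch of the split/train/evaluate loop an intern would practice might look like this. It uses a bundled toy dataset rather than any project data, so nothing here reflects the actual internship work.

```python
# Minimal supervised-learning sketch with scikit-learn: split, train, evaluate.
# Uses the bundled iris toy dataset; no project-specific data is implied.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Scale features, then fit a simple classifier
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```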
Posted 1 month ago
3.0 - 8.0 years
8 - 13 Lacs
Mumbai
Work from Office
In Scope of Position-based Promotions (INTERNAL only). Job Title: Capital & Liquidity Management Analyst. Location: Mumbai, India. Corporate Title: Analyst. Role Description: Group Capital Management plays a central role in the execution of DB's strategy. While Group Capital Management manages DB Group's solvency ratios (CET 1, T1, total capital ratio, leverage ratio, MREL/TLAC ratios, ECA ratio) together with business divisions and other infrastructure functions, EMEA Treasury manages, in addition to the solvency ratios of DB's EMEA entities, the liquidity ratios and Treasury Pool activities. Thereby, EMEA Treasury links into DB Group's strategy and manages execution at a local level. Treasury: Treasury at Deutsche Bank is responsible for the sourcing, management and optimization of liquidity and capital to deliver high-value risk management decisions. This is underpinned by a best-in-class, integrated and consistent Treasury risk framework, which enables Treasury to clearly identify the Bank's resource demands, transparently set incentives by allocating resource costs to businesses, and manage to evolving regulation. Treasury's fiduciary mandate, which encompasses the Bank's funding pools, asset and liability management (ALM) and fiduciary buffer management, supports businesses in delivering their strategic targets at global and local level. Further, Treasury manages the optimization of all financial resources through all lenses to implement the group's strategic objective and maximize long-term return on average tangible shareholders' equity (RoTE). The current role is part of the Treasury Office in DBC Mumbai. The role requires interactions with all key hubs, i.e. London, New York, Frankfurt and Singapore. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance. Your key responsibilities: The core deliverables for this role are: Write code and implement solutions based on specifications. Update, design and implement changes to existing software architecture. Build complex enhancements and resolve bugs. Build and execute unit tests and unit test plans. Implementation tasks are varied and complex, needing independent judgment. Build a technology solution that is sustainable, repeatable and agile. Align with the business and gain an understanding of different treasury functions. Your skills and experience: Must have strong development experience in Python and Oracle-based applications. Strong in algorithms, data structures and SQL. Some experience with integration/build/testing tools. Good to have: working knowledge of visualization libraries like Plotly, Matplotlib, Seaborn, etc.
Exposure to web service and web server/application server-based development would be an added advantage but is not mandatory. A basic understanding of balance sheet and Treasury concepts is desirable but not mandatory. Effective organizational and interpersonal skills. Self-starting – willingness to get things done. A highly motivated team player with a strong technical background and good communication skills. Urgency – prioritize based on the need of the hour. An aptitude to learn new tools and technologies. Engineering graduate / BS or MS degree, or equivalent experience relevant to the functional area. 3+ years of software engineering or related experience is a must. How we'll support you. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
4.0 years
0 Lacs
India
Remote
Job Title: AI/ML Engineer Experience: 4+ Years Location: Remote Job Type: Full-Time Job Summary: We are looking for a passionate and results-driven AI/ML Engineer with 4 years of experience in designing, building, and deploying machine learning models and intelligent systems. The ideal candidate should have solid programming skills, a strong grasp of data preprocessing, model evaluation, and MLOps practices. You will collaborate with cross-functional teams including data scientists, software engineers, and product managers to integrate intelligent features into applications and systems. Key Responsibilities: Design, develop, train, and optimize machine learning and deep learning models for real-world applications. Preprocess, clean, and transform structured and unstructured data for model training and evaluation. Implement, test, and deploy models using APIs or microservices (Flask, FastAPI, etc.) in production environments. Use ML libraries and frameworks like Scikit-learn, TensorFlow, PyTorch, Hugging Face, XGBoost, etc. Monitor and retrain models as needed for performance, accuracy, and drift mitigation. Collaborate with software and data engineering teams to operationalize ML solutions using MLOps tools. Stay updated with emerging trends in AI/ML and suggest enhancements to existing systems. Required Skills and Qualifications: Bachelor’s or Master’s in Computer Science, Engineering, AI/ML, Data Science, or related field. 4+ years of hands-on experience in machine learning model development and deployment. Strong experience in Python and libraries like Pandas, NumPy, Scikit-learn, Matplotlib/Seaborn. Experience with deep learning frameworks such as TensorFlow, PyTorch, or Keras. Proficiency in model deployment using Flask, FastAPI, Docker, and REST APIs. Experience with version control (Git), model versioning, and experiment tracking (MLflow, Weights & Biases). Familiarity with cloud platforms like AWS (SageMaker), Azure ML, or GCP AI Platform. Knowledge of databases (SQL/NoSQL) and data pipelines (Airflow, Spark, etc.). Strong problem-solving and debugging skills, with an analytical mindset.
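To illustrate the model-serving pattern named in the responsibilities (exposing a trained model through FastAPI), here is a minimal hedged sketch. The model file name and input schema are assumptions for the example; a real service would mirror the actual trained model's features.

```python
# Minimal FastAPI model-serving sketch. "model.joblib" and the feature list
# are hypothetical; a real service would match the actual trained model.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed pre-trained scikit-learn model

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
def predict(req: PredictRequest):
    # scikit-learn expects a 2D array: one row per sample
    prediction = model.predict([req.features])
    return {"prediction": prediction.tolist()}
```

Such a service would typically be run with uvicorn and wrapped in a Docker image, matching the deployment tooling listed in the requirements.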
Posted 1 month ago
6.0 - 8.0 years
0 Lacs
Greater Chennai Area
On-site
About BNP Paribas India Solutions Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, European Union’s leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10000 employees, to provide support and develop best-in-class solutions. About BNP Paribas Group BNP Paribas is the European Union’s leading bank and key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group’s commercial & personal banking and several specialised businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas as well as a solid and fast-growing business in Asia-Pacific. BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future, while ensuring the Group's performance and stability Commitment to Diversity and Inclusion At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit Discrimination and Harassment of any kind and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status etc. As a global Bank, we truly believe that inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in. About Business Line/Function The Intermediate Holding Company (“IHC”) program structured at the U.S. level across poles of activities of BNP Paribas provides guidance, supports the analysis, impact assessment and drives adjustments of the U.S. platform’s operating model due to the drastic changes introduced by the Enhanced Prudential Standards (“EPS”) for Foreign Banking Organizations (“FBOs”) finalized by the Federal Reserve in February 2014, implementing Section 165 of U.S. Dodd-Frank Act. 
The IT Transversal Team is part of the Information Technology Group, which works simultaneously on a wide range of projects arising from business and strategic initiatives, regulatory changes, and the reengineering of existing applications to improve functionality and efficiency. Job Title: Python Developer. Date: June-25. Department: ITG - Fresh. Location: Chennai, Mumbai. Business Line / Function: Finance Dedicated Solutions. Number of Direct Reports: NA. Directorship / Registration: NA. Position Purpose: The Python Developer will play a critical role in building and maintaining financial applications and tools that support data processing, analysis, and reporting within a fast-paced financial services environment. This position involves developing scalable and secure systems. The developer will collaborate with business analysts and finance users and/or finance BAs to translate complex business requirements into efficient, high-quality software solutions. A strong understanding of financial concepts, data integrity, and regulatory compliance is essential. The detailed responsibilities are mentioned below. Direct Responsibilities: Proficient in object-oriented programming, especially Python, with a minimum of 6-8 years of core Python development experience. Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation. Expertise in PySpark for large-scale data processing and loading into databases. Proficiency in data querying and manipulation with Oracle and PostgreSQL. Strong communication skills to effectively collaborate with team members and stakeholders. Familiarity with the Software Development Life Cycle (SDLC) process and its various stages, including experience with JIRA and Confluence. Technical & Behavioral Competencies: As per the direct responsibilities above, plus: Good analytical, problem-solving, and communication skills. Engage in technical discussions and help improve the system, processes, etc. Nice to Have: Familiarity with Plotly and Matplotlib for data visualization of large datasets. Skilled in API programming, handling JSON, CSV, and other unstructured data from various systems. Familiarity with JavaScript, CSS, and HTML. Experience with cloud architecture applications such as Dataiku or Databricks; competency with ETL tools. Knowledge of regulatory frameworks, RISK, CCAR, and GDPR. Skills Referential - Behavioural Skills: Ability to collaborate / teamwork; Critical thinking; Ability to deliver / results driven; Communication skills - oral & written. Transversal Skills: Analytical ability; Ability to develop and adapt a process; Ability to understand, explain and support change; Ability to develop others & improve their skills.
Education Level: Bachelor's degree or equivalent. Experience Level: At least 5 years.
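As a hedged sketch of the PySpark responsibility described above (large-scale transformation followed by loading into a relational database), the following illustrates the general pattern; the file path, table name, and JDBC connection details are placeholders, not values from the posting.

```python
# Minimal PySpark sketch: read, aggregate, and load into PostgreSQL via JDBC.
# The path, table name, and JDBC settings are placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-etl-sketch").getOrCreate()

# Read a (hypothetical) CSV extract with a header row
trades = spark.read.csv("/data/trades.csv", header=True, inferSchema=True)

# Example transformation: total notional per business date
daily = trades.groupBy("business_date").agg(
    F.sum("notional").alias("total_notional")
)

# Load the result into a PostgreSQL table through JDBC
daily.write.jdbc(
    url="jdbc:postgresql://db-host:5432/finance",
    table="daily_notional",
    mode="overwrite",
    properties={"user": "etl_user", "password": "***",
                "driver": "org.postgresql.Driver"},
)
```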
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Experience: 15 to 23 years. Location: Chennai / Bangalore. Primary skills: GenAI architecture, building GenAI solutions, coding, AI/ML background, data engineering, Azure or AWS cloud. Job Description: The Generative Solutions Architect will be responsible for designing and implementing cutting-edge generative AI models and systems. You will collaborate with data scientists, engineers, product managers, and other stakeholders to develop innovative AI solutions for applications including natural language processing (NLP), computer vision, and multimodal learning. This role requires a deep understanding of AI/ML theory, architecture design, and hands-on expertise with the latest generative models. Key Responsibilities: GenAI application conceptualization and design: understand the use cases under consideration, conceptualize the application flow, understand the constraints, and design accordingly to get the most optimized results. Deep knowledge of developing and implementing applications using Retrieval-Augmented Generation (RAG)-based models, which combine the power of large language models (LLMs) with information retrieval techniques (a minimal illustrative sketch follows this description). Prompt Engineering: Be adept at prompt engineering and its nuances, such as one-shot, few-shot, and chain-of-thought prompting; have hands-on knowledge of implementing agentic workflows and be aware of agentic AI concepts. NLP and Language Model Integration: Apply advanced NLP techniques to preprocess, analyze, and extract meaningful information from large textual datasets. Integrate and leverage large language models such as LLaMA 2/3, Mistral, or similar offline LLMs to address project-specific goals. Small LLMs / Tiny LLMs: Familiarity with SLMs / tiny LLMs such as Phi-3 and OpenELM, their performance characteristics and usage requirements, and the nuances of how they can be consumed by use-case applications. Collaboration with Interdisciplinary Teams: Collaborate with cross-functional teams, including linguists, developers, and subject matter experts, to ensure seamless integration of language models into the project workflow. Text / Code Generation and Creative Applications: Explore creative applications of large language models, including text/code generation, summarization, and context-aware responses. Skills & Tools: Programming Languages - Proficiency in Python for data analysis, statistical modeling, and machine learning. Machine Learning Libraries - Hands-on experience with machine learning libraries such as scikit-learn, Hugging Face, TensorFlow, and PyTorch. Statistical Analysis - Strong understanding of statistical techniques and their application in data analysis. Data Manipulation and Analysis - Expertise in data manipulation and analysis using Pandas and NumPy. Database Technologies - Familiarity with vector databases such as ChromaDB and Pinecone, as well as SQL and NoSQL databases; experience working with relational and non-relational databases. Data Visualization Tools - Proficiency in data visualization tools such as Tableau, Matplotlib, or Seaborn. Cloud Platforms - Familiarity with cloud platforms (AWS, Google Cloud, Azure) for model deployment and scaling. Communication Skills - Excellent communication skills with the ability to convey technical concepts to non-technical audiences.
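To make the RAG pattern referenced above concrete, here is a minimal, framework-free sketch of the retrieve-then-prompt step using cosine similarity over embeddings. The embed() function and the documents are stand-ins invented for the example, since the posting does not prescribe specific tooling; a real system would use an actual embedding model and a vector database such as those listed above.

```python
# Minimal retrieval-augmented generation (RAG) sketch without any specific framework.
# embed() is a placeholder for a real embedding model; documents/queries are illustrative.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder pseudo-embedding so the sketch runs end to end (not a real embedding).
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(384)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

documents = [
    "Policy A covers liquidity reporting requirements.",
    "Policy B covers model risk management.",
]
doc_vectors = [embed(d) for d in documents]

def build_prompt(question: str, top_k: int = 1) -> str:
    q_vec = embed(question)
    ranked = sorted(zip(documents, doc_vectors),
                    key=lambda pair: cosine(q_vec, pair[1]), reverse=True)
    context = "\n".join(doc for doc, _ in ranked[:top_k])
    # The retrieved context is prepended to the user question before calling an LLM.
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"

print(build_prompt("What covers liquidity reporting?"))
```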
Posted 1 month ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Are you passionate about turning complex data into actionable insights? We're looking for a creative and analytical Data Analyst to join our team and help shape data-driven strategies across the business. This is an exciting opportunity to work in a dynamic, collaborative environment where your models and analyses will directly influence high-impact decisions. Dassault Systèmes, the 3DEXPERIENCE Company, provides businesses and people with virtual universes to imagine sustainable innovations. Our 3DEXPERIENCE platform leverages the Company’s world-leading 3D software applications to transform the way products are designed, produced, and supported. With its online architecture, the 3DEXPERIENCE environment helps businesses to test and evaluate — anywhere in the development lifecycle of a product or service — the eventual experience they will deliver to their customers. In short, 3DEXPERIENCE powers the next-generation capabilities that drive today’s Experience Economy. Role Description & Responsibilities Analyze large, complex datasets to uncover insights, trends, and opportunities that drive strategic decisions Build and deploy predictive models using machine learning techniques (e.g., regression, classification, clustering) Perform data wrangling, preprocessing, and cleaning to prepare data for analysis and modeling Design and execute experiments (e.g., A/B testing) to evaluate hypotheses and business initiatives Communicate findings and recommendations clearly to both technical and non-technical stakeholders through reports, dashboards, and presentations Collaborate closely with data engineers, product managers, and business teams to define data requirements and deliver end-to-end solutions Stay up to date with industry trends, tools, and best practices in data science and analytics Qualifications Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Mathematics, or a related field 8+ years of experience in a data science or quantitative analytics role with a good understanding of machine learning and operations research Knowledge of R, SQL and Python; familiarity with Scala, Java or C++ is an asset Experience in data mining across different industry segments like Automobile, Infrastructure & Equipment, Aerospace & Defense, manufacturing, Mining etc Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib, Seaborn) Excellent communication skills, with the ability to explain complex data insights in a clear and actionable way What’s In It For You Professional Growth: Opportunity to advance within the organization Learning Environment: Access to training, workshops, and skill development Collaboration: Work closely with cross-functional teams Company Culture: Work in a culture of collaboration and innovation Interested? Click on "Apply" to upload your application documents. Inclusion statement As a game-changer in sustainable technology and innovation, Dassault Systèmes is striving to build more inclusive and diverse teams across the globe. We believe that our people are our number one asset and we want all employees to feel empowered to bring their whole selves to work every day. It is our goal that our people feel a sense of pride and a passion for belonging. As a company leading change, it’s our responsibility to foster opportunities for all people to participate in a harmonized Workforce of the Future.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Details Category: Data Science Location : Bangalore Experience Level: 4-8 Years Position Description We are looking for a Data Engineer who will play a pivotal role in transforming raw data into actionable intelligence through sophisticated data pipelines and machine learning deployment frameworks. They will collaborate across functions to understand business objectives, engineer data solutions, and ensure robust AI/ML model deployment and monitoring. This role is ideal for someone passionate about data science, MLOps, and building scalable data ecosystems in cloud environments. Key Responsibilities Data Engineering & Data Science: Preprocess structured and unstructured data to prepare for AI/ML model development. Apply strong skills in feature engineering, data augmentation, and normalization techniques. Manage and manipulate data using SQL, NoSQL, and cloud-based data storage solutions such as Azure Data Lake. Design and implement efficient ETL pipelines, data wrangling, and data transformation strategies. Model Deployment & MLOps Deploy ML models into production using Azure Machine Learning (Azure ML) and Kubernetes. Implement MLOps best practices, including CI/CD pipelines, model versioning, and monitoring frameworks. Design mechanisms for model performance monitoring, alerting, and retraining. Utilize containerization technologies (Docker/Kubernetes) to support deployment and scalability Business & Analytics Insights Work closely with stakeholders to understand business KPIs and decision-making frameworks. Analyze large datasets to identify trends, patterns, and actionable insights that inform business strategies. Develop data visualizations using tools like Power BI, Tableau, and Matplotlib to communicate insights effectively. Conduct A/B testing and evaluate model performance using metrics such as precision, recall, F1-score, MSE, RMSE, and model validation techniques. Desired Profile Proven experience in data engineering, AI/ML data preprocessing, and model deployment. Strong expertise in working with both structured and unstructured datasets. Hands-on experience with SQL, NoSQL databases, and cloud data platforms (e.g., Azure Data Lake). Deep understanding of MLOps practices, containerization (Docker/Kubernetes), and production-level model deployment. Technical Skills Proficient in ETL pipeline creation, data wrangling, and transformation methods. Strong experience with Azure ML, Kubernetes, and other cloud-based deployment technologies. Excellent knowledge of data visualization tools (Power BI, Tableau, Matplotlib). Expertise in model evaluation and testing techniques, including A/B testing and performance metrics. Soft Skills Strong analytical mindset with the ability to solve complex data-related problems. Ability to collaborate with cross-functional teams to understand business needs and provide actionable insights. Clear communication skills to convey technical details to non-technical stakeholders. If you are passionate to work in a collaborative and challenging environment, apply now!
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata. Must have skills: A solid understanding of retail industry dynamics, including key performance indicators (KPIs) such as sales trends, customer segmentation, inventory turnover, and promotions. Strong ability to communicate complex data insights to non-technical stakeholders, including senior management, marketing, and operational teams. Meticulous in ensuring data quality, accuracy, and consistency when handling large, complex datasets. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Strong proficiency in Python for data manipulation, statistical analysis, and machine learning (libraries like Pandas, NumPy, Scikit-learn). Expertise in supervised and unsupervised learning algorithms. Use advanced analytics to optimize pricing strategies based on market demand, competitor pricing, and customer price sensitivity. Good to have skills: Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud-based platforms such as AWS or Google Cloud for large-scale data processing. Experience with ETL (Extract, Transform, Load) processes and tools like Apache Airflow to automate data workflows. Familiarity with designing scalable and efficient data pipelines and architecture. Experience with tools like Tableau, Power BI, Matplotlib, and Seaborn to create meaningful visualizations that present data insights clearly. Job Summary: The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help our retail business make data-driven decisions. This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights. Roles & Responsibilities: Leverage Retail Knowledge: Utilize your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models. Use AI-driven techniques for personalization, demand forecasting, and fraud detection. Use advanced statistical methods to help optimize existing use cases and build new products to serve new challenges and use cases. Stay updated on the latest trends in data science and retail technology. Collaborate with executives, product managers, and marketing teams to translate insights into business actions. Professional & Technical Skills: Strong analytical and statistical skills. Expertise in machine learning and AI. Experience with retail-specific datasets and KPIs. Proficiency in data visualization and reporting tools. Ability to work with large datasets and complex data structures. Strong communication skills to interact with both technical and non-technical stakeholders. A solid understanding of the retail business and consumer behavior.
Programming Languages: Python, R, SQL, Scala
Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras
Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn
Big Data Technologies: Hadoop, Spark, AWS, Google Cloud
Databases: SQL, NoSQL (MongoDB, Cassandra)
Additional Information:
Experience: Minimum 3 year(s) of experience is required.
Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
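As a hedged illustration of the customer-segmentation and clustering work described in this posting, a minimal Pandas/scikit-learn sketch; the customer features, values, and cluster count are assumptions for demonstration only, not a specified retail dataset.

```python
# Minimal customer-segmentation sketch with Pandas and scikit-learn.
# The DataFrame columns and the choice of three clusters are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

customers = pd.DataFrame({
    "annual_spend":    [1200, 300, 4500, 800, 5200, 150],
    "visits_per_year": [24,   6,   52,   12,  60,   3],
    "avg_basket_size": [50,   50,  87,   67,  87,   50],
})

# Scale features so no single KPI dominates the distance metric.
scaled = StandardScaler().fit_transform(customers)

# Group customers into three hypothetical segments.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
customers["segment"] = kmeans.fit_predict(scaled)
print(customers)
```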
Posted 1 month ago
5.0 - 8.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to develop minimum viable products (MVPs) and comprehensive AI solutions that meet and exceed clients' expectations and add value to the business.
Primary Skills
Python (DSA)
Exploratory data analysis using Pandas, NumPy, Seaborn, Matplotlib (see the EDA sketch below)
TensorFlow, PyTorch, Scikit-learn
Large language models, RAG, MCP, Agentic AI
Azure Cognitive Services / Azure AI Foundry
Vertex AI
GitHub
Agile
Problem-solving skills
System design thinking
Observability
Mandatory Skills: Generative AI
Experience: 5-8 Years
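A minimal sketch of the exploratory data analysis listed under primary skills, using pandas, Seaborn, and Matplotlib; the CSV path and columns are hypothetical placeholders.

```python
# Minimal exploratory-data-analysis sketch with pandas, Seaborn, and
# Matplotlib. "data.csv" is a hypothetical input file.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("data.csv")          # hypothetical input file

print(df.describe())                  # summary statistics
print(df.isna().sum())                # missing-value counts per column

# Pairwise correlations of the numeric columns, shown as a heatmap.
corr = df.select_dtypes("number").corr()
sns.heatmap(corr, annot=True, cmap="coolwarm")
plt.title("Feature correlations")
plt.tight_layout()
plt.show()
```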
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Total experience: 12 to 15 years
Roles & Responsibilities
Design and develop innovative AI/ML solutions to address business challenges.
Create and implement AI strategies and long-term roadmaps for AI adoption.
Develop and deploy AI models, ensuring alignment with business needs.
Collaborate with teams to translate business requirements into AI solutions.
Stay updated on the latest AI/ML technologies and practices.
Mentor junior AI developers and conduct code reviews.
Work with data scientists and engineers to preprocess and prepare data (a minimal pipeline sketch follows this posting).
Select suitable AI/ML algorithms and deploy models in production.
Ensure data privacy, security, scalability, and performance of AI applications.
Requirements
Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, or related field.
Proficiency in programming languages like Python or Java.
Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn).
Expertise in data manipulation (Pandas, NumPy) and visualization (Matplotlib, Seaborn).
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and version control (Git).
Knowledge of databases (SQL, NoSQL) and development tools (Jupyter Notebook, IDEs).
Understanding of Agile methodologies (Scrum, Kanban).
Nice to Have
Proven experience in developing and deploying AI/ML solutions.
Skills in NLP, computer vision, and deep learning.
Proficiency in cloud tools like EC2, ASG, S3, RDS.
Strong problem-solving, communication, and collaboration skills.
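As one possible, not prescribed, shape of the preprocess-then-model work this role describes, a minimal scikit-learn Pipeline sketch on synthetic data; the data generator, scaler, and model choice are illustrative assumptions.

```python
# Minimal preprocess-and-train sketch using a scikit-learn Pipeline.
# Synthetic data stands in for whatever real dataset the role involves.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = Pipeline([
    ("scale", StandardScaler()),             # data preparation step
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```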
Posted 1 month ago
0.0 - 2.0 years
4 - 8 Lacs
Chennai
Remote
We're Hiring: Jr. Algorithm Engineer (Remote | Full-Time / Internship)
Are you a recent graduate or an early-career professional passionate about algorithms, Python, and solving real-world problems through code? We're looking for a Jr. Algorithm Engineer to join our team remotely. This is a great opportunity to work on meaningful projects while learning from experienced engineers.
What We're Looking For:
Strong understanding of Python & C# syntax, file handling, data types (see the Python sketch after this posting)
Basics of OOP, familiarity with libraries like NumPy, Pandas, Flask, Matplotlib
Knowledge of databases (SQLite, PostgreSQL)
Comfort with Git/GitHub, and IDEs like Visual Studio or PyCharm
Excellent mathematical aptitude, logical reasoning, and algorithmic thinking
Eligibility:
Science & Tech graduates/postgraduates (excluding business/management backgrounds)
0-2 years of experience
Interns & short-term candidates (min. 4 hours/day) are welcome
NOC required for student applicants
Engagement:
Full-time with an initial assessment period (up to 3 months)
Internships available for exceptional learners
100% remote
If you're eager to grow, solve problems, and build smart systems, we want to hear from you!
To apply, send your resume to hr@vectraglobal.com with the subject: Application – Jr. Algorithm Engineer
Let's build the future together.
#Hiring #AlgorithmEngineer #RemoteJobs #PythonJobs #EntryLevelJobs #EngineeringCareers #InternshipOpportunity #CSharp #TechJobs
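A minimal sketch touching the fundamentals this posting lists (file handling, basic OOP, SQLite); the class, file, and table names are hypothetical and chosen only for demonstration.

```python
# Minimal sketch: a small class (OOP), SQLite storage, and text-file output.
# "scores.db", "scores", and "scores.txt" are hypothetical names.
import sqlite3

class ScoreBook:
    """Stores name/score pairs in a local SQLite database."""

    def __init__(self, path="scores.db"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS scores (name TEXT, score INTEGER)"
        )

    def add(self, name, score):
        # Parameterized insert avoids SQL injection and type issues.
        self.conn.execute("INSERT INTO scores VALUES (?, ?)", (name, score))
        self.conn.commit()

    def export(self, filename="scores.txt"):
        # Basic file handling: write each stored row as a line of text.
        with open(filename, "w") as fh:
            for name, score in self.conn.execute("SELECT name, score FROM scores"):
                fh.write(f"{name},{score}\n")

book = ScoreBook()
book.add("asha", 92)
book.export()
```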
Posted 1 month ago