Jobs
Interviews

4844 Numpy Jobs - Page 47

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0.0 - 1.0 years

0 - 0 Lacs

Chandkheda, Ahmedabad, Gujarat

On-site

About Us: Red & White Education Pvt. Ltd., established in 2008, is Gujarat's top NSDC & ISO-certified institute focused on skill-based education and global employability.
Role Overview: We're hiring a full-time onsite AI, Machine Learning, and Data Science Faculty/Trainer with strong communication skills and a passion for teaching.
Key Responsibilities: Deliver high-quality lectures on AI, Machine Learning, and Data Science. Design and update course materials, assignments, and projects. Guide students on hands-on projects, real-world applications, and research work. Provide mentorship and support for student learning and career development. Stay updated with the latest trends and advancements in AI/ML and Data Science. Conduct assessments, evaluate student progress, and provide feedback. Participate in curriculum development and improvements.
Skills & Tools: Core Skills: ML, Deep Learning, NLP, Computer Vision, Business Intelligence, AI Model Development, Business Analysis. Programming: Python, SQL (Must), Pandas, NumPy, Excel. ML & AI Tools: Scikit-learn (Must), XGBoost, LightGBM, TensorFlow, PyTorch (Must), Keras, Hugging Face. Data Visualization: Tableau, Power BI (Must), Matplotlib, Seaborn, Plotly. NLP & CV: Transformers, BERT, GPT, OpenCV, YOLO, Detectron2. Advanced AI: Transfer Learning, Generative AI, Business Case Studies.
Education & Experience Requirements: Bachelor's/Master's/Ph.D. in Computer Science, AI, Data Science, or a related field. Minimum 1+ years of teaching or industry experience in AI/ML and Data Science. Hands-on experience with Python, SQL, TensorFlow, PyTorch, and other AI/ML tools. Practical exposure to real-world AI applications, model deployment, and business analytics.
For further information, please feel free to contact us at 7862813693 or via email at career@rnwmultimedia.edu.in.
Job Types: Full-time, Permanent. Pay: ₹30,000.00 - ₹35,000.00 per month. Benefits: Flexible schedule, leave encashment, paid sick time, paid time off, Provident Fund. Schedule: Day shift. Supplemental Pay: Performance bonus, yearly bonus. Experience: Teaching/Mentoring: 1 year (Required); AI: 1 year (Required); ML: 1 year (Required); Data science: 1 year (Required). Work Location: In person

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Data Analyst (Data Visualization & Reporting)
Key Responsibilities: Work with large datasets from various inspection sources (LiDAR, drones, thermal imaging). Build insightful dashboards and reports using tools like Power BI, Tableau, or Looker. Develop and deploy predictive models and statistical analyses to detect anomalies and prevent failures. Collaborate with engineering and operations teams to translate complex data into operational insights. Ensure high-quality, clean, and consistent data by implementing validation pipelines. Apply basic electrical domain knowledge (fault detection, insulator/conductor analysis, etc.) for enriched interpretations. Continuously improve analysis workflows and automate repetitive data processes.
Required Skills & Experience: 3+ years of hands-on experience as a Data Analyst/Data Scientist. Strong skills in SQL, Python (Pandas, NumPy), or R for data manipulation. Proficiency in data visualization tools: Power BI, Tableau, Looker, etc. Experience working with time-series data or sensor-based data from industrial sources. Exposure to predictive analytics, ML algorithms, or data modeling techniques. Solid understanding of data pipelines and best practices in data management. Familiarity with AWS/Azure/GCP for data processing is a plus. Background or familiarity with geospatial data or tools like QGIS is a bonus.
Preferred Qualifications: Degree in Data Science, Engineering, Computer Science, or a related field. Prior experience with inspection data, IoT, or utilities/power transmission systems. Knowledge of domain-specific platforms used for power line inspections. Certification in data analysis/ML platforms (Google Data Analytics, Microsoft DA, etc.).
Soft Skills: Strong analytical thinking and attention to detail. Ability to convert technical findings into business-focused insights. Team player with cross-functional collaboration experience. Effective written and verbal communication skills (ref:hirist.tech)
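For the anomaly-detection side of roles like this one, a minimal sketch of a common technique for sensor time-series, a rolling z-score (the data here is synthetic and the threshold illustrative; a real pipeline would add validation and tuned parameters):

```python
import numpy as np
import pandas as pd

# Synthetic sensor feed: a steady signal with one injected fault spike.
rng = np.random.default_rng(0)
readings = pd.Series(rng.normal(50.0, 1.0, 200))
readings.iloc[120] = 80.0  # injected anomaly

# Rolling z-score against the *previous* 30 readings, so a spike cannot
# inflate its own baseline statistics.
window = 30
baseline = readings.shift(1).rolling(window)
z = (readings - baseline.mean()) / baseline.std()

anomalies = readings[z.abs() > 5]
print(anomalies.index.tolist())  # the injected spike at index 120 is flagged
```

Comparing each reading to its recent local history (rather than the global mean) keeps the detector robust to slow drift, which is typical of industrial sensor data.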

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Skills: Python, PyTorch, AWS, Data Visualization, Machine Learning, ETL
Experience: 2-4 Years. Location: Bangalore (In-office). Employment Type: Full-Time
About The Role: We are hiring a Junior Data Scientist to join our growing data team in Bangalore. You'll work alongside experienced data professionals to build models, generate insights, and support analytical solutions that solve real business problems.
Responsibilities: Assist in data cleaning, transformation, and exploratory data analysis (EDA). Develop and test predictive models under guidance from senior team members. Build dashboards and reports to communicate insights to stakeholders. Work with cross-functional teams to implement data-driven initiatives. Stay updated with modern data tools, algorithms, and techniques.
Requirements: 2-4 years of experience in a data science or analytics role. Proficiency in Python or R, SQL, and key data libraries (Pandas, NumPy, Scikit-learn). Experience with data visualization tools (Matplotlib, Seaborn, Tableau, Power BI). Basic understanding of machine learning algorithms and model evaluation. Strong problem-solving ability and eagerness to learn. Good communication and teamwork skills.
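As a flavor of the day-to-day cleaning and EDA work this role describes, a minimal pandas sketch on a made-up table (column names hypothetical):

```python
import numpy as np
import pandas as pd

# Toy dataset standing in for raw business data (hypothetical columns).
df = pd.DataFrame({
    "region": ["north", "south", "north", "south", None],
    "revenue": [120.0, np.nan, 95.0, 210.0, 130.0],
})

# Basic cleaning: drop rows missing the grouping key, then impute numeric
# gaps with the column median.
df = df.dropna(subset=["region"])
df["revenue"] = df["revenue"].fillna(df["revenue"].median())

# Simple EDA: summary statistics per region.
summary = df.groupby("region")["revenue"].agg(["mean", "count"])
print(summary)
```

Median imputation and per-group summaries are only a starting point; in practice the imputation strategy is chosen per column after inspecting the missingness pattern.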

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Job Title: Ecommerce SME Analyst
Summary: We are seeking an experienced and driven Ecommerce SME Analyst with 8 years of expertise in digital analytics and ecommerce data. In this role, you will analyze clickstream and user behavior data to uncover actionable insights that enhance user experience, optimize conversion funnels, and inform strategic product decisions. You will work extensively with Adobe Analytics, Python, SQL, and BigQuery, and collaborate with cross-functional teams to drive data-informed growth across our ecommerce platform.
Key Responsibilities: Clickstream & Ecommerce Analysis: Analyze ecommerce clickstream data using Adobe Analytics to understand user journeys, identify drop-off points, and recommend optimizations for improved engagement and conversion. User Behavior Insights: Segment and analyze user behavior to uncover patterns, preferences, and opportunities for personalization and targeting. Data Extraction & Transformation: Use SQL and Python to query, clean, and transform large datasets from BigQuery and other data sources. Visualization & Reporting: Build dashboards and reports using visualization tools (e.g., Tableau, Looker, Power BI) to communicate insights clearly to stakeholders. Product Strategy Support: Partner with product and analytics teams to translate data insights into actionable recommendations that shape the product roadmap. KPI Definition & Tracking: Define and monitor key performance indicators (KPIs) to evaluate the impact of product features and site changes. A/B Testing Analysis: Design and analyze A/B tests to assess the effectiveness of new features and user experience improvements. Cross-Functional Collaboration: Work closely with product managers, marketers, and engineers to understand data needs and deliver timely, relevant insights. Data Quality Assurance: Ensure data accuracy and integrity through validation checks and collaboration with data engineering teams.
Continuous Learning: Stay current with industry trends, tools, and best practices in ecommerce analytics and data science.
Required Qualifications: Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field. Minimum 2 years of experience in ecommerce analytics. Strong hands-on experience with Adobe Analytics for tracking and analyzing user behavior. Proficiency in SQL and Python (including libraries like Pandas, NumPy, Matplotlib, Seaborn). Experience working with Google BigQuery or similar cloud-based data warehouses. Familiarity with data visualization tools (e.g., Tableau, Looker, Power BI). Strong analytical and problem-solving skills. Excellent communication skills to present findings to technical and non-technical audiences. Ability to work independently and collaboratively in a fast-paced environment.
Key Skills: Adobe Analytics, Python (Pandas, NumPy, Matplotlib, Seaborn), SQL, Google BigQuery, Ecommerce Analytics, Clickstream & User Behavior Analysis, Data Visualization & Reporting, A/B Testing, Product Strategy & KPI Tracking, Communication & Collaboration, Data Quality & Validation
Key Details: Job Function: IT Software: Software Products & Services. Industry: IT-Software. Specialization: Information Systems. Employment Type: Full Time. Mandatory Skills: Adobe Analytics, Python, SQL, BigQuery, ecommerce domain.
About Company: LTIMindtree Ltd. LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen & Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. Job Id: 71587467
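The A/B testing analysis this role calls for typically reduces to a two-proportion z-test on conversion rates; a stdlib-only sketch (conversion counts here are made up):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts at 5.8% vs the control's 5.0%, 20k users per arm.
z, p = two_proportion_z(1000, 20000, 1160, 20000)
print(round(z, 2), round(p, 5))
```

In practice the sample size per arm is fixed in advance via a power calculation, and libraries such as statsmodels provide the same test ready-made.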

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Security represents the most critical priorities for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end to end, simplified solutions. The Microsoft Security organization accelerates Microsoft’s mission and bold ambitions to ensure that our company and industry is securing digital technology platforms, devices, and clouds in our customers’ heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world. The Defender Experts (DEX) Research team is at the forefront of Microsoft’s threat protection strategy, combining world-class hunting expertise with AI-driven analytics to protect customers from advanced cyberattacks. Our mission is to move protection left—disrupting threats early, before damage occurs—by transforming raw signals into intelligence that powers detection, disruption, and customer trust. We’re looking for a passionate and curious Data Scientist to join this high-impact team. In this role, you'll partner with researchers, hunters, and detection engineers to explore attacker behavior, operationalize entity graphs, and develop statistical and ML-driven models that enhance DEX’s detection efficacy. Your work will directly feed into real-time protections used by thousands of enterprises and shape the future of Microsoft Security. This is an opportunity to work on problems that matter—with cutting-edge data, a highly collaborative team, and the scale of Microsoft behind you. 
Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Responsibilities: Understand complex cybersecurity and business problems, translate them into well-defined data science problems, and build scalable solutions. Design and build robust, large-scale graph structures to model security entities, behaviors, and relationships. Develop and deploy scalable, production-grade AI/ML systems and intelligent agents for real-time threat detection, classification, and response. Collaborate closely with Security Research teams to integrate domain knowledge into data science workflows and enrich model development. Drive end-to-end ML lifecycle: from data ingestion and feature engineering to model development, evaluation, and deployment. Work with large-scale graph data: create, query, and process it efficiently to extract insights and power models.
Lead initiatives involving Graph ML, Generative AI, and agent-based systems, driving innovation across threat detection, risk propagation, and incident response. Collaborate closely with engineering and product teams to integrate solutions into production platforms. Mentor junior team members and contribute to strategic decisions around model architecture, evaluation, and deployment. Qualifications Bachelor’s or Master’s degree in Computer Science, Statistics, Applied Mathematics, Data Science, or a related quantitative field 5+ years of experience applying data science or machine learning in a real-world setting, preferably in security, fraud, risk, or anomaly detection Proficiency in Python and/or R, with hands-on experience in data manipulation (e.g., Pandas, NumPy), modeling (e.g., scikit-learn, XGBoost), and visualization (e.g., matplotlib, seaborn) Strong foundation in statistics, probability, and applied machine learning techniques Experience working with large-scale datasets, telemetry, or graph-structured data Ability to clearly communicate technical insights and influence cross-disciplinary teams Demonstrated ability to work independently, take ownership of problems, and drive solutions end-to-end Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
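As an illustration of the entity-graph idea the role above centers on, a minimal sketch over hand-made sign-in events (all names hypothetical; production systems operate on large-scale telemetry and graph stores, not in-memory dicts):

```python
from collections import defaultdict

# Hypothetical sign-in events: (user, machine) pairs.
events = [
    ("alice", "host-1"), ("alice", "host-2"),
    ("bob", "host-2"), ("mallory", "host-1"),
    ("mallory", "host-2"), ("mallory", "host-3"),
]

# Bipartite entity graph: users on one side, machines on the other.
graph = defaultdict(set)
for user, host in events:
    graph[user].add(host)
    graph[host].add(user)

# A crude behavioral signal: users touching unusually many machines,
# the kind of fan-out pattern lateral movement produces.
fan_out = {u: len(graph[u]) for u, _ in events}
suspicious = max(fan_out, key=fan_out.get)
print(suspicious)
```

Real detections would combine many such graph features (degree, reachability, temporal bursts) and feed them to statistical or ML models rather than a single max.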

Posted 3 weeks ago

Apply

3.0 - 5.0 years

1 - 6 Lacs

Noida

Work from Office

Collaborate with teams to understand business needs, design and implement AI solutions, conduct thorough testing, optimize algorithms, stay updated with AI advancements, integrate technologies, and mentor the team for innovation.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Us: Planful is the pioneer of financial performance management cloud software. The Planful platform, which helps businesses drive peak financial performance, is used around the globe to streamline business-wide planning, budgeting, consolidations, reporting, and analytics. Planful empowers finance, accounting, and business users to plan confidently, close faster, and report accurately. More than 1,500 customers, including Bose, Boston Red Sox, Five Guys, Grafton Plc, Gousto, Specialized and Zappos rely on Planful to accelerate cycle times, increase productivity, and improve accuracy. Planful is a private company backed by Vector Capital, a leading global private equity firm. Learn more at planful.com. About the Role: We are looking for self-driven, self-motivated, and passionate technical experts who would love to join us in solving the hardest problems in the EPM space. If you are capable of diving deep into our tech stack to glean through memory allocations, floating point calculations, and data indexing (in addition to many others), come join us. Requirements: 5+ years in a mid-level Python Engineer role, preferably in analytics or fintech. Expert in Python (Flask, Django, pandas, NumPy, SciPy, scikit-learn) with hands-on performance tuning. Familiarity with AI-assisted development tools and IDEs (Cursor, Windsurf) and modern editor integrations (VS Code + Cline). Exposure to libraries supporting time-series forecasting. Proficient in SQL for complex queries on large datasets. Excellent analytical thinking, problem-solving, and communication skills. Nice to have: Shape financial time-series data: outlier detection/handling, missing-value imputation, techniques for small/limited datasets. Profile & optimize Python code (vectorization, multiprocessing, cProfile). Monitor model performance and iterate to improve accuracy. Collaborate with data scientists and stakeholders to integrate solutions. 
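The profiling and optimization work this role mentions (vectorization, multiprocessing, cProfile) can be illustrated with a toy percentage-change transform, first as a Python loop of the kind cProfile typically flags, then as one NumPy expression (data synthetic):

```python
import numpy as np

# A toy financial-series transform: period-over-period percentage change.
values = np.random.default_rng(1).normal(100.0, 5.0, 100_000)

# Loop version: one Python-level iteration per element.
def pct_change_loop(x):
    out = [0.0]
    for i in range(1, len(x)):
        out.append((x[i] - x[i - 1]) / x[i - 1])
    return np.array(out)

# Vectorized version: a single array expression, no Python-level loop.
def pct_change_vec(x):
    out = np.empty_like(x)
    out[0] = 0.0
    out[1:] = (x[1:] - x[:-1]) / x[:-1]
    return out

# Both produce identical results; the vectorized one is typically
# orders of magnitude faster on arrays this size.
assert np.allclose(pct_change_loop(values), pct_change_vec(values))
```

Running each function under `cProfile.run(...)` makes the difference concrete: the loop spends its time in the interpreter, the vectorized version in compiled NumPy code.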
Why Planful: Planful exists to enrich the world by helping our customers and our people achieve peak performance. To foster the best-in-class work we're so proud of, we've created a best-in-class culture, including: 2 Volunteer days, Birthday PTO, and quarterly company Wellness Days; 3 months supply of diapers and meal deliveries for the first month of your Maternity/Paternity leave; annual Planful Palooza, our in-person, company-wide culture event; company-wide Mentorship program with Executive sponsorship of CFO and Manager-specific monthly training programs; Employee Resource Groups such as Women of Planful, LatinX at Planful, Parents of Planful, and many more. We encourage our teammates to bring their authentic selves to the team, and have full support in creating new ERGs & communities along the way.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

1 - 6 Lacs

Pune

Work from Office

Return to Work Program for Python Professionals: Location: Offline (Baner, Pune) Experience Required: 3+ years Program Duration: 3 Months Program Type: Free Training + Job Assistance (Not a Job Guarantee) Note: Candidate should be ready to learn new technologies. Restart Your Career in High-Demand Tech Fields! If you've experienced a career gap, layoff, or lost a job due to unforeseen circumstances, VishvaVidya's Return to Work Program offers a unique platform to relaunch your tech career with confidence. What We Offer: Free Technical Training: Upskill in Python, Generative AI, Data Science, and other relevant tools. Placement Assistance: Get connected with top hiring partners actively hiring returnees. Hands-on Learning: Work on real-world projects to bridge your experience gap. Mentorship & Confidence Building: Structured sessions to support your transition back to work. Zero Cost: The program is 100% free, fully sponsored by our hiring partners. Eligibility: Minimum 3 years of prior experience in Python development Career break of 6 months to 7 years welcome Eagerness to upskill and return to the workforce Availability for offline sessions in Baner, Pune Why Join VishvaVidya's Return to Work Program? Tailored for career restart seekers Trusted by top tech employers Industry-relevant curriculum curated by expert mentors Build portfolio-worthy projects and prepare for real-world job roles Why Choose VishvaVidya? We believe in second chances and career growth for everyone. Our fully sponsored program equips you with the skills, confidence, and opportunities needed to successfully re-enter the workforce. Apply today. Your next career chapter starts here!

Posted 3 weeks ago

Apply

2.0 years

3 - 8 Lacs

Delhi

On-site

Coder must have 2 years of experience: Python (primarily for data retrieval, NumPy, pandas); JavaScript (frameworks like React and Angular); knowledge of GIS (QGIS or ArcGIS) for spatial data. Job Types: Full-time, Permanent. Pay: ₹354,035.99 - ₹851,937.64 per year. Benefits: Paid sick time, Provident Fund. Schedule: Day shift, fixed shift, morning shift. Work Location: In person. Application Deadline: 22/07/2025

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune

On-site

Experience: 5+ Years. Employment Type: Full-Time
About the Role: We are looking for a highly skilled Data Scientist with a strong background in Machine Learning, Statistical Modeling, and hands-on experience working with Generative AI technologies. The ideal candidate will have deep technical expertise in agentic AI systems, RAG (Retrieval-Augmented Generation) architectures, and the ability to implement, fine-tune, and evaluate large language models such as OpenAI, LLaMA, or Cortex. This is a high-impact role where you'll be building intelligent, scalable, and context-aware AI solutions that solve real-world business problems.
Key Responsibilities: Design and implement agentic AI systems that leverage memory, planning, and tool-use capabilities. Develop and deploy RAG-based architectures integrating internal data sources with LLMs to enable knowledge-grounded responses. Apply advanced statistical modeling and machine learning techniques to extract insights and predict outcomes from large datasets. Integrate and fine-tune Generative AI models like OpenAI (GPT), LLaMA, or Cortex for custom use cases. Build intelligent pipelines using Python for data preprocessing, model training, and evaluation. Collaborate cross-functionally with product, engineering, and business teams to drive AI/ML adoption. Ensure scalability, accuracy, and ethical usage of AI models in production environments.
Required Skills and Qualifications: Bachelor's or Master's in Computer Science, Data Science, Statistics, or a related field. 5+ years of experience in ML/AI engineering or data science roles. Strong experience with Python, NumPy, Pandas, Scikit-learn, and ML libraries like TensorFlow or PyTorch. Hands-on with Gen AI platforms such as OpenAI, LLaMA, Anthropic, or Cortex AI. Deep understanding of RAG pipelines, vector databases (e.g., FAISS, Pinecone, Weaviate), and embedding techniques. Experience working on agentic AI frameworks like LangChain, AutoGPT, or OpenAgents.
Solid grounding in statistical analysis, A/B testing, and predictive modeling. Familiarity with prompt engineering, fine-tuning, and evaluation metrics for LLMs. Good understanding of data privacy, model bias, and responsible AI practices.
Nice to Have: Experience with tools like LangChain, Haystack, or LLM orchestration frameworks. Exposure to cloud platforms (AWS, GCP, Azure) for deploying ML models. Experience working with MLOps pipelines for productionalizing AI solutions.
At TulaPi (pronounced tuu-la-pie), we’re building more than just a company – we’re crafting a movement. A movement that’s redefining what’s possible with data, machine learning, and AI, all powered by Snowflake's industry-leading platform. Think of us as the brainy rebels of the data world, bold enough to dream big and skilled enough to make it happen. We’re not just here to follow trends – we’re here to set them. From solving the most complex data challenges to building next-gen ML/AI solutions, we’re going to chart new territory every day. This is where the best talent comes to push boundaries, flex creative muscles, and make a real impact. At TulaPi, you won’t just be working with cutting-edge tools and technologies – you’ll be shaping the future of what they can do. Whether you’re an architect of the cloud, an engineer with a knack for unlocking AI’s potential, or a strategist ready to disrupt the status quo, we’re looking for trailblazers like you to join our journey.
Why Join Us? Big Challenges, Bigger Impact: Work on transformative projects that push the limits of what’s possible in ML/AI. Smart is the Standard: Collaborate with some of the brightest minds in the industry. Global Vision, Local Vibes: Be part of a team that’s global in its ambition but intimate in its culture. Tools of Tomorrow: Gain access to the most advanced data and AI platforms, including Snowflake, and make them dance to your tune.
Your Playground: A startup environment where your ideas, creativity, and innovation won’t just be welcomed – they’ll be celebrated. Get the chance to work closely with the CEO and CTO, with exposure to strategic decision-making. TulaPi is more than a workplace; it’s a destination for those who want their work to matter, their ideas to fly, and their careers to soar. If you're ready to work hard, dream bigger, and redefine the future of ML/AI, welcome home. Website: Tulapi.ai LinkedIn: https://www.linkedin.com/company/tulapi-ai/
Datafortune Software Solution is a 12+ year old company based out of Pune. Our head office is in Atlanta, Georgia, US. We are around 150+ people and work with US clients. Enterprise Data Management: Data Engineer, Snowflake, Azure, Power BI, Tableau, SQL Server, SQL Server DBA. Application side: Python, Dot Net, Angular, Flutter, Node, React, PHP, Vue JS, JavaScript, automation testing, Selenium, load testing, etc. Website: https://datafortune.com/ LinkedIn: https://www.linkedin.com/company/datafortune/posts/?feedView=all
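The retrieval step of the RAG architectures this role describes can be sketched with cosine similarity over toy vectors (the "embeddings" below are hand-made; a real pipeline uses a learned embedding model and a vector store such as FAISS or Pinecone):

```python
import numpy as np

# Toy document "embeddings" (hand-made for illustration only).
docs = {
    "refund policy": np.array([0.9, 0.1, 0.0]),
    "shipping times": np.array([0.1, 0.9, 0.1]),
    "warranty terms": np.array([0.8, 0.2, 0.1]),
}
query = np.array([1.0, 0.0, 0.1])  # pretend-embedding of a refund question

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Retrieval step of RAG: rank documents by similarity to the query,
# then pass the top hits to the LLM as grounding context.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked[0])
```

The retrieved chunks are then interpolated into the LLM prompt, which is what makes the generated answer "knowledge-grounded" rather than purely parametric.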

Posted 3 weeks ago

Apply

3.0 years

2 - 13 Lacs

India

On-site

Role Overview We are seeking a skilled and self-motivated AI/ML Engineer to join our growing team. You will be responsible for designing, developing, training, and deploying machine learning models and AI systems to solve practical, real-life problems. Key Responsibilities Design and implement machine learning models for real-world business applications. Analyze and preprocess large datasets from various structured and unstructured sources. Train, validate, and fine-tune models using classical ML and/or deep learning methods. Collaborate with product and engineering teams to integrate models into production systems. Build end-to-end ML pipelines (data ingestion → model training → deployment). Monitor and improve model performance over time with live feedback data. Document model architecture, performance metrics, and deployment processes. Required Skills and Experience 3–5 years of hands-on experience in AI/ML engineering. Strong knowledge of machine learning algorithms (classification, regression, clustering, etc.). Experience with deep learning frameworks (TensorFlow, PyTorch, Keras). Proficient in Python, with experience in libraries like scikit-learn, pandas, NumPy, etc. Experience with NLP, computer vision, or time-series models is a plus. Understanding of MLOps practices and tools (MLflow, DVC, Docker, etc.). Exposure to deploying ML models via REST APIs or cloud services (AWS/GCP/Azure). Familiarity with data versioning, model monitoring, and re-training workflows. Preferred Qualifications Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field. Published work on GitHub or contributions to open-source AI/ML projects. Certifications in AI/ML, cloud computing, or data engineering. 
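The ingestion → training → deployment pipeline this role describes can be sketched at its smallest with scikit-learn's Pipeline on synthetic data (the dataset and score are illustrative only):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a business dataset: two informative features
# and a linearly separable label.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Preprocessing and model as one object: the same fitted pipeline can be
# serialized and served behind a REST API, which keeps training and
# inference transformations identical.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("model", LogisticRegression()),
])
pipe.fit(X_tr, y_tr)
print(f"holdout accuracy: {pipe.score(X_te, y_te):.2f}")
```

Bundling the scaler into the pipeline is the detail that matters for deployment: serving code cannot accidentally skip or re-fit the preprocessing step.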
Contact Us: Email: careers@crestclimbers.com Phone: +91 94453 30496 Website: www.crestclimbers.com Office: Kodambakkam, Chennai. Job Types: Full-time, Permanent. Schedule: Day shift. Pay: ₹298,197.62 - ₹1,398,461.03 per year. Work Location: In person. Expected Start Date: 21/07/2025

Posted 3 weeks ago

Apply

8.0 years

4 - 8 Lacs

Bengaluru

On-site

Company Description: WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description: Roles and responsibilities: The ideal candidate should have strong communication skills to effectively engage with both technical and non-technical stakeholders. The candidate will be responsible for developing end-to-end task/workflow automation pipelines in Python using a wide range of data sources (both on-premise and cloud). The candidate should have strong working experience in transforming Excel-based manual processes into fully automated Python-based processes, incorporating strong governance around them. The person should be competent in Python programming and possess high levels of analytical skills: data pre-processing, engineering, data pipeline development and automation. In-depth knowledge of libraries such as pandas, numpy, scikit-learn, openpyxl, pyxlsb, TensorFlow, PyTorch, etc. Well-versed with Python coding standards and formatting conventions to ensure maintainable, scalable, and reusable modules. Build and automate workflows using Microsoft Power Platform (Power BI, Power Apps, Power Automate). Integrate systems and automate workflows using WTW Unify; knowledge of Dataiku is a plus. Apply GenAI techniques to enhance data exploration, automate content generation, and support decision-making. Ensure data quality, governance, and compliance with organizational standards. Well-versed with CI/CD pipelines using Azure DevOps (ADO) for seamless deployment and integration of data science solutions. Experience working on Posit Workbench and Posit Connect will be an added advantage. Stay updated with the latest trends in AI, machine learning, and data engineering.
Tools/Tech experience: Mandatory: Python (data processing, engineering & automation), SQL, and proficiency with version control systems like ADO/Bitbucket. Preferred: R programming, Posit Workbench, R Shiny. Experience processing large amounts of data using Big Data technologies is preferred. Familiarity with Microsoft Power Platform tools. Knowledge of Dataiku is a plus. Familiarity with the WTW Unify platform and its applications in analytics. Knowledge of Generative AI models and frameworks (e.g., GPT, DALL·E, Llama). Knowledge of data visualization tools and techniques is a plus.
Functional/Other expertise: Relevant experience: 8+ years of experience using the Python programming language for end-to-end data pre-processing, transformation and automation. Experience in the Insurance domain preferred (e.g. Finance, Actuarial).
Qualifications: Educational Qualification: Masters in Statistics/Mathematics/Economics/Econometrics from Tier 1 institutions, or BE/B-Tech, MCA or MBA from Tier 1 institutions
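A minimal illustration of the Excel-to-Python migration this role centers on, using a hand-made frame in place of a real workbook (column names hypothetical; a production flow would ingest via pd.read_excel or openpyxl and add validation and governance steps):

```python
import pandas as pd

# Stand-in for a workbook tab that an analyst would pivot by hand.
raw = pd.DataFrame({
    "policy": ["A", "A", "B", "B", "B"],
    "quarter": ["Q1", "Q2", "Q1", "Q1", "Q2"],
    "claims": [2, 1, 3, 1, 4],
})

# The manual Excel pivot, reproduced as a repeatable, testable step.
pivot = raw.pivot_table(index="policy", columns="quarter",
                        values="claims", aggfunc="sum", fill_value=0)
print(pivot)
```

Once the pivot lives in code, the governance the posting asks for (input checks, versioning, scheduled runs) attaches naturally, which is exactly what a spreadsheet cannot offer.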

Posted 3 weeks ago

Apply

0 years

4 - 7 Lacs

Surat

On-site

Job description Primary role Writing efficient, reusable, testable, and scalable code. Developing backend components to enhance performance and responsiveness, server-side logic and platform, and statistical learning models. Integrate user-facing elements into applications. Improve functionality of existing systems. Working with Python libraries like Pandas, NumPy, etc. Creating models for AI- and ML-based features. Coordinate with internal teams to understand user requirements and provide technical solutions. Job Overview (8098) Experience 30 Month(s). City Surat. Qualification M.SC, MCA, PGDCA Area of Expertise PYTHON Prefer Gender Male Function AI & ML Audio / Video Profile NA

Posted 3 weeks ago

Apply

3.0 years

4 - 10 Lacs

Noida

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior Python Developer – Client – Merck, H and M, TIAA, MINT Job Description Bachelor’s or master’s degree with 3+ years of strong Python development experience Design, develop, and maintain high-performance web applications using Python and related frameworks. Strong understanding of Python OOP, data types, data structures and algorithms, exception handling, decorators, generators, iterators, and automation. Strong understanding of Python libraries (Pandas, TensorFlow, NumPy, SciPy) Experience in Cloud Azure / AWS Develop, optimize, and manage complex APIs (RESTful or GraphQL). Collaborate with cross-functional teams to define, design, and ship new features. Troubleshoot and resolve advanced technical issues in development and production environments. Conduct technical evaluations of new tools and frameworks, recommending their adoption when appropriate. Stay ahead of emerging trends in Python development, ensuring the team remains at the forefront of innovation. Advanced proficiency in Python and frameworks like Django, Flask, or FastAPI. Good understanding of Database Postgres / MySQL & ORM libraries, i.e. SQLAlchemy or any ORM library Understanding of code repository tools, i.e. Git, SVN Strong understanding of DevOps principles (Docker, Kubernetes and microservices) EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. 
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
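Since this posting calls out decorators, generators, and iterators explicitly, a compact refresher on the two most commonly probed constructs (the function names are illustrative):

```python
import functools

def log_calls(fn):
    """Decorator: counts calls to the wrapped function without changing its behaviour."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return fn(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

def squares(n):
    """Generator: yields squares lazily instead of materialising a list."""
    for i in range(n):
        yield i * i

@log_calls
def total_squares(n):
    return sum(squares(n))  # consumes the generator one value at a time

print(total_squares(4))  # 0 + 1 + 4 + 9 = 14
```

`functools.wraps` preserves the wrapped function's name and docstring, which matters for debugging and introspection in larger codebases.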

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Job Description Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve. Pay And Benefits Competitive compensation, including base pay and annual incentive Comprehensive health and life insurance and well-being benefits, based on location Pension / Retirement benefits Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee). The Impact You Will Have In This Role Being a member of the Data Services Platform Delivery team means being part of a technology team with rich, diverse skill sets and a phenomenal, hard-working, committed culture. Whether the project initiatives call for Snowflake, Java, the Spring suite, Python, data analytics, Unix, cloud computing or database skills, we are there for each other, collaborating and helping each other achieve the common goal. We are embarking on an incredible multi-year Data Transformation journey, and we are looking for best-of-breed software engineers to join us on this journey. We're looking for a passionate engineer to help design and build platforms that power the next generation of data products. 
You’ll work within the Data Platform Squad to develop secure, resilient, scalable solutions in Snowflake, Java or Python, delivered to the marketplace via multiple delivery mechanisms. The solution will be built with the latest cloud tools and industry standards. This role offers strong opportunities for growth driven by your performance and contributions to our strategic goals. Qualifications Minimum 10 years of related experience Bachelor's degree (preferred) or equivalent experience Primary Responsibilities: Act as a technical expert on the development of one or more applications, including designing and developing robust, scalable platforms that transform data into a useful format for analysis, enhance data flow, and enable efficient consumption and analysis of data. Partner with enterprise teams to identify and deploy efficient hosting environments. Research and evaluate technical solutions consistent with DTCC technology standards. Contribute expertise to the design of components or individual programs and participate in unit and functional testing. Collaborate with teams across the software development lifecycle, including those responsible for testing, troubleshooting, operations and production support. Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately. Write complex, performance-optimal SQL queries against Snowflake. Convert logical data models to physical data models, DDL, roles and views, and enhance them as required. Participate in daily scrums, project-related meetings, backlog grooming, sprint planning and retrospective sessions. Ensure operational readiness of the services and meet the commitments to our customers regarding reliability, availability, and performance. 
Be responsible for the technical quality of projects by ensuring that key technical procedures, standards, quality control mechanisms, and tools are properly used, including performing root cause analyses for technical problems and conducting quality reviews. Work across functions and teams - we don't only work on code that we own; we work with other teams on the successful delivery of data products every day. Talents Needed For Success We recognize that expertise in software development can be gained through many different paths. Below are the key skills we value for this role - not all are required, but the ones you bring should be demonstrated at an exceptional level to succeed in this position. Application development in Java and related technologies: Java, J2EE, Spring (Boot, Batch, Core, MVC, JDBC), JUnit, AWS SDKs and/or Python, Polars/Pandas, Snowpark, NumPy, SciPy, AWS SDKs, pytest; static analyzers Sonar/Fortify with gating for code quality. Hands-on experience with database architecture, import, export, performance techniques, data modeling, database table design and writing complex SQL queries. Solid understanding of Unix/Linux OS, including shell scripting, Perl and/or Python. Solid understanding of Agile, CI/CD, Jenkins, DevOps practices and tools like Maven, Jenkins, Nexus, Fortify, Liquibase, etc. Exposure to design & architecture will be a plus. Demonstrates strong analytical and interpersonal skills. Experienced in working with a geographically separated (onshore + offshore) team. Must understand the Agile development process and be committed to delivering assignments as planned and agreed. Ability to collaborate effectively with other developers and co-workers, including distributed team members. Strong communication skills, desire to learn and contribute, self-starter and phenomenal teammate. 
Nice to have Proven background in database concepts - data management, governance, modelling, and development. Snowflake architecture, SnowSQL, Snowpark, Snowpipe, Tasks, Streams, Dynamic Tables, Time Travel, the optimizer, data sharing, and stored procedures. Design patterns in Java/Python, cloud design patterns. Time series analysis for financial data. Experience with any BI tools such as QuickSight, Looker, PowerBI is a plus. Familiarity with container technologies like Docker, Kubernetes, OpenShift will be a plus. Proven understanding of Agile, CI/CD, DevOps practices and tools. AWS experience. Excellent oral and written English. Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. About Us With over 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry. From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, enhancing performance and driving efficiency for thousands of broker/dealers, custodian banks and asset managers. 
Industry owned and governed, the firm innovates purposefully, simplifying the complexities of clearing, settlement, asset servicing, transaction processing, trade reporting and data services across asset classes, bringing enhanced resilience and soundness to existing financial markets while advancing the digital asset ecosystem. In 2024, DTCC’s subsidiaries processed securities transactions valued at U.S. $3.7 quadrillion and its depository subsidiary provided custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $99 trillion. DTCC’s Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 25 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn , X , YouTube , Facebook and Instagram . DTCC proudly supports Flexible Work Arrangements favoring openness and gives people freedom to do their jobs well, by encouraging diverse opinions and emphasizing teamwork. When you join our team, you’ll have an opportunity to make meaningful contributions at a company that is recognized as a thought leader in both the financial services and technology industries. A DTCC career is more than a good way to earn a living. It’s the chance to make a difference at a company that’s truly one of a kind. Learn more about Clearance and Settlement by clicking here . About The Team IT Architecture and Enterprise Services are responsible for enabling digital transformation of DTCC. The group manages complexity of the technology landscape within DTCC and enhances agility, robustness and security of the technology footprint. It does so by serving as the focal point for all technology architectural activities in the organization as well as engineering a portfolio of foundational technology assets to enable our digital transformation.
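The "complex, performance-optimal SQL" this role asks for typically means window functions and similar analytic constructs. A toy illustration using SQLite's in-memory engine; the OVER clause shown runs unchanged on Snowflake, and the trades table is invented for the example:

```python
import sqlite3

# Invented trades table; the OVER clause below runs unchanged on Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (account TEXT, trade_date TEXT, amount REAL);
INSERT INTO trades VALUES
  ('ACC1', '2024-01-01', 100), ('ACC1', '2024-01-02', 250),
  ('ACC2', '2024-01-01', 400), ('ACC2', '2024-01-03', 50);
""")
rows = conn.execute("""
    SELECT account, trade_date, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY trade_date) AS running_total
    FROM trades
    ORDER BY account, trade_date
""").fetchall()
for row in rows:
    print(row)
```

A windowed running total computed in one pass like this is usually far cheaper than the self-join it replaces, which is the kind of optimisation the listing is alluding to.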

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

India

Remote

Location: Remote / Hybrid Experience: 2–6 years (or strong project/internship experience) Employment Type: Full-Time Department: AI & Software Systems Key Responsibilities Design and maintain end-to-end MLOps pipelines: from data ingestion to model deployment and monitoring. Containerize ML models and services using Docker for scalable deployment. Develop and deploy APIs using FastAPI to serve real-time inference for object detection, segmentation, and mapping tasks. Automate workflows using CI/CD tools like GitHub Actions or Jenkins. Manage cloud infrastructure on AWS: EC2, S3, Lambda, SageMaker, CloudWatch, etc. Collaborate with AI and GIS teams to integrate ML outputs into mapping dashboards. Implement model versioning using DVC/Git, and maintain structured experiment tracking using MLflow or Weights & Biases. Ensure secure, scalable, and cost-efficient model hosting and API access. Required Skills Programming: Python (must), Bash/Shell scripting ML Frameworks: PyTorch, TensorFlow, OpenCV MLOps Tools: MLflow, DVC, GitHub Actions, Docker (must), Kubernetes (preferred) Cloud Platforms: AWS (EC2, S3, SageMaker, IAM, Lambda) API Development: FastAPI (must), Flask (optional) Data Handling: NumPy, Pandas, GDAL, Rasterio Monitoring: Prometheus, Grafana, AWS CloudWatch Preferred Experience Hands-on with AI/ML models for image segmentation, object detection (YOLOv8, U-Net, Mask R-CNN). Experience with geospatial datasets (satellite imagery, drone footage, LiDAR). Familiarity with PostGIS, QGIS, or spatial database management. Exposure to DevOps principles and container orchestration (Kubernetes/EKS). Soft Skills Problem-solving mindset with a system design approach. Clear communication across AI, software, and domain teams. Ownership of the full AI deployment lifecycle. Education Bachelor’s or Master’s in Computer Science, Data Science, AI, or equivalent. Certifications in AWS, MLOps, or Docker/Kubernetes (bonus).
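Containerising a FastAPI inference service, as the responsibilities above describe, usually reduces to a short Dockerfile. A sketch only; the base image, module path (`app.main:app`), and port are assumptions, not this employer's actual setup:

```dockerfile
# Sketch only: base image, module path (app.main:app), and port are assumptions.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
# uvicorn serves the FastAPI app object defined in app/main.py
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Copying `requirements.txt` before the source keeps the dependency layer cached between builds, which matters once model containers are rebuilt on every CI run.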

Posted 3 weeks ago

Apply

8.0 - 11.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Roles & Responsibilities Key Responsibilities Design, develop, and optimize Machine Learning & Deep Learning models using Python and libraries such as TensorFlow, PyTorch, and Scikit-learn. Work with Large Language Models (e.g., GPT, BERT, T5) to solve NLP tasks such as semantic search, summarization, chatbots, conversational agents, and document intelligence. Lead the development of scalable AI solutions, including data preprocessing, embedding generation, vector search, and prompt orchestration. Build and manage vector databases and metadata stores to support high-performance semantic retrieval and contextual memory. Implement caching, queuing, and background processing systems to ensure performance and reliability at scale. Conduct independent R&D to implement cutting-edge AI methodologies, evaluate open-source innovations, and prototype experimental solutions. Apply predictive analytics and statistical techniques to mine actionable insights from structured and unstructured data. Build and maintain robust data pipelines and infrastructure for end-to-end ML model training, testing, and deployment. Collaborate with cross-functional teams to integrate AI solutions into business processes. Contribute to the MLOps lifecycle, including model versioning, CI/CD, performance monitoring, retraining strategies, and deployment automation. Stay updated with the latest developments in AI/ML by reading academic papers and experimenting with novel tools or frameworks. Required Skills & Qualifications Proficient in Python, with hands-on experience in key ML libraries: TensorFlow, PyTorch, Scikit-learn, and HuggingFace Transformers. Strong understanding of machine learning fundamentals, deep learning architectures (CNNs, RNNs, transformers), and statistical modeling. Practical experience working with and fine-tuning LLMs and foundation models. Deep understanding of vector search, embeddings, and semantic retrieval techniques. 
Expertise in predictive modeling, including regression, classification, time series, clustering, and anomaly detection Comfortable working with large-scale datasets using Pandas, NumPy, SciPy etc. Experience with cloud platforms (AWS, GCP, or Azure) for training and deployment is a plus Preferred Qualifications Master’s or Ph.D. in Computer Science, Machine Learning, Data Science, or related technical discipline. Experience with MLOps tools and workflows (e.g., Docker, Kubernetes, MLflow, SageMaker, Vertex AI). Ability to build and expose APIs for models using FastAPI, Flask, or similar frameworks. Familiarity with data visualization (Matplotlib, Seaborn) and dashboarding (Plotly) tools or equivalent Working knowledge of version control, experiment tracking, and team collaboration Experience 8-11 Years Skills Primary Skill: AI/ML Development Sub Skill(s): AI/ML Development Additional Skill(s): TensorFlow, NLP, Pytorch, Large Language Models (LLM) About The Company Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
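The vector-search and semantic-retrieval work described above boils down to cosine similarity over an embedding matrix. A brute-force NumPy sketch; real systems swap in a vector database, and the 2-d "embeddings" here are toy values:

```python
import numpy as np

def top_k(query: np.ndarray, index: np.ndarray, k: int = 2):
    """Brute-force semantic retrieval: rank documents by cosine similarity to a query."""
    q = query / np.linalg.norm(query)
    m = index / np.linalg.norm(index, axis=1, keepdims=True)
    scores = m @ q                        # cosine similarity, since both sides are unit-norm
    order = np.argsort(scores)[::-1][:k]  # highest similarity first
    return order, scores[order]

docs = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])  # toy 2-d "embeddings"
ids, scores = top_k(np.array([1.0, 0.05]), docs)
print(ids, scores)
```

Normalising both sides up front is the standard trick: it turns cosine similarity into a single matrix-vector product, which is also what approximate-nearest-neighbour indexes optimise.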

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced Core Python Developer for an immediate on-site opportunity in Hyderabad. This role involves working with cutting-edge technologies in AI/ML, cloud platforms, and Python-based frameworks, delivering scalable solutions for our client. About the Role This role involves working with cutting-edge technologies in AI/ML, cloud platforms, and Python-based frameworks, delivering scalable solutions for our client. Responsibilities Develop and maintain robust applications using Python frameworks (Django, Flask, or Pyramid). Design and implement data pipelines and workflows utilizing Numpy, Scipy, Pandas, Dask, and other advanced libraries. Work with NLP libraries (spaCy, NLTK) and machine learning frameworks (scikit-learn, PyTorch). Develop, test, and maintain RESTful APIs. Collaborate in an Agile development environment, following best coding and architectural practices. Integrate with SQL/NoSQL databases for optimal data management. Implement solutions on cloud platforms such as AWS, Google Cloud, or Azure. Use version control and collaborative workflows with Git. Build and optimize AI/ML pipelines, potentially using Langchain or similar tools. Required Skills Core Python Development, AI/ML technologies, Cloud platforms, Python frameworks, RESTful APIs, Agile development, SQL/NoSQL databases, Version control with Git. Preferred Skills Experience with NLP libraries, machine learning frameworks, and building AI/ML pipelines.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As an AI Developer with 5-8 years of experience, you will be based in Pune with a hybrid working model. You should be able to join immediately or within 15 days. Your primary responsibility will be to develop and maintain Python applications, focusing on API building, data processing, and transformation. You will utilize LangGraph to design and manage complex language model workflows and work with machine learning and text processing libraries to deploy agents. Your must-have skills include proficiency in Python programming with a strong understanding of object-oriented programming concepts, and extensive experience with data manipulation libraries like Pandas and NumPy to ensure clean, efficient, and maintainable code. Additionally, you will develop and maintain real-time data pipelines and microservices to ensure seamless data flow and integration across systems. On the SQL side, you are expected to have a strong understanding of basic SQL query syntax, including joins, WHERE, and GROUP BY clauses. Good-to-have skills include practical experience in AI development applications, knowledge of parallel processing and multi-threading/multi-processing to optimize data fetching and execution times, familiarity with SQLAlchemy or similar libraries for data fetching, and experience with AWS cloud services such as EC2, EKS, Lambda, and Postgres. If you are looking to work in a dynamic environment where you can apply your skills in Python, SQL, Pandas, NumPy, agentic AI development, CI/CD pipelines, AWS, and Generative AI, this role might be the perfect fit for you.
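The multi-threading point in the good-to-have list refers to overlapping I/O waits when fetching data. A minimal sketch with `concurrent.futures`; `fetch` is a stand-in for a real database or HTTP call:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(record_id: int) -> dict:
    """Stand-in for an I/O-bound call (DB query, HTTP request); illustrative only."""
    return {"id": record_id, "value": record_id * 10}

def fetch_all(ids):
    # Threads overlap the waiting time of I/O-bound calls; map() preserves input order
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(fetch, ids))

rows = fetch_all([1, 2, 3])
print(rows)
```

Threads help here because I/O releases the GIL; for CPU-bound transforms, `ProcessPoolExecutor` (multi-processing) is the matching tool.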

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Software Test Engineer at Trading Technologies, you will play a crucial role in ensuring the quality and functionality of our cutting-edge trading applications. Your main responsibilities will include designing, developing, and executing test plans and test cases based on software requirements and technical design specifications. You will collaborate with the Development team to investigate and debug software issues, recommend product improvements to the Product Management team, and constantly enhance your skills alongside a team of testers. Your expertise in testing multi-asset trade analytics applications, automated testing using Python or similar programming languages, and experience with cloud-based systems like AWS will be invaluable in this role. Knowledge of trade analytics standards such as pre- and post-Trade TCA, SEC & FINRA rule compliance, MiFID II, and PRIIPs analytics will be highly advantageous. Additionally, your understanding of performance and load testing for SQL queries and data pipelines will be essential. At Trading Technologies, we offer a competitive benefits package to support your well-being and growth. You will have access to medical, dental, and vision coverage, generous paid time off, parental leave, professional development opportunities, and wellness perks. Our hybrid work model allows for a balance between in-office collaboration and remote work, fostering team cohesion, innovation, and mentorship opportunities. Join our forward-thinking and inclusive culture that values diversity and promotes collaborative teamwork. Trading Technologies is a leading Software-as-a-Service (SaaS) technology platform provider in the global capital markets industry. Our TT platform connects to major international exchanges and liquidity venues, offering advanced tools for trade execution, order management, market data solutions, risk management, and more to a diverse client base. 
Join us in shaping the future of trading technology and delivering innovative solutions to market participants worldwide.
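Automated testing of trade-analytics logic, as this role involves, often comes down to plain pytest-style assertions over pure functions. A sketch; the slippage formula and sign convention here are illustrative, not Trading Technologies' actual TCA definitions:

```python
def slippage_bps(arrival_price: float, exec_price: float, side: str) -> float:
    """Implementation shortfall in basis points; positive means the trade cost money.
    Illustrative TCA metric, not any vendor's actual definition."""
    signed = exec_price - arrival_price if side == "buy" else arrival_price - exec_price
    return signed / arrival_price * 10_000

def test_slippage_is_symmetric():
    # Buying above arrival and selling below arrival should cost the same in bps
    buy = slippage_bps(100.0, 100.05, "buy")
    sell = slippage_bps(100.0, 99.95, "sell")
    assert abs(buy - sell) < 1e-6 and abs(buy - 5.0) < 1e-6

test_slippage_is_symmetric()
print("ok")
```

Note the tolerance-based comparison: floating-point price arithmetic makes exact equality checks a classic source of flaky analytics tests.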

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a data science expert, you will be responsible for developing strategies and solutions to address various problems using cutting-edge machine learning, deep learning, and GenAI techniques. Your role will involve leading a team of data scientists to ensure timely and high-quality delivery of project outcomes. You will analyze large and complex datasets across different domains, perform exploratory data analysis, and select features to build and optimize classifiers and regressors. Enhancing data collection procedures, ensuring data quality and accuracy, and presenting analytical results to technical and non-technical stakeholders will be key aspects of your job. You will create custom reports and presentations with strong data visualization skills to effectively communicate analytical conclusions to senior company officials and other stakeholders. Proficiency in data mining, EDA, feature selection, model building, and optimization using machine learning and deep learning techniques is essential. Your primary skills should include a deep understanding and hands-on experience with data science and machine learning techniques, algorithms for supervised and unsupervised problems, NLP, computer vision, and GenAI. You should also have expertise in building deep learning models for text and image analytics using architectures such as ANNs, CNNs, LSTMs, transfer learning, and encoder-decoder models. Proficiency in programming languages such as Python and R, and in common data science tools like NumPy, Pandas, Matplotlib, and frameworks like TensorFlow, Keras, PyTorch, and XGBoost, is required. Experience with statistical inference, hypothesis testing, and cloud platforms like Azure/AWS, as well as deploying models in production, will be beneficial for this role. Excellent communication and interpersonal skills are necessary to convey complex analytical concepts to diverse stakeholders effectively.
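The feature-selection and model-building loop described above usually starts with standardisation before any distance-based model sees the data. A NumPy-only sketch using a nearest-centroid classifier on toy data; this is illustrative, not a production model:

```python
import numpy as np

def standardize(X):
    """Z-score each feature so no single scale dominates distance calculations."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / sigma

def nearest_centroid_predict(X, centroids):
    # Assign each sample to its closest class centroid (a minimal classifier)
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

X = np.array([[1.0, 200.0], [1.2, 210.0], [5.0, 800.0], [5.2, 790.0]])
y = np.array([0, 0, 1, 1])
Xs = standardize(X)
centroids = np.stack([Xs[y == c].mean(axis=0) for c in (0, 1)])
pred = nearest_centroid_predict(Xs, centroids)
print(pred)
```

Without the z-scoring step, the second feature (hundreds) would swamp the first (units) in every distance computation, which is why scaling precedes model choice in the EDA workflow.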

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description We are seeking a high-impact AI/ML Engineer to lead the design, development, and deployment of machine learning and AI solutions across vision, audio, and language modalities. You'll be part of a fast-paced, outcome-oriented AI & Analytics team, working alongside data scientists, engineers, and product leaders to transform business use cases into real-time, scalable AI systems. This role demands strong technical leadership, a product mindset, and hands-on expertise in Computer Vision, Audio Intelligence, and Deep Learning. Responsibilities: Architect, develop, and deploy ML models for multimodal problems, including vision (image/video), audio (speech/sound), and NLP tasks. Own the complete ML lifecycle: data ingestion, model development, experimentation, evaluation, deployment, and monitoring. Leverage transfer learning, foundation models, or self-supervised approaches where suitable. Design and implement scalable training pipelines and inference APIs using frameworks like PyTorch or TensorFlow. Collaborate with MLOps, data engineering, and DevOps to productionize models using Docker, Kubernetes, or serverless infrastructure. Continuously monitor model performance and implement retraining workflows to ensure accuracy over time. Stay ahead of the curve on cutting-edge AI research (e.g., generative AI, video understanding, audio embeddings) and incorporate innovations into production systems. Write clean, well-documented, and reusable code to support agile experimentation and long-term platform growth. Qualifications: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field. 5-8 years of experience in AI/ML Engineering, with at least 3 years in applied deep learning. Technical Skills Languages: Expert in Python; good knowledge of R or Java is a plus. ML/DL Frameworks: Proficient with PyTorch, TensorFlow, Scikit-learn, ONNX. 
Computer Vision: Image classification, object detection, OCR, segmentation, tracking (YOLO, Detectron2, OpenCV, MediaPipe). Audio AI: Speech recognition (ASR), sound classification, audio embedding models (Wav2Vec2, Whisper, etc.). Data Engineering: Strong with Pandas, NumPy, SQL, and preprocessing pipelines for structured and unstructured data. NLP/LLMs: Working knowledge of Transformers, BERT/LLaMA, and the Hugging Face ecosystem is preferred. Cloud & MLOps: Experience with AWS/GCP/Azure, MLflow, SageMaker, Vertex AI, or Azure ML. Deployment & Infrastructure: Experience with Docker, Kubernetes, REST APIs, serverless ML inference. CI/CD & Version Control: Git, DVC, ML pipelines, Jenkins, Airflow, etc. Soft Skills & Competencies Strong analytical and systems thinking; able to break down business problems into ML components. Excellent communication skills; able to explain models, results, and decisions to non-technical stakeholders. Proven ability to work cross-functionally with designers, engineers, product managers, and analysts. Demonstrated bias for action, rapid experimentation, and iterative delivery of impact. (ref:hirist.tech)
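Audio pipelines of the kind this role covers (ASR, sound classification) start by slicing the waveform into overlapping frames before computing features such as spectrograms. A NumPy sketch; the frame length and hop size here are arbitrary toy values:

```python
import numpy as np

def frame_signal(x: np.ndarray, frame_len: int, hop: int) -> np.ndarray:
    """Slice a 1-D signal into overlapping frames, the first step of most audio pipelines."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

signal = np.arange(10, dtype=float)   # stand-in for raw audio samples
frames = frame_signal(signal, frame_len=4, hop=2)
print(frames.shape)
```

With a hop of half the frame length, consecutive frames overlap by 50%, the usual compromise between time resolution and redundancy before a model like Wav2Vec2 or Whisper consumes the features.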

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Proficiency with Python (Pandas, NumPy), SQL, and Java. Experience with LLMs, LangChain, and Generative AI technologies. Familiarity with ML frameworks (TensorFlow, PyTorch) and data engineering tools (Spark, Kafka). Microservices, CI/CD, ML. Strong data analysis skills and ability to present findings to both technical and non-technical stakeholders. Proficient understanding of key data engineering concepts, such as data lakes, columnar formats, ETL tools, and BI tools. Knowledge in Machine Learning, NLP, recommender systems, personalization, segmentation, microservices architecture and API development. Ability to adapt to a fast-paced, dynamic work environment and learn new technologies quickly. Ability to work in a team or independently. Excellent written & verbal communication skills. Solid critical thinking and questioning skills. High degree of flexibility - willing to fill in the gaps rather than relying on others. Strong communication skills, especially in presenting data insights. Flexibility, problem-solving, and a proactive approach in a fast-paced environment. (ref:hirist.tech)

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

We are looking for an experienced Django Rest Framework Developer to contribute to the enhancement and development of backend API services for a large-scale ERP software utilizing a microservices architecture. Your proficiency in Python, Django, Django Rest Framework, and relational databases such as PostgreSQL will be crucial for this role. The primary responsibility involves designing and implementing efficient, scalable, and secure APIs while leveraging tools like Celery, Kafka, Redis, Django Channels, Pandas, and NumPy. An in-depth understanding of ERP systems is essential as you will be working on modules that drive business-critical operations. Key Responsibilities: - Design, develop, and maintain backend APIs using Django Rest Framework for a large-scale ERP system. - Architect and implement a microservices architecture to ensure decoupled, scalable, and efficient backend services. - Integrate PostgreSQL databases for storing ERP data, focusing on data integrity and query optimization. - Implement background tasks and scheduling using Celery and Celery Beat for managing asynchronous workflows. - Leverage Kafka for messaging and event-driven architecture to enable reliable communication between microservices. - Utilize Redis for caching, session management, and API performance optimization. - Develop real-time communication features through Django Channels to handle WebSockets and async functionalities. - Manage data pipelines and perform data transformations using Pandas and NumPy. - Write clean, maintainable, and well-documented code following security and API design best practices. - Collaborate with frontend teams, database administrators, and DevOps engineers for seamless deployment and integration of services. - Troubleshoot and optimize API performance to enhance the efficiency of backend operations. - Participate in code reviews, testing, and documentation to ensure high-quality software delivery. 
- Stay updated on emerging technologies and industry trends relevant to ERP and backend development. Required Skills & Qualifications: - 3+ years of backend development experience using Django and Django Rest Framework. - Strong Python proficiency and familiarity with microservices architecture. - Extensive experience with PostgreSQL or other relational databases, including optimized query writing and database management. - Experience in handling asynchronous tasks with Celery and Celery Beat. - Familiarity with Kafka for building event-driven systems and inter-service communication. - Expertise in Redis for caching, pub/sub messaging, and system performance enhancement. - Hands-on experience with Django Channels for real-time communication and WebSocket management. - Proficiency in Pandas and NumPy for data processing, manipulation, and analysis. - Understanding of ERP systems and their modules to build relevant APIs efficiently. - Knowledge of RESTful API design principles, security best practices, and scalability patterns. - Proficiency in Docker and containerized deployments for both development and production environments. - Experience with Git and collaborative development workflows. - Strong problem-solving skills, debugging, and backend issue troubleshooting. - Experience with CI/CD pipelines for automated testing and deployment. - Familiarity with Kubernetes for managing containerized applications in production. - Knowledge of GraphQL for building flexible APIs. - Previous experience working on ERP software or other large-scale enterprise applications. 
Job Type: Full-time
Location Type: In-person
Schedule: Fixed shift
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
- Have you worked on ERPs before? (Yes/No)
- Have you led backend teams or managed end-to-end backend projects, from architecture to deployment? (Yes/No)
Experience:
- Python: 3 years (Required)
- Django: 3 years (Required)
Work Location: In person
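The Redis caching responsibility above follows the common cache-aside pattern: check the cache, fall back to the database on a miss, then populate the cache. A hedged sketch of that pattern, where a plain dict stands in for a real Redis client (in production you would use redis-py with a TTL; every name here is hypothetical):

```python
# Illustrative cache-aside pattern as used with Redis for API performance.
# DictCache mimics a tiny get/set subset of a Redis client; the "database"
# lookup is faked. All names are invented for this sketch.
import json

class DictCache:
    """In-memory stand-in for a Redis client (get/set only)."""
    def __init__(self):
        self._store = {}
    def get(self, key):
        return self._store.get(key)
    def set(self, key, value):
        self._store[key] = value

cache = DictCache()
db_hits = 0  # counts how often we fall through to the "database"

def fetch_product(product_id: int) -> dict:
    """Return a product record, serving from cache when possible."""
    global db_hits
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)   # cache hit: no database work
    db_hits += 1
    product = {"id": product_id, "name": f"item-{product_id}"}  # fake DB row
    cache.set(key, json.dumps(product))  # populate cache for next time
    return product

first = fetch_product(7)
second = fetch_product(7)  # served from cache; the database is not hit again
```

The same shape applies when the cache is Redis and the lookup is a PostgreSQL query; serializing to JSON keeps the cached value storable as a Redis string.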

Posted 3 weeks ago


1.0 - 5.0 years

0 Lacs

Karnataka

On-site

You have solid working experience with the Python-based Django and Flask frameworks, along with expertise in developing microservices-based designs and architectures. Your strong programming knowledge extends to JavaScript, HTML5, Python, RESTful APIs, and gRPC APIs. You have hands-on experience with object-oriented concepts in Python and are familiar with libraries such as NumPy, Pandas, Open3D, OpenCV, and Matplotlib. You also have knowledge of MySQL, Postgres, and MSSQL databases, as well as 3D geometry.

Your expertise includes familiarity with SSO/OpenID Connect/OAuth authentication protocols, version control systems such as GitHub/BitBucket/GitLab, and continuous integration and continuous deployment (CI/CD) pipelines. You have a basic understanding of image processing, data analysis, and data science, coupled with strong communication skills and the ability to analyze problems from multiple perspectives. As a proactive team player, you offer new ideas, suggestions, and solutions, and give constructive analysis of your team members' ideas. You thrive in a fast-paced, Agile software development environment. Good to have: knowledge of other programming languages such as C and C++, basics of machine learning, and exposure to NoSQL databases and cloud platforms such as GCP/AWS/Azure.

In the area of Software Engineering, you apply scientific methods to analyze and solve software engineering problems, develop and apply software engineering practices and knowledge, and exercise original thought and judgement. You supervise the technical and administrative work of other software engineers while enhancing your own skills and expertise within the discipline. Working collaboratively with other software engineers and stakeholders, you contribute positively to project performance and make informed decisions based on situational understanding.
With more than a year of relevant work experience, you have a solid understanding of programming concepts, software design, and software development principles. You consistently deliver accurate and reliable results with minimal supervision, work across a variety of tasks and problems, and apply your skills and knowledge effectively. By organizing your time to meet task deadlines, collaborating with team members toward common goals, and making decisions based on understanding rather than rules alone, you have a direct and positive impact on project performance.
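The NumPy and 3D-geometry skills this posting lists could be exercised by something as small as rotating a point set about an axis. A minimal sketch, with all names and values chosen purely for illustration:

```python
# Minimal NumPy 3D-geometry example: rotate points 90 degrees about the
# Z axis using a standard right-handed rotation matrix. Illustrative only.
import numpy as np

def rotation_z(theta: float) -> np.ndarray:
    """Rotation matrix about the Z axis (right-handed, theta in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0],
                     [s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

# Two points as row vectors; applying R to each row is points @ R.T.
points = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 2.0]])
rotated = points @ rotation_z(np.pi / 2).T
```

Libraries such as Open3D build on exactly this kind of array-based geometry, representing point clouds as N x 3 NumPy arrays.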

Posted 3 weeks ago
