
113 Jupyter Notebook Jobs - Page 2

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Bitkraft Technologies LLP is searching for a skilled React Native Developer to become a valuable member of our software engineering team. You will primarily focus on utilizing the ReactJS framework to develop web projects for our custom services business. As a React Native Developer, having proficiency in other JavaScript-related frontend frameworks and third-party libraries is crucial. A strong passion for ensuring a top-notch user experience, along with an eye for high-quality visual design, will be highly beneficial. If you are adept at problem-solving, enjoy collaborative teamwork, and thrive in a fast-paced environment that presents core technical and business challenges, we are eager to meet you.

You should possess intermediate to expert level experience in the following technologies:
- JavaScript, HTML5, and CSS
- React Native

Additionally, the following skills and requirements are essential:
- Meticulous attention to detail
- Experience working in Agile projects, including familiarity with tools like Jira
- Willingness to adapt to varying technologies, platforms, and frameworks for future projects
- Strong work ethic and dedication to meeting deadlines while supporting team members in achieving goals
- Flexibility to collaborate across different time zones with overseas clients if necessary

Desirable skills that would be advantageous include knowledge in the following areas:
- Frontend: Angular, Vuejs
- Backend: NodeJs, Python (Django/Flask/Jupyter Notebook), PHP (Yii2/Wordpress/Magento)
- Databases: MySQL, PostgreSQL, MongoDB, Graph Databases, Oracle
- Cloud Infrastructure: AWS, Azure, Google Cloud / Firebase
- Mobile Technologies: Native Android, Native iOS, Hybrid Mobile App Development (Ionic/Flutter/React Native)

Key Responsibilities:
- Understanding and gathering client requirements, offering technical solutions to address business needs
- Utilizing technology to solve business problems effectively
- Timely delivery of work meeting quality standards
- Fostering the development of reusable code
- Addressing technical queries from clients, management, and team members
- Evaluating existing technical architecture and suggesting enhancements
- Collaborating with development teams and product managers to conceptualize software solutions
- Designing client-side and server-side architecture
- Conducting unit testing and scenario-based testing to develop robust systems
- Troubleshooting, debugging, and enhancing existing applications
- Implementing security and data protection mechanisms to ensure application security
- Creating technical documentation, flow diagrams, use-case diagrams, and charts as required
- Effective communication with team members and clients for streamlined project operation
- Keeping abreast of the latest advancements in web technologies and related programming languages/frameworks

Experience: 5 to 8 years
Job Location: Fort, Mumbai

Why Join Bitkraft:
- Your input and opinions are highly valued
- Exposure to cutting-edge technologies
- Direct collaboration with client teams
- International project exposure
- Opportunity to grasp the holistic view of projects
- Dynamic environment with expedited project completions
- Autonomy in managing your time for efficient work
- Supportive and amiable work environment

About Bitkraft Technologies LLP:
Bitkraft Technologies LLP is an award-winning Software Engineering Consultancy that specializes in Enterprise Software Solutions, Mobile Apps Development, ML/AI Solution Engineering, Extended Reality, Managed Cloud Services, and Technology Skill-sourcing, with an exceptional track record. Driven by technology, we strive to push boundaries to fulfill the business requirements of our clients. Committed to delivering top-notch products, we take pride in developing robust, user-centric solutions that meet business demands.
With clients spanning more than 10 countries, including the US, UK, UAE, Oman, Australia, and India, Bitkraft Technologies LLP has established a strong presence in the global market.

Posted 1 week ago

Apply

12.0 - 21.0 years

8 - 12 Lacs

Chennai

Work from Office

Project Overview
The candidate will be working on the Model Development as a Service (MDaaS) initiative, which focuses on scaling machine learning techniques for exception classification, early warning signals, data quality control, model surveillance, and missing value imputation. The project involves applying advanced ML techniques to large datasets and integrating them into financial analytics systems.

Key Responsibilities
- Set up data pipelines: configure storage in cloud-based compute environments and repositories for large-scale data ingestion and processing.
- Develop and optimize machine learning models:
  - Implement Machine Learning for Exception Classification (MLEC) to classify financial exceptions.
  - Conduct missing value imputation using statistical and ML-based techniques.
  - Develop early warning signals for detecting anomalies in multivariate/univariate time-series financial data.
  - Build model surveillance frameworks to monitor financial models.
  - Apply unsupervised clustering techniques for market segmentation in securities lending.
  - Develop advanced data quality control frameworks using TensorFlow-based validation techniques.
- Experimentation & validation: evaluate ML algorithms using cross-validation and performance metrics. Implement data science best practices and document findings.
- Data quality and governance: develop QC mechanisms to ensure high-quality data processing and model outputs.

Required Skillset
- Strong expertise in Machine Learning & AI (supervised & unsupervised learning).
- Proficiency in Python, TensorFlow, SQL, and Jupyter Notebooks.
- Deep understanding of time-series modeling, anomaly detection, and risk analytics.
- Experience with big data processing and financial data pipelines.
- Ability to deploy scalable ML models in a cloud environment.

Deliverables & Timeline
- Machine Learning for Exception Classification (MLEC): working code & documentation
- Missing value imputation: implementation & validation reports
- Early warning signals: data onboarding & anomaly detection models
- Model surveillance: fully documented monitoring framework
- Securities lending: clustering algorithms for financial markets
- Advanced data QC: development of a general-purpose QC library

Preferred Qualifications
- Prior experience in investment banking, asset management, or trading desks.
- Strong foundation in quantitative finance and financial modeling.
- Hands-on experience with TensorFlow, PyTorch, and AWS/GCP AI services.
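As a hedged illustration of the early-warning-signal work described above (this is not the project's actual model; the data, window, and threshold are all synthetic assumptions), one simple approach flags points whose day-over-day change has an outsized rolling z-score:

```python
import numpy as np
import pandas as pd

def early_warning_flags(series: pd.Series, window: int = 30,
                        threshold: float = 3.0) -> pd.Series:
    """Flag observations whose day-over-day change has a rolling
    z-score above the threshold (NaN z-scores compare as False)."""
    changes = series.diff()
    z = (changes - changes.rolling(window).mean()) / changes.rolling(window).std()
    return z.abs() > threshold

# Synthetic daily price series with one injected shock at index 150
rng = np.random.default_rng(0)
prices = pd.Series(100 + rng.normal(0, 0.5, 200).cumsum())
prices.iloc[150] += 10.0  # anomalous jump

flags = early_warning_flags(prices)
print(flags.iloc[150])
```

In practice the posting's scope (multivariate series, model surveillance) would call for richer statistical or TensorFlow-based detectors, but the rolling-score idea is a common baseline.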

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Chandigarh

On-site

You should have the ability to work with large data sets using Python and possess a strong background in Python programming, MySQL, Linux, and Machine Learning. Basic knowledge of other relevant technologies is essential, such as cloud computing, Jupyter Notebook, machine learning algorithms, TensorFlow, PyTorch, Tesseract OCR, and NLP techniques including models like spaCy, BERT, NLTK, and AutoML. You will be required to work through all phases of the project. This is a full-time job that requires you to work in person. The interview will be conducted in person; no telephonic interviews are scheduled. The job follows a Day shift and Morning shift schedule. A Bachelor's degree is preferred for this role. Additionally, 1 year of experience in C++ is required for this position.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Job Description: As a Big Data Engineer with Capco, a Wipro company, you will play a crucial role in leveraging your skills to drive innovative solutions for our clients in the banking, financial, and energy sectors. Your expertise in messaging technologies such as Apache Kafka, programming languages like Scala and Python, and tools like NiFi and Airflow will be essential in designing and implementing intuitive and responsive user interfaces that enhance data analysis capabilities. You will be responsible for writing efficient queries using Jupyter Notebook, optimizing Spark performance, and ensuring the reliability and scalability of distributed systems. Your strong understanding of cloud architecture, SQL, and software engineering concepts will enable you to deliver high-quality code that meets performance standards. At Capco, we value diversity, inclusivity, and creativity, and believe that different perspectives contribute to our competitive advantage. With no forced hierarchy, you will have the opportunity to advance your career and make a significant impact on our clients' businesses. Join us at Capco and be part of a dynamic team that is driving transformation in the energy and financial services industries.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

27 - 32 Lacs

Pune

Hybrid

Job Title: Big Data Developer
Job Location: Pune
Experience: 7+ Years
Job Type: Hybrid

Strong skills in:
- Messaging technologies like Apache Kafka or equivalent
- Programming skills: Scala, Spark with optimization techniques, Python
- Should be able to write queries through Jupyter Notebook
- Orchestration tools like NiFi, Airflow
- Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
- Experience with SQL & distributed systems
- Strong understanding of cloud architecture
- Ensure a high-quality code base by writing and reviewing performant, well-tested code
- Demonstrated experience building complex products
- Knowledge of Splunk or other alerting and monitoring solutions
- Fluent in the use of Git, Jenkins
- Broad understanding of software engineering concepts and methodologies is required
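The role above asks for query-writing through a Jupyter Notebook against Spark. As a self-contained sketch, sqlite3 stands in for a Spark session here (no cluster is assumed); the same ANSI SQL would be submitted via `spark.sql(...)` in a notebook cell, and the table and column names are illustrative only:

```python
import sqlite3

# Stand-in for a Spark session: build a tiny in-memory "trades" table,
# then run the kind of exploratory aggregation one writes in a notebook.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (issuer TEXT, trade_date TEXT, price REAL);
    INSERT INTO trades VALUES
        ('ISSUER_A', '2024-01-01', 120.5),
        ('ISSUER_A', '2024-01-02', 121.0),
        ('ISSUER_B', '2024-01-01', 98.3);
""")

rows = conn.execute("""
    SELECT issuer, COUNT(*) AS n_trades, AVG(price) AS avg_price
    FROM trades
    GROUP BY issuer
    ORDER BY issuer
""").fetchall()
print(rows)
```

On Spark, the only change would be registering the DataFrame as a temp view (`df.createOrReplaceTempView("trades")`) and calling `spark.sql(query)` instead of `conn.execute`.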

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 10 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 10

The Team: Join the TeraHelix team within S&P Global's Enterprise Data Organisation (EDO). We are a dynamic group of highly skilled engineers dedicated to building innovative data solutions that empower businesses. Our team works collaboratively on foundational data products, leveraging cutting-edge technologies to solve real-world client challenges.

The Impact: As part of the TeraHelix team, you will contribute to the development of our marquee AI-enabled data products, including TeraHelix's GearBox, ETL Mapper and Data Studio solutions. Your work will directly impact our clients by enhancing their data capabilities and driving significant business value.

What's in it for you:
- Opportunity to work on a distributed, cloud-native, fully Java tech stack (Java 21+) with UI components built in the Vaadin framework.
- Engage in skill-building and innovation opportunities in a supportive environment.
- Collaborate with a diverse group of professionals across data, product, and technology disciplines.
- Contribute to projects that have a tangible impact on the organisation and the industry.

Key Responsibilities:
- Design, develop and maintain robust data pipelines to support data ingestion, transformation and storage.
- Write efficient SQL queries for data extraction, manipulation and analysis.
- Utilise Apache Spark & Python for data processing, automation and integration with various data sources.
- Collaborate with data scientists and stakeholders to understand data requirements and deliver actionable insights.
- Implement data quality checks and validation processes to ensure data accuracy and reliability.
- Analyse large datasets to identify trends, patterns and anomalies that inform business decisions.
- Create and maintain documentation for data processes, workflows and architecture.
- Stay updated on industry best practices and emerging technologies in data engineering and analysis.
- Provide support using data visualisation tools to help stakeholders interpret data effectively.

What we're looking for:
- Bachelor's degree or higher in Computer Science or a related field.
- Strong experience in SQL for data manipulation and analysis.
- Proficiency in Spark (Java, SQL or PySpark) and Python for data processing and automation tasks.
- Solid understanding of data engineering principles and best practices.
- Experience with data analytics and the ability to derive insights from complex datasets.
- Familiarity with big data technologies (e.g. Hadoop, Spark) and cloud data platforms (e.g. AWS, Azure, GCP).
- Familiarity with data visualisation tools (e.g. Power BI, Tableau, Qlik) and data science notebooks (e.g. Jupyter, Apache Zeppelin) to present findings effectively.
- Knowledge of financial or capital markets to understand business domain requirements.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills for collaboration with cross-functional teams.

Nice to have:
- Experience with Java for data processing or integration tasks.
- Knowledge of ETL (Extract, Transform, Load) processes and tools.
- Understanding of data warehousing concepts and architecture.
- Experience with version control systems (e.g. Git, GitHub, Bitbucket, Azure DevOps).
- Interest in machine learning and data science concepts.

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 2 weeks ago

Apply

12.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Senior Software Engineer
Experience: 12-20 Years
Location: Bangalore

- Strong knowledge & hands-on experience in AWS Databricks
- Nice to have: worked in the HP ecosystem (FDL architecture)
- Technically strong to help the team on any technical issues they face during execution
- Owns the end-to-end technical deliverables
- Hands-on Databricks + SQL knowledge
- Experience in AWS S3, Redshift, EC2 and Lambda services
- Extensive experience in developing and deploying big data pipelines
- Experience in Azure Data Lake
- Strong hands-on SQL development / Azure SQL and in-depth understanding of optimization and tuning techniques in SQL with Redshift
- Development in notebooks (like Jupyter, Databricks, Zeppelin, etc.)
- Development experience in Spark
- Experience in scripting languages like Python and any other programming language

Roles and Responsibilities
- Candidate must have hands-on experience in AWS Databricks
- Good development experience using Python/Scala, Spark SQL and Data Frames
- Hands-on experience with Databricks, Data Lake and SQL knowledge is a must
- Performance tuning, troubleshooting, and debugging Spark

Process Skills: Agile Scrum
Qualification: Bachelor of Engineering (Computer background preferred)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Bengaluru

Work from Office

Position: Sr. Data Analyst
Experience: 5 - 9 years
Location: Pune / Bangalore

Key Skills: Strong SQL, Python, PySpark, Jupyter Notebook, Agile / Scrum / Jira / Confluence, Microsoft Excel; exposure to any cloud (GCP / AWS / Azure) would be a plus

Job Description:

Must-Have:
- 5 - 9 years of professional experience as a Data Analyst with good decision-making, analytical and problem-solving skills.
- Working knowledge / experience of Big Data frameworks like Hadoop, Hive and Spark.
- Hands-on experience in query languages like HQL or SQL (Spark SQL) for data exploration.
- Data mapping: determine the data mapping required to join multiple data sets together across multiple sources.
- Documentation: data mapping, subsystem design, technical design, business requirements.
- Exposure to logical-to-physical mapping, data processing flow to measure consistency, etc.
- Data asset design / build: working with the data model / asset generation team to identify critical data elements and determine the mapping for reusable data assets.
- Understanding of ER diagrams and data modeling concepts.
- Exposure to data quality validation.
- Exposure to data management, data cleaning and data preparation.
- Exposure to data schema analysis.
- Exposure to working in an Agile framework.
- SQL, PySpark, Python with Banking domain knowledge / Credit & Lending domain knowledge.
- Knowledge of credit risk frameworks such as Basel II, III, IFRS 9 and stress testing, and understanding of their drivers - advantageous.
- Retail Credit / Traded Credit knowledge - applications will be considered.

Good To Have:
- BFSI domain knowledge
- Data visualization - Tableau or Qlik Sense
- Exposure to Hadoop, Hive and ETL
- Working knowledge of any cloud service like AWS, GCP or Azure
- Any relevant certifications would be a plus

Role & Responsibilities:
- Take complete responsibility for the sprint stories' execution.
- Understand the business requirements from the product/project stakeholders, break the requirements into simpler stories and tasks, and do the necessary mapping of the tasks to the logical model of the solutions.
- Map business entities to technical attributes with the logic for transformation defined clearly.
- Be accountable for the delivery of the tasks in the defined timelines with good quality.
- Follow the processes for project execution and delivery.
- Follow agile methodology.
- Work closely with the team leads and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team.
- Take part in brainstorming sessions and suggest improvements in the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Keep all the stakeholders updated about the project, task status, risks, and issues, if any.
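The data-mapping responsibility above (joining data sets across sources and validating the result) can be illustrated with a small pandas sketch; the table names, keys, and values here are hypothetical:

```python
import pandas as pd

# Two sources with differently named customer keys
loans = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "outstanding_balance": [25000.0, 4000.0, 0.0],
})
risk = pd.DataFrame({
    "cust_id": [101, 102, 104],
    "pd_score": [0.02, 0.15, 0.07],
})

# Logical-to-physical mapping: customer_id (loans) <-> cust_id (risk)
mapped = loans.merge(risk, left_on="customer_id", right_on="cust_id",
                     how="left", indicator=True)

# Data quality validation: which loan records found no risk score?
unmatched = mapped.loc[mapped["_merge"] == "left_only", "customer_id"].tolist()
print(unmatched)
```

The `indicator=True` column is a cheap consistency check when measuring mapping coverage between sources.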

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Bitkraft Technologies LLP is seeking a skilled React Native Developer to join their software engineering team. In this role, you will primarily focus on utilizing the ReactJS framework to develop web projects for the custom services business of the company. As a React Native Developer at Bitkraft, you are expected to have a strong command over various JavaScript frontend frameworks and third-party libraries. A keen eye for quality visual design and a genuine passion for ensuring an exceptional user experience will be highly advantageous. If you are someone who enjoys tackling challenges, thrives in a collaborative team environment, and is keen on working in a dynamic setting that presents both technical and business-related hurdles, we would love to connect with you.

**Essential Skills:**
- Proficiency in the following technologies at an intermediate to expert level:
  - JavaScript, HTML5, and CSS
  - React Native

**Other Essential Skills/Requirements:**
- Meticulous attention to detail
- Experience working in Agile projects, with familiarity in tools like Jira
- Willingness to adapt to new technologies, platforms, and frameworks for upcoming projects
- Strong work ethic and commitment to meeting deadlines and supporting team members
- Flexibility to collaborate across different time zones with international clients when necessary

**Desirable Skills:**
- Frontend knowledge in at least one of the following: Angular, Vuejs
- Backend knowledge in at least one of the following: NodeJs, Python (Django/Flask/Jupyter Notebook), PHP (Yii2/Wordpress/Magento)
- Databases knowledge in at least one of the following: MySQL, PostgreSQL, MongoDB, Graph Databases, Oracle
- Cloud infrastructure knowledge in at least one of the following: AWS, Azure, Google Cloud / Firebase
- Mobile technologies knowledge in at least one of the following: Native Android, Native iOS, Hybrid Mobile App Development (Ionic/Flutter/React Native)

**Key Responsibilities:**
- Understanding and interpreting client requirements to provide technical solutions
- Solving business problems through innovative technological approaches
- Timely delivery of work with adherence to quality standards
- Fostering the development of reusable code
- Addressing technical queries from clients, management, and team members
- Assessing existing technical architecture and suggesting enhancements
- Collaborating with development teams and product managers to conceptualize software solutions
- Designing client-side and server-side architecture
- Conducting unit testing and scenario-based testing for system robustness
- Troubleshooting, debugging, and updating existing applications
- Implementing security and data protection measures for application security
- Generating technical documentation, flow diagrams, use-case diagrams, and charts as needed
- Effective communication with team members and clients for streamlined project operations
- Staying updated on the latest advancements in web technologies and related programming languages/frameworks

**Experience:** 5 to 8 years
**Job Location:** Fort, Mumbai

**Why Join Bitkraft:**
- Your inputs and opinions are highly valued
- Exposure to cutting-edge technologies
- Direct collaboration with client teams
- International project exposure
- Insight into the overall project scope
- Fast-paced environment with quick project turnovers
- Autonomy to manage your time efficiently
- Enjoyable and supportive work environment

**About Bitkraft Technologies LLP:**
Bitkraft Technologies LLP is an award-winning Software Engineering Consultancy that specializes in Enterprise Software Solutions, Mobile Apps Development, ML/AI Solution Engineering, Extended Reality, Managed Cloud Services, and Technology Skill-sourcing, with a remarkable track record. Driven by technology, the team at Bitkraft pushes boundaries to meet the business requirements of their clients.
They are committed to delivering products of the highest quality standards and take pride in developing robust, user-centric solutions that cater to business needs. Bitkraft serves clients across more than 10 countries, including the US, UK, UAE, Oman, Australia, and India.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 Lacs

Hyderabad

Work from Office

Dear Candidate,

We are seeking a highly skilled and motivated Software Engineer with expertise in Azure AI, Cognitive Services, Machine Learning, and IoT. The ideal candidate will design, develop, and deploy intelligent applications leveraging Azure cloud technologies, AI-driven solutions, and IoT infrastructure to drive business innovation and efficiency.

Responsibilities:
- Develop and implement AI-driven applications using Azure AI and Cognitive Services.
- Design and deploy machine learning models to enhance automation and decision-making processes.
- Integrate IoT solutions with cloud platforms to enable real-time data processing and analytics.
- Collaborate with cross-functional teams to architect scalable, secure, and high-performance solutions.
- Optimize and fine-tune AI models for accuracy, performance, and cost-effectiveness.
- Ensure best practices in cloud security, data governance, and compliance.
- Monitor, maintain, and troubleshoot AI and IoT solutions in production environments.
- Stay updated with the latest advancements in AI, ML, and IoT technologies to drive innovation.

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong experience with Azure AI, Cognitive Services, and Machine Learning.
- Proficiency in IoT architecture, data ingestion, and processing using Azure IoT Hub, Edge, or related services.
- Expertise in deploying and managing machine learning models in cloud environments.
- Strong understanding of RESTful APIs, microservices, and cloud-native application development.
- Experience with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes).
- Knowledge of cloud security principles and best practices.
- Excellent problem-solving skills and the ability to work in an agile development environment.

Preferred Qualifications:
- Certifications in Microsoft Azure AI, IoT, or related cloud technologies.
- Experience with Natural Language Processing (NLP) and Computer Vision.
- Familiarity with big data processing and analytics tools such as Azure Data.
- Prior experience in deploying edge computing solutions.

Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Bitkraft Technologies LLP is seeking Technical Leads to join the software engineering team, working on cutting-edge web development projects for the custom services business. As a Technical Lead, you will be proficient in full stack technologies, frameworks, third-party libraries, cloud systems, and management tools. Managing a small to medium-sized development team for executing, reviewing, and deploying web and mobile projects is a critical aspect of the role. Passion for technology and ensuring timely project delivery with the highest quality is essential. If you enjoy problem-solving, are a team player, and thrive in a fast-paced environment with technical and business challenges, we are interested in meeting you.

**Essential Skills:**

**Frontend:**
- Intermediate to expert level experience with JavaScript, HTML5, CSS
- Angular, Vuejs, React

**Backend:**
- Intermediate to expert level experience with NodeJs, Python (Django/Flask/Jupyter Notebook)
- Intermediate to expert level experience with PHP
- Databases: MySQL, PostgreSQL, MongoDB, Graph Databases, Oracle

**Cloud Infrastructure:**
- Intermediate to expert level experience with AWS, Azure, Google Cloud / Firebase

**Other Essential Skills/Requirements:**
- Great attention to detail
- Team management and progress tracking
- Code reviews
- Collaboration with the project management team
- Agile project experience, familiarity with tools like Jira
- Adaptability to work on various technologies, platforms, and frameworks for future projects
- Strong work ethic, meeting deadlines, and supporting team members
- Flexibility to work across time zones with overseas customers

**Desirable Skills:**

**Mobile Technologies:**
- Knowledge of Native Android, Native iOS, Hybrid Mobile App Development (Ionic/Flutter/React Native)

**Key Responsibilities:**
- Understanding client requirements and proposing technical solutions
- Problem-solving through technology
- Timely and quality delivery of work
- Code reusability
- Addressing technical queries
- Evaluating and enhancing technical architecture
- Collaborating with development teams and product managers
- Designing client-side and server-side architecture
- Building responsive UX/UI
- Developing and managing databases and applications
- Writing REST APIs
- Conducting testing for robust systems
- Troubleshooting, debugging, and updating applications
- Implementing security and data protection measures
- Creating technical documentation and diagrams
- Effective communication with team members and clients
- Staying updated on the latest web technologies and programming languages/frameworks

**Experience:** 5 to 8 years
**Job Location:** Fort, Mumbai

**Why Join Bitkraft:**
- Valued inputs and opinions
- Exposure to the latest technologies
- Direct collaboration with client teams
- International project exposure
- Comprehensive project understanding
- Quick project completions in a fast-paced environment
- Efficient time management
- Friendly and supportive work culture

Posted 3 weeks ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Date: 25 Jun 2025
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Your future role
Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, but also managing and optimizing object storage systems.

We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow.
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill.
Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.

Things you'll enjoy: Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines.
- Work with advanced data and cloud technologies to drive innovation.
- Collaborate with cross-functional teams and helpful colleagues.
- Contribute to innovative projects that have a global impact.
- Utilise our flexible and hybrid working environment.
- Steer your career in whatever direction you choose across functions and countries.
- Benefit from our investment in your development, through award-winning learning programs.
- Progress towards leadership roles or specialized technical paths.
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension).

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!
Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
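The data quality checks this role calls for often start as per-column validation rules applied before a pipeline stage publishes a dataset. A minimal sketch in pure Python; the rule names, column names, and thresholds are illustrative only, not from the posting:

```python
# Minimal data quality gate: validate rows against per-column rules
# before a pipeline stage publishes them downstream.

def check_rows(rows, rules):
    """Return a list of (row_index, column, message) violations.

    rules maps column name -> callable(value) -> error message or None.
    """
    violations = []
    for i, row in enumerate(rows):
        for col, rule in rules.items():
            msg = rule(row.get(col))
            if msg:
                violations.append((i, col, msg))
    return violations

# Hypothetical rules for a sensor-readings dataset.
rules = {
    "id": lambda v: "missing id" if v is None else None,
    "temp_c": lambda v: "out of range" if v is None or not (-50 <= v <= 150) else None,
}

rows = [
    {"id": 1, "temp_c": 21.5},
    {"id": None, "temp_c": 900.0},
]
problems = check_rows(rows, rules)
```

In a real pipeline the violation list would feed the monitoring and alerting mechanisms the posting mentions, e.g. by failing the task or emitting a metric when it is non-empty.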

Posted 3 weeks ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Pune

Hybrid

So, what’s the role all about? We are seeking a skilled Senior Data Engineer to join our Actimize Watch Data Analytics team. In this role, you will collaborate closely with the Data Science team, Business Analysts, and SMEs to monitor and optimize the performance of machine learning models. You will be responsible for running various analytics on data stored in S3 using advanced Python techniques, generating performance reports and visualizations in Excel, and showcasing model performance and stability metrics through BI tools such as Power BI and QuickSight.

How will you make an impact?

Data Integration and Management:
- Design, develop, and maintain robust Python scripts to support analytics and machine learning model monitoring.
- Ensure data integrity and quality across various data sources, primarily focusing on data stored in AWS S3.
- Verify the integrity and correctness of data for new customers being onboarded to Actimize Watch.

Analytics and Reporting:
- Work closely with Data Scientists, BAs, and SMEs to understand model requirements and monitoring needs.
- Perform complex data analysis and visualization using Jupyter Notebooks, leveraging advanced Python libraries and techniques.
- Generate comprehensive model performance and stability reports and showcase them in BI tools.
- Standardize diverse analytics processes through automation and innovative approaches.

Model Performance Monitoring:
- Implement monitoring solutions to track the performance and drift of machine learning models in production for various clients.
- Analyze model performance over time and identify potential issues or areas for improvement.
- Develop automated alerts and dashboards to provide real-time insights into model health.

Business Intelligence and Visualization:
- Create and maintain dashboards in BI tools like Tableau, Power BI, and QuickSight to visualize model performance metrics.
- Collaborate with stakeholders to ensure the dashboards meet business needs and provide actionable insights.
- Continuously improve visualization techniques to enhance the clarity and usability of the reports.

Collaboration and Communication:
- Work closely with cross-functional teams, including Data Scientists, Product Managers, Business Analysts, and SMEs to understand requirements and deliver solutions.
- Communicate findings and insights effectively to both technical and non-technical stakeholders.
- Provide support and training to team members on data engineering and analytics best practices and tools.

Have you got what it takes?
- 5 to 7 years of experience in data engineering, with a focus on analytics, data science, and machine learning model monitoring.
- Proficiency in Python and experience with Jupyter Notebooks for data analysis.
- Strong experience with AWS services, particularly S3 and related data processing tools.
- Expertise in Excel for reporting and data manipulation.
- Hands-on experience with BI tools such as Tableau, Power BI, and QuickSight.
- Solid understanding of machine learning concepts and model performance metrics.
- Strong Python and SQL skills for querying and manipulating large datasets.
- Excellent problem-solving and analytical skills.
- Ability to work in a fast-paced, collaborative environment.
- Strong communication skills with the ability to explain technical concepts to non-technical stakeholders.

Preferred Qualifications:
- Experience with other AWS services such as Glue, as well as BI tools like QuickSight and Power BI.
- Familiarity with CI/CD pipelines and automation tools.
- Knowledge of data governance and security best practices.

What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7900
Reporting into: Tech Manager
Role Type: Individual Contributor
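The model drift monitoring described in this role is often implemented with a stability metric such as the Population Stability Index (PSI), comparing a production score distribution against a baseline. A small illustrative sketch in pure Python; the bucket edges and the 0.2 alert threshold are common conventions, not requirements from the posting:

```python
import math

def psi(baseline, current, edges):
    """Population Stability Index between two samples, given bucket edges."""
    def frac(sample):
        counts = [0] * (len(edges) - 1)
        for x in sample:
            for i in range(len(edges) - 1):
                if edges[i] <= x < edges[i + 1]:
                    counts[i] += 1
                    break
        total = max(len(sample), 1)
        # Floor each bucket fraction to avoid log(0) on empty buckets.
        return [max(c / total, 1e-6) for c in counts]

    b, c = frac(baseline), frac(current)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))

edges = [0.0, 0.25, 0.5, 0.75, 1.0001]  # model-score buckets
baseline = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
stable = [0.15, 0.22, 0.35, 0.55, 0.72, 0.88]
value = psi(baseline, stable, edges)  # near 0: same bucket distribution
```

A common rule of thumb treats PSI above roughly 0.2 as significant drift worth an automated alert; the exact threshold would come from the team's own monitoring policy.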

Posted 3 weeks ago

Apply

2.0 - 4.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Your Impact: As a Python Developer in the Debricked data science team, you will work on enhancing data intake processes and optimizing data pipelines. You will apply many different approaches, depending on the needs of the product and the challenges you encounter. In some cases, we use AI/LLM techniques, and we expect the number of such cases to increase. Your contributions will directly impact Debricked's scope and quality and will help ensure future commercial growth of the product.

What the role offers:
- Innovative Data Solutions: Develop and optimize data pipelines that improve the efficiency, accuracy, and automation of the Debricked SCA tool's data intake processes.
- Collaborative Environment: Work closely with engineers and product managers from Sweden and India to create impactful, data-driven solutions.
- Continuous Improvement: Play an essential role in maintaining and improving the data quality that powers Debricked's analysis, improving the product's competitiveness.
- Skill Development: Collaborate across teams and leverage OpenText's resources (including an educational budget) to develop your expertise in software engineering, data science, and AI, expanding your skill set in both traditional and cutting-edge technologies.

What you need to succeed:
- 2-4 years of experience in Python development, with a focus on optimizing data processes and improving data quality.
- Proficiency in Python and related tools and libraries like Jupyter, Pandas, and NumPy.
- A degree in Computer Science or a related discipline.
- An interest in application security.
- Skills in Go, Java, LLMs (specifically Gemini), GCP, Kubernetes, MySQL, Elastic, and Neo4j are an asset.
- A strong understanding of how to manage and improve data quality in automated systems and pipelines.
- Ability to address complex data challenges and develop solutions to optimize systems.
- Comfortable working in a distributed team, collaborating across different time zones.
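A typical data intake improvement of the kind this role targets is normalizing and deduplicating records before they enter a pipeline. A toy sketch in pure Python; the record fields and normalization rules are invented for illustration (Debricked's actual intake format is not public in this posting):

```python
# Toy intake-cleaning step: normalize and deduplicate dependency records
# before they enter a data pipeline. Field names are invented.

def clean_intake(records):
    seen = set()
    cleaned = []
    for rec in records:
        name = rec["name"].strip().lower()
        version = rec["version"].strip().lstrip("v")  # "v1.2.3" -> "1.2.3"
        key = (name, version)
        if key in seen:
            continue  # drop records that duplicate an earlier one
        seen.add(key)
        cleaned.append({"name": name, "version": version})
    return cleaned

raw = [
    {"name": "Requests ", "version": "v2.31.0"},
    {"name": "requests", "version": "2.31.0"},  # duplicate after normalization
    {"name": "flask", "version": "3.0.0"},
]
deps = clean_intake(raw)
```

The point of the sketch is that normalization must happen before the duplicate check; otherwise superficially different spellings of the same record slip through.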

Posted 3 weeks ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Mumbai

Work from Office

The IT APS Customers, CS&E Robotics, Data and AI Platform Support squad is looking for an Ops Engineer with a proven track record of deploying Python applications and/or data science models. You should be specialized in, or have a strong interest in, Kubernetes, Ansible, Terraform, Linux system administration, and Python deployment. As an Ops Engineer within the Tribe APS, you will have the opportunity to work at the centre of the bank's initiatives in generative AI, smart automation, and traditional machine learning. You will maintain the workbench environment of the data scientists as well as monitor and deploy their models and applications.

Responsibilities: As part of your responsibilities, you will:
- Evaluate sizing and infrastructure requirements for new use cases.
- Set up self-service deployment pipelines for AI applications.
- Ensure reproducibility of deployments in both Non-Production and Production environments.
- Make sure that all applications are properly monitored and alerting is in place.
- Evolve in an environment where innovation and lean processes are praised, straightforward communication is encouraged, and peers understand the meaning of teaming up.
- Work with a team of colleagues who are ready to collaborate and share their experience.

Technical & Behavioural Competencies

Mandatory:
- Knowledge of the Python ecosystem.
- Experience with HTTP REST APIs, with a focus on Django.
- Experience with Git (version control system), e.g. GitLab, GitLab CI.
- Experience in DevOps/Ops.
- Linux operating system experience.
- Experience in containerization (Docker, Podman).
- LLM operations.
- Cloud experience (e.g. IBM Cloud / Azure).

Preferable:
- Kubernetes/Helm
- Familiarity with code quality gating (Sonar, Nexus, Fortify)
- Ansible
- Domino Data Lab, Jupyter
- Artifactory
- Kafka
- SQL, Postgres
- Terraform
- Dynatrace

Business Experience: Knowledge of the financial services industry.

Specific Qualifications (if required): Agile environment.
Follows the Customer processes for projects, incident, and change management. Able to work both independently and in a team; analytically minded; meets commitments; customer-oriented; risk-aware; able to work in a dynamic and multicultural environment.

- Motivated, pro-active self-starter, process-oriented with high attention to detail.
- Good communication, analytical, and synthesis skills.
- Autonomy, commitment, and perseverance.
- Flexibility (in peak periods extra efforts may be required), including readiness to provide support outside of business hours (on-call).
- Open-minded, showing flexibility in self-learning new technologies/tools.
- Customer-minded: able to translate technical issues into non-technical explanations, and always conscious of continuity of services.
- Very good team spirit: shares knowledge and experience with other members of the team and works in collaboration with them.
- Client-oriented, analytical, takes initiative, and able to work independently.
- Able to take additional responsibility.
- Able to work from base location Mumbai (or whichever is your base location) during the hybrid model.

Skills Referential

Behavioural Skills (up to 4):
- Ability to collaborate / Teamwork
- Ability to deliver / Results driven
- Ability to share / pass on knowledge
- Client focused

Transversal Skills:
- Ability to understand, explain and support change
- Ability to develop and adapt a process
- Ability to develop others & improve their skills
- Analytical Ability
- Ability to inspire others & generate people's commitment

Education Level: Bachelor Degree or equivalent

Posted 3 weeks ago

Apply

0.0 - 6.0 years

4 - 6 Lacs

Nagpur

Work from Office

Responsibilities: Implement computer vision applications using OpenCV, including object detection and segmentation techniques; train and fine-tune models. Develop machine learning models with Python, Docker, AWS SageMaker, and IAM. Work from home.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Principal Developer - ML/Prompt Engineer

Technologies: Amazon Bedrock, RAG models, Java, Python, C or C++, AWS Lambda

Responsibilities:
- Develop, deploy, and maintain a Retrieval Augmented Generation (RAG) model in Amazon Bedrock, our cloud-based platform for building and scaling generative AI applications.
- Design and implement a RAG model that can generate natural language responses, commands, and actions based on user queries and context, using the Anthropic Claude model as the backbone.
- Integrate the RAG model with Amazon Bedrock, our platform that offers a choice of high-performing foundation models from leading AI companies and Amazon via a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
- Optimize the RAG model for performance, scalability, and reliability, using best practices and robust engineering methodologies.
- Design, test, and optimize prompts to improve performance, accuracy, and alignment of large language models across diverse use cases.
- Develop and maintain reusable prompt templates, chains, and libraries to support scalable and consistent GenAI applications.

Skills/Qualifications:
- Experience in programming with at least one software language, such as Java, Python, or C/C++.
- Experience working with generative AI tools, models, and frameworks, such as Anthropic, OpenAI, Hugging Face, TensorFlow, PyTorch, or Jupyter.
- Experience working with RAG models or similar architectures and supporting tools, such as Ragna or Pinecone.
- Experience working with Amazon Bedrock or similar platforms, such as AWS Lambda, Amazon SageMaker, or Amazon Comprehend.
- Ability to design, iterate, and optimize prompts for various LLM use cases (e.g., summarization, classification, translation, Q&A, and agent workflows).
- Deep understanding of prompt engineering techniques (zero-shot, few-shot, chain-of-thought, etc.) and their effect on model behavior.
- Familiarity with prompt evaluation strategies, including manual review, automatic metrics, and A/B testing frameworks.
- Experience building prompt libraries, reusable templates, and structured prompt workflows for scalable GenAI applications.
- Ability to debug and refine prompts to improve accuracy, safety, and alignment with business objectives.
- Awareness of prompt injection risks and experience implementing mitigation strategies.
- Familiarity with prompt tuning, parameter-efficient fine-tuning (PEFT), and prompt chaining methods.
- Familiarity with continuous deployment and DevOps tools preferred.
- Experience with Git preferred.
- Experience working in agile/scrum environments.
- Successful track record of interfacing and communicating effectively across cross-functional teams.
- Good communication, analytical, and presentation skills; problem-solving skills and a learning attitude.
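A reusable prompt template of the kind this role maintains can be as simple as a function that assembles few-shot examples and retrieved context into one prompt string for the model. A minimal sketch in pure Python; the template wording, example pairs, and field names are illustrative only:

```python
# Minimal reusable few-shot prompt template for a RAG-style workflow:
# retrieved context chunks + worked examples + the user's question.

FEW_SHOT = [
    {"q": "What is the capital of France?", "a": "Paris"},
    {"q": "What is 2 + 2?", "a": "4"},
]

def build_prompt(question, context_chunks, examples=FEW_SHOT):
    parts = ["Answer using only the context below.", "", "Context:"]
    parts += [f"- {c}" for c in context_chunks]
    parts.append("")
    for ex in examples:
        parts += [f"Q: {ex['q']}", f"A: {ex['a']}", ""]
    parts.append(f"Q: {question}")
    parts.append("A:")
    return "\n".join(parts)

prompt = build_prompt(
    "Which database does the service use?",
    ["The service stores sessions in PostgreSQL."],
)
```

In production the returned string would be sent to the chosen foundation model (e.g. via the Bedrock API), and keeping the template in one tested function is what makes it a reusable library component rather than ad-hoc string building.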

Posted 3 weeks ago

Apply

6.0 - 10.0 years

15 Lacs

Hyderabad

Work from Office

Notice Period: Immediate to 30 days. Develop clean and efficient code using Python. Work with tools like Git, Docker, Jupyter, and CI/CD pipelines. Design and integrate RESTful APIs and third-party services. Perform debugging, performance tuning, and code optimization. Write unit tests and ensure high code quality through version control systems.
Contact Person: Christopher
Email: christopher@gojobs.biz

Posted 3 weeks ago

Apply

1.0 years

4 - 5 Lacs

Bangalore, Karnataka, IN

On-site

About the job:

Key responsibilities:
1. Extract, clean, and organize data from internal databases, APIs, and external sources using Python and SQL
2. Build and maintain automated web scraping routines for competitor tracking, lead generation, and market insights
3. Design dashboards and reports that help internal teams (Sales, Ops, Product) make informed decisions
4. Conduct exploratory data analysis (EDA) and generate actionable insights from structured and unstructured datasets
5. Collaborate closely with product and engineering teams to define data requirements and validation checks
6. Ensure data integrity, security, and accuracy across all datasets and pipelines
7. Identify trends, patterns, and anomalies to support internal reporting and the product roadmap

Requirements:
1. Strong command of SQL (PostgreSQL, MySQL, or BigQuery), including joins, aggregations, and nested queries
2. Proficiency in Python for data manipulation and automation using Pandas, NumPy, and regex
3. Experience with web scraping tools like BeautifulSoup, Selenium, or Scrapy
4. Hands-on experience with data extraction from REST APIs and web sources
5. Ability to handle large datasets and perform exploratory and statistical analysis
6. Familiarity with version control tools like Git or GitHub
7. Clear written and verbal communication skills
8. Experience with BI tools like Power BI, Google Data Studio, or Tableau
9. Exposure to Google Cloud Platform (GCP) tools such as BigQuery and Cloud Functions
10. Understanding of data privacy and compliance regulations, especially in the B2B SaaS and EdTech sectors
11. Prior experience in CRM, education, or SaaS analytics

Additional Information:
1. Role involves close collaboration across teams, including Sales, Ops, Product, and Engineering
2. Emphasis on data security, automation, and business impact through insights and reporting

Who can apply: Only candidates who have a minimum of 1 year of experience and are from Bangalore.

Salary: ₹ 4,00,000 - 5,00,000 /year
Experience: 1 year(s)
Deadline: 2025-08-06 23:59:59
Other perks: Free snacks & beverages, Health Insurance, Life Insurance
Skills required: Python, Presentation skills, SQL, Data Extraction, Google Colab and Jupyter Notebook

About Company: Our company offers a range of services aimed at empowering businesses and individuals. We provide upskilling courses designed to help students enhance their technical and professional skills, preparing them for success in the modern workforce. Additionally, we specialize in website building services, creating customized, responsive, and user-friendly websites to help businesses establish a strong online presence. We also offer CRM software solutions that help organizations manage customer relationships efficiently, streamline operations, and improve overall customer satisfaction. With a focus on quality and innovation, we are committed to delivering effective, scalable, and impactful solutions to meet the unique needs of each client.
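The SQL joins and aggregations listed in the requirements can be practised entirely with Python's standard-library sqlite3 module. A small self-contained sketch; the CRM-style leads/deals schema is invented for illustration:

```python
import sqlite3

# Toy CRM-style schema: leads joined to their deals, aggregated per source.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE leads (id INTEGER PRIMARY KEY, source TEXT);
    CREATE TABLE deals (lead_id INTEGER, amount REAL);
    INSERT INTO leads VALUES (1, 'web'), (2, 'web'), (3, 'referral');
    INSERT INTO deals VALUES (1, 100.0), (1, 50.0), (3, 200.0);
""")

# LEFT JOIN keeps leads with no deals; COUNT(d.amount) ignores the
# resulting NULLs, so deal-less leads do not inflate the count.
rows = con.execute("""
    SELECT l.source, COUNT(d.amount) AS n_deals,
           COALESCE(SUM(d.amount), 0) AS total
    FROM leads l
    LEFT JOIN deals d ON d.lead_id = l.id
    GROUP BY l.source
    ORDER BY l.source
""").fetchall()
```

The same join-then-aggregate shape carries over directly to PostgreSQL, MySQL, or BigQuery.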

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 3 Lacs

Chennai

Work from Office

WhatsApp at 8076971094. Should have 0-3 years of experience with strong technical and analytical skills. Strong proficiency in Python with a solid understanding of OOP principles. Experience with one or more blockchain frameworks (Solidity, Web3.py, Hardhat, Truffle). Research knowledge of IoT protocols and edge platforms. Understanding of network security, encryption standards, etc. Proficiency in machine learning frameworks like scikit-learn, TensorFlow, or PyTorch. Familiarity with deep learning architectures: CNNs, RNNs, Transformers. Experience working with or fine-tuning LLMs (e.g., GPT, LLaMA, Falcon).

Posted 4 weeks ago

Apply

1.0 - 4.0 years

9 - 13 Lacs

Coimbatore

Work from Office

Overview

Operations Management:
- Understand the SOP for data sourcing from internal stakeholders and set up a process to track quality.
- Conduct an independent process study and define the workflow and quality checks to ensure the accuracy of the data; partner with technology to improve the quality assurance process.
- Define and implement the appropriate metrics for reporting to stakeholders on a regular basis (weekly/monthly).
- Monitor the team of data operators who collect the data from various public sources such as company websites, annual reports, and financial reports, as per the SOP.
- Take responsibility for the quality of data collected by data operators from company annual statements, financial reports, and stock exchanges. Perform RCA of errors that occur and put the necessary controls in place to avoid repetition.
- Independently manage the process, including daily deliverables, stakeholder management, a proactive approach, and process improvement initiatives.

People Management:
- Focus on training a pool of juniors and setting up team workflows, processes to detect data quality issues, metrics, and KPIs.
- Keep in consistent touch with team members' progress and help the team acquire the right level of training and skills.
- Develop a training plan and module that helps train new joiners and backfills.

Delivery:
- Deliver quality-assured data (financial/non-financial) as per agreed daily SLAs.
- Work with internal stakeholders and downstream teams on meeting Service Level Agreements (SLAs) on data quality assurance (accuracy, completeness, and timeliness).
- Create and implement the appropriate metrics on deliverables to the stakeholders and always ensure the agreed SLA is met.

Process Improvement:
- Lead team quality improvements by analyzing the existing process and focusing on areas for improvement and automation, applying Lean Six Sigma, Balanced Scorecard, and other TQM approaches.
- Focus on process improvements, including elimination of redundancies, brainstorming process improvement ideas, and partnering with the engineering teams to improve the current state of work.

Responsibilities:
- Hands-on experience in creating new process metrics, KPIs, and scorecards.
- Able to independently manage the deliverables and stakeholders' expectations.
- An analytical mindset: able to analyze data with strong attention to detail to identify problems, trends/patterns, anomalies, and issues within the data, and solve them.
- Familiarity with process and quality improvements.
- Work schedule is in the India time zone and could extend into EMEA hours for meetings.
- Adaptability and flexibility to work in a fast-paced and changing environment.
- Ability to work within a team-oriented environment across hierarchies, functions, and geographies.
- Ability to communicate with various internal and external parties globally.
- Interest in automation, with supporting skills like advanced Excel knowledge, building dashboards using Power BI, and automation through Jupyter/Python notebooks.
- High interest in learning new things and willingness to take on and deliver challenging work.

Good to have:
- Sound knowledge of capital markets and how they function (exposure to various financial products like Equities, Fixed Income, Derivatives, Commodities, Corporate Actions, etc.).
- Knowledge of financial statements such as annual reports, the income statement, the balance sheet, and the cash flow statement.

Qualifications:
- 5-7 years of full-time professional experience in a global financial institution or shared-services firm.
- Research background a plus (Economic, Environmental, Industry, or Sector).
- Experience in annual report analysis and reporting a big plus.
- Master's degree in Business, Environmental Science, Public Health, or Climate Management a big plus.
Sound technical knowledge: able to handle data analysis and to learn financial products and financial statements. Experience in data quality and automation related roles and in business analysis: analyzing existing processes and reengineering them to achieve efficiency and improved quality. People management experience in an operations, back-office, or KPO set-up is good to have.

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An actively nurtured environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community.
With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 4 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Flask Developer to create lightweight and high-performance web services using Python.

Key Responsibilities:
- Develop web APIs using Flask and deploy them on cloud or containers
- Use SQLAlchemy or MongoEngine for data access
- Write modular blueprints and configure middleware
- Perform request validation and error handling
- Work on REST/GraphQL integration with frontend teams

Required Skills & Qualifications:
- Expertise in Flask, Python, and Jinja2
- Familiarity with Gunicorn, Docker, and PostgreSQL
- Understanding of JWT, OAuth, and API security
- Bonus: Experience with FastAPI or Flask-SocketIO

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 4 weeks ago

Apply

6.0 - 11.0 years

30 - 45 Lacs

Kochi

Remote

Job Summary: We are looking for a highly skilled Senior Data Scientist to join our India-based team in a remote capacity. This role focuses on building and deploying advanced predictive models to influence key business decisions. The ideal candidate should have strong experience in machine learning, data engineering, and working in cloud environments, particularly with AWS. You'll be collaborating closely with cross-functional teams to design, develop, and deploy cutting-edge ML models using tools like SageMaker, Bedrock, PyTorch, TensorFlow, Jupyter Notebooks, and AWS Glue. This is a fantastic opportunity to work on impactful AI/ML solutions within a dynamic and innovative team.

Key Responsibilities:

Predictive Modeling & Machine Learning
- Develop and deploy machine learning models for forecasting, optimization, and predictive analytics.
- Use tools such as AWS SageMaker, Bedrock, LLMs, TensorFlow, and PyTorch for model training and deployment.
- Perform model validation, tuning, and performance monitoring.
- Deliver actionable insights from complex datasets to support strategic decision-making.

Data Engineering & Cloud Computing
- Design scalable and secure ETL pipelines using AWS Glue.
- Manage and optimize data infrastructure in the AWS environment.
- Ensure high data integrity and availability across the pipeline.
- Integrate AWS services to support the end-to-end machine learning lifecycle.

Python Programming
- Write efficient, reusable Python code for data processing and model development.
- Work with libraries like pandas, scikit-learn, TensorFlow, and PyTorch.
- Maintain documentation and ensure best coding practices.

Collaboration & Communication
- Work with engineering, analytics, and business teams to understand and solve business challenges.
- Present complex models and insights to both technical and non-technical stakeholders.
- Participate in sprint planning, stand-ups, and reviews in an Agile setup.

Preferred Experience (Nice to Have):
- Experience with applications in the utility industry (e.g., demand forecasting, asset optimization).
- Exposure to Generative AI technologies.
- Familiarity with geospatial data and GIS tools for predictive analytics.

Qualifications:
- Master's or Ph.D. in Computer Science, Statistics, Mathematics, or a related field.
- 5+ years of relevant experience in data science, predictive modeling, and machine learning.
- Experience working in cloud-based data science environments (AWS preferred).

Posted 4 weeks ago

Apply

7.0 - 11.0 years

45 - 60 Lacs

Bengaluru

Remote

About the Role: The charter of the Data + ML Platform team is to harness all the data that is ingested and cataloged within the Data LakeHouse for exploration, insights, model development, ML Engineering and Insights Activation. This team is situated within the larger Data Platform group, which serves as one of the core pillars of our company. We process data at a truly immense scale. Our processing is composed of various facets including threat events collected via telemetry data, associated metadata, along with IT asset information, contextual information about threat exposure based on additional processing, etc. These facets comprise the overall data platform, which is currently over 200 PB and maintained in a hyper-scale Data Lakehouse, built and owned by the Data Platform team. The ingestion mechanisms include both batch and near real-time streams that form the core Threat Analytics Platform used for insights, threat hunting, incident investigations and more. As an engineer in this team, you will play an integral role as we build out our ML Experimentation Platform from the ground up. You will collaborate closely with Data Platform Software Engineers, Data Scientists & Threat Analysts to design, implement, and maintain scalable ML pipelines that will be used for Data Preparation, Cataloging, Feature Engineering, Model Training, and Model Serving that influence critical business decisions. You'll be a key contributor in a production-focused culture that bridges the gap between model development and operational success. Future plans include generative AI investments for use cases such as modeling attack paths for IT assets.
What You'll Do:
• Help design, build, and facilitate adoption of a modern Data+ML platform
• Modularize complex ML code into standardized and repeatable components
• Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
• Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
• Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
• Review code changes from data scientists and champion software development best practices
• Leverage cloud services like Kubernetes, blob storage, and queues in our cloud-first environment

What You'll Need:
• B.S. in Computer Science, Data Science, Statistics, Applied Mathematics, or a related field and 7+ years of related experience; or M.S. with 5+ years of experience; or Ph.D. with 6+ years of experience
• 3+ years of experience developing and deploying machine learning solutions to production
• Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used
• 3+ years of experience with ML platform tools like Jupyter Notebooks, NVIDIA Workbench, MLflow, Ray, Vertex AI, etc.
• Experience building data platform products or features with Apache Spark, Flink, or comparable tools in GCP; experience with Iceberg is highly desirable
• Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
• Production experience with infrastructure-as-code tools such as Terraform and FluxCD
• Expert-level experience with Python; Java/Scala exposure is recommended
• Ability to write Python interfaces that provide standardized and simplified access to internal CrowdStrike tools for data scientists
• Expert-level experience with CI/CD frameworks such as GitHub Actions
• Expert-level experience with containerization frameworks
• Strong analytical and problem-solving skills, capable of working in a dynamic environment
• Exceptional interpersonal and communication skills; able to work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes

Experience with the following is desirable:
• Go
• Iceberg
• Pinot or other time-series/OLAP-style databases
• Jenkins
• Parquet
• Protocol Buffers/gRPC
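The "modularize complex ML code into standardized and repeatable components" responsibility above can be illustrated with a minimal, stdlib-only sketch: each step is a named callable with a uniform signature, and a pipeline is just an ordered composition of steps. All names here (`Step`, `Pipeline`, the toy feature-engineering steps) are hypothetical illustrations, not CrowdStrike's actual platform API.

```python
from typing import Any, Callable

class Step:
    """A named, reusable unit of work with a uniform data-in/data-out signature."""
    def __init__(self, name: str, fn: Callable[[Any], Any]):
        self.name = name
        self.fn = fn

    def __call__(self, data: Any) -> Any:
        return self.fn(data)

class Pipeline:
    """Runs steps in order, feeding each step's output into the next."""
    def __init__(self, *steps: Step):
        self.steps = steps

    def run(self, data: Any) -> Any:
        for step in self.steps:
            data = step(data)
        return data

# Toy feature-engineering pipeline over raw event sizes (None = missing telemetry)
prepare = Step("prepare", lambda xs: [x for x in xs if x is not None])
scale = Step("scale", lambda xs: [x / max(xs) for x in xs])

pipeline = Pipeline(prepare, scale)
features = pipeline.run([4, None, 2, 8])  # → [0.5, 0.25, 1.0]
```

Because every step shares one interface, steps can be reviewed, tested, and recombined independently — the property that makes patterns "repeatable" across many data scientists' pipelines.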

Posted 4 weeks ago

Apply

1.0 - 4.0 years

9 - 13 Lacs

Mumbai

Work from Office

Overview
The MSCI Data Collection team sources ESG and Climate raw input data disclosed in publicly available sources. On a day-to-day basis, the team manages data production and coordinates with vendors, product teams, clients, and corporates. The team is responsible for quality review and ensures that collected data is up to date and adheres to the data collection guidance and methodology defined by MSCI. Moreover, with its consistent drive to innovate and leverage technology, the team initiates and/or collaborates with other teams on programs related to data quality and process improvements, leveraging GenAI/LLMs, automation, workflow-streamlining projects, and data QA models.

Responsibilities
• Searching for, procuring, and processing information about global companies on an as-needed basis using multiple methods, techniques, and sources
• Working with different types of information sources (e.g. annual reports, websites, quantitative data feeds, web crawlers, news articles), types of information (e.g. quantitative data, key performance indicators, prose narratives), topics or subject matters (e.g. traditional financial risk, corporate governance, and sustainability), and types of analysis (e.g. rules-based, subjective analysis, written summaries)
• Leveraging Generative AI tools/LLMs to improve the existing data sourcing and QA processes in production
• Conducting an efficient data quality assurance (QA) process, including assessing results and compiling and reporting QA findings
• Working with internal stakeholders and downstream teams to understand data requirements, data QC scope, and data delivery
• Contributing to working committees and projects, or performing other tasks as deemed necessary by the business
• Reviewing and processing feedback from companies and responding to client queries
Qualifications
• At least a Bachelor's degree in Finance/Economics, Business Management, International Relations, Social Science, Environmental Science, or Interdisciplinary Studies; fresh graduates are welcome to apply (0-3 yrs)
• Interest and drive for the adoption of Generative AI/LLMs into production workflows to improve the timeliness and quality of data; experience using GenAI tools is an added advantage
• Excellent oral and written communication skills in English
• Proficiency in creating presentations and data analysis, with excellent research and analytical skills
• Good to have: a good command of Excel tools and functionality for dealing with large volumes of data, and exposure to pandas or tools like Power BI and Jupyter notebooks
• Comfortable working in a team environment across hierarchies, functions, and geographies
• Strong interpersonal skills and the ability to work with people in different offices and time zones

What we offer you
• Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
• Flexible working arrangements, advanced technology, and collaborative workspaces
• A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results
• A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
• A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
• Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
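The rule-based data QA process this role describes — checking collected disclosures for missing, implausible, or out-of-range values and compiling findings — can be sketched with pandas (which the qualifications mention). The records, column names, and thresholds below are hypothetical examples, not MSCI's actual data model or QA rules.

```python
import pandas as pd

# Hypothetical collected ESG disclosures (illustrative values only)
records = pd.DataFrame({
    "company": ["A Corp", "B Ltd", "C Inc"],
    "scope1_emissions": [120.5, None, -3.0],  # tonnes CO2e; None = not disclosed
    "report_year": [2023, 2023, 1999],
})

# Simple rule-based QA checks, each producing (company, finding) pairs
qa_findings = []
for name in records.loc[records["scope1_emissions"].isna(), "company"]:
    qa_findings.append((name, "missing scope1_emissions"))
for name in records.loc[records["scope1_emissions"] < 0, "company"]:
    qa_findings.append((name, "negative emissions value"))
for name in records.loc[records["report_year"] < 2015, "company"]:
    qa_findings.append((name, "report_year out of expected range"))
```

In practice such checks would be compiled into a QA report and routed back to the collection workflow; the sketch only shows the shape of the rules-based assessment step.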
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer committed to diversifying its workforce. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. 
If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies
MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams
We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 1 month ago

Apply