7.0 - 11.0 years
0 Lacs
karnataka
On-site
You are a highly skilled, detail-oriented, and motivated Python DQ Automation Developer responsible for designing, developing, and maintaining data quality automation solutions using Python. With a deep understanding of data quality principles, proficiency in Python, and experience in data processing and analysis, you will play a crucial role in ensuring accurate and timely data integration and transformation.

Your key responsibilities will include designing, developing, and implementing data quality automation processes and solutions to identify, measure, and improve data quality. You will write and optimize Python scripts using libraries such as Pandas, NumPy, and PySpark for data manipulation and processing. Additionally, you will develop and enhance ETL processes, analyze data sets to identify data quality issues, and develop and execute test plans to validate the effectiveness of data quality solutions. As part of the team, you will maintain comprehensive documentation of data quality processes, procedures, and standards, and collaborate closely with data analysts, data engineers, DQ testers, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Required Skills:
- Proficiency in Python and related libraries (Pandas, NumPy, PySpark, pytest).
- Experience with data quality tools and frameworks.
- Strong understanding of ETL processes and data integration.
- Familiarity with data governance and data management principles.
- Excellent analytical and problem-solving skills with keen attention to detail.
- Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders.
- Ability to work effectively both independently and as part of a team.

Qualifications:
- Bachelor's degree in Computer Science or Information Technology; an advanced degree is a plus.
- Minimum of 7 years of experience in data quality automation and Python development.
- Proven experience with Python libraries for data processing and analysis.

Citi is an equal opportunity and affirmative action employer, encouraging all qualified and interested applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability, please review Accessibility at Citi.
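As an illustration of the kind of automation this role involves, here is a minimal sketch of a rule-based data quality check in Pandas; the column names, rules, and thresholds are hypothetical.

```python
import pandas as pd

# Hypothetical data quality rules for an orders dataset
RULES = {
    "order_id": {"not_null": True, "unique": True},
    "amount": {"not_null": True, "min": 0},
}

def run_dq_checks(df: pd.DataFrame) -> list[dict]:
    """Return a list of rule violations found in the frame."""
    issues = []
    for col, rule in RULES.items():
        if rule.get("not_null") and df[col].isna().any():
            issues.append({"column": col, "check": "not_null",
                           "failed_rows": int(df[col].isna().sum())})
        if rule.get("unique") and df[col].duplicated().any():
            issues.append({"column": col, "check": "unique",
                           "failed_rows": int(df[col].duplicated().sum())})
        if "min" in rule and (df[col] < rule["min"]).any():
            issues.append({"column": col, "check": "min",
                           "failed_rows": int((df[col] < rule["min"]).sum())})
    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2, None],
                           "amount": [10.0, -5.0, 7.5, 3.0]})
    for issue in run_dq_checks(sample):
        print(issue)
```

In a real pipeline, the rule set would typically be externalized to configuration and the violations written to a reporting table rather than printed.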
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Senior Data Scientist with 5+ years of experience, you will play a crucial role in our team based in Indore/Pune. Your responsibilities will involve designing and implementing models, extracting insights from data, and interpreting complex data structures to facilitate business decision-making.

You should have a strong background in Machine Learning areas such as Natural Language Processing, Machine Vision, Time Series, etc. Your expertise should extend to model tuning, model validation, and supervised and unsupervised learning. Additionally, hands-on experience with model development, data preparation, and deployment of models for training and inference is essential. Proficiency in descriptive and inferential statistics, hypothesis testing, and data analysis and exploration is a key requirement for this role, and you should be adept at developing code that enables reproducible data analysis.

Familiarity with AWS services like SageMaker, Lambda, Glue, Step Functions, and EC2 is expected, as is knowledge of data science development and deployment environments such as Databricks, the Anaconda distribution, and similar tools. You should also possess expertise in ML algorithms related to time series, natural language processing, optimization, object detection, topic modeling, clustering, and regression analysis. Your skills should include proficiency in Hive/Impala, Spark, Python, Pandas, Keras, scikit-learn, statsmodels, TensorFlow, and PyTorch. At least 1 year of experience with end-to-end model deployment and production is required. Familiarity with model deployment on the Azure ML platform, Anaconda Enterprise, or AWS SageMaker is preferred. Basic knowledge of deep learning algorithms like Mask R-CNN and YOLO, and of visualization and analytics/reporting tools such as Power BI, Tableau, and Alteryx, would be advantageous for this role.
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
karnataka
On-site
As an ML Engineer/Data Scientist at Innova ESI in Bengaluru, you will be responsible for leveraging your strong background in Computer Science and algorithms to work on pattern recognition, neural networks, algorithms, and statistical methods in order to deliver end-to-end data solutions. Your role will involve collaborating in cross-functional teams to drive digital transformation through innovative and sustainable IT solutions.

You should have a deep understanding of machine learning and NLP, along with commercial experience in building and deploying LLM solutions. With 4-8 years of relevant experience, you are expected to demonstrate exceptional coding ability, particularly in Python, and be familiar with ML tools and libraries like TensorFlow, PyTorch, pandas, and scikit-learn. Your proficiency in pattern recognition and neural networks, combined with knowledge of statistics and its application in data analysis, will be key when working with complex algorithms and data models. In addition, you should have a strong understanding of MLOps best practices and experience deploying LLM applications to production at scale.

As a problem solver and self-starter, you enjoy tackling difficult problems and finding optimal solutions, both independently and in a collaborative team environment. Your strong communication skills will enable you to effectively articulate complex technical concepts to non-technical stakeholders. This full-time hybrid role offers flexibility for some remote work while being primarily based in Bengaluru. If you meet the qualifications and are excited about driving digital transformation through innovative data solutions, we encourage you to share your resume with us at jaya.sharma@innovaesi.com.
Posted 1 week ago
3.0 - 5.0 years
7 - 14 Lacs
Hyderabad
Work from Office
Role & responsibilities:
- Strong knowledge of OOPs and creating custom Python packages for serverless applications.
- Strong knowledge of SQL querying.
- Hands-on experience with AWS services like Lambda, EC2, EMR, S3, Athena, Batch, Textract, and Comprehend.
- Strong expertise in extracting text, tables, and logos from low-quality scanned multipage PDFs (80-150 DPI) and images.
- Good understanding of probability and statistics concepts and the ability to find hidden patterns and relevant insights in the data.
- Knowledge of applying state-of-the-art NLP models like BERT, GPT-x, sciSpacy, Bidirectional LSTM-CNN, RNN, and AWS Comprehend Medical for Clinical Named Entity Recognition (NER).
- Strong leadership skills.
- Deployment of custom-trained and prebuilt NER models using AWS SageMaker.
- Knowledge of setting up an AWS Textract pipeline for large-scale text processing using AWS SNS, AWS SQS, Lambda, and EC2 (see the sketch below).
- Intellectual curiosity to learn new things.
- ISMS responsibilities should be followed as per company policy.

Preferred candidate profile:
- 3+ years of hands-on experience in Python and data science tools like pandas, NumPy, SciPy, and matplotlib, with strong exposure to regular expressions.
- 3+ years of hands-on experience in machine learning algorithms like SVM, CART, bagging and boosting algorithms, NLP-based ML algorithms, and text mining.
- Hands-on expertise in integrating multiple data sources into a streamlined pipeline.
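A minimal sketch of the Textract-based extraction referenced above, using boto3; the bucket and key names are placeholders, and a production pipeline would normally use the asynchronous APIs (start_document_text_detection) with SNS/SQS rather than the synchronous call shown here.

```python
import boto3

# Placeholder locations; a real pipeline would receive these via SQS messages.
BUCKET = "my-scanned-docs"
KEY = "claims/page-001.png"

textract = boto3.client("textract", region_name="us-east-1")

# Synchronous call, suitable for single-page images; multipage PDFs need
# the asynchronous start/poll APIs instead.
response = textract.detect_document_text(
    Document={"S3Object": {"Bucket": BUCKET, "Name": KEY}}
)

# Collect the detected LINE blocks into plain text for downstream NER.
lines = [block["Text"] for block in response["Blocks"]
         if block["BlockType"] == "LINE"]
print("\n".join(lines))
```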
Posted 1 week ago
5.0 - 8.0 years
15 - 20 Lacs
Noida
Work from Office
Technical Expertise:
- Must have: Proficiency in Java. Hands-on experience with AWS AI services such as AWS Bedrock and SageMaker.
- Good to have: Python programming with experience in libraries like TensorFlow, PyTorch, NumPy, and Pandas. Experience with DevOps practices, including CI/CD pipelines, system monitoring, and troubleshooting in a production environment. Familiarity with other platforms like Google Gemini and Copilot technologies.

Soft Skills:
- Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across business and technical teams.
- Strong problem-solving and analytical skills.
- Ability to work with teams in a dynamic, fast-paced environment.

Key Responsibilities:
- Design, develop, and implement AI and machine learning models using AWS AI services like AWS Bedrock and SageMaker.
- Fine-tune and optimize AI models for business use cases.
- Implement Generative AI solutions using AWS Bedrock and Java (see the sketch below).
- Write efficient, clean, and maintainable Java/Python code for AI applications.
- Develop and deploy RESTful APIs using frameworks like Flask or Django for model integration and consumption.

Experience:
- 5 to 8 years of experience in software development, with 3+ years in AI/ML or Generative AI projects.
- Demonstrated experience in deploying and managing AI applications in production environments.

Mandatory Competencies:
- Programming Language - Java - Core Java (Java 8+)
- Data Science and Machine Learning - Gen AI
- Data Science and Machine Learning - Python
- Behavioral - Communication and collaboration
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
- Middleware - API Middleware - Microservices
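For context on the Bedrock integration mentioned above, here is a minimal Python sketch of invoking a foundation model through the bedrock-runtime API; the model ID and request schema are assumptions that vary by model provider, and a Java implementation would use the equivalent AWS SDK for Java client.

```python
import json
import boto3

# Assumed model ID; actual IDs depend on the models enabled in the account.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# The request body schema differs per model family; this follows the
# Anthropic Messages format as an assumption.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize our Q3 sales trends."}],
}

response = bedrock.invoke_model(modelId=MODEL_ID, body=json.dumps(body))
payload = json.loads(response["body"].read())
print(payload)
```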
Posted 1 week ago
1.0 - 4.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for a skilled Python Developer with expertise in Django to join our team at NextGen Web Services. The ideal candidate will have 1 to 4 years of experience and be available to work remotely.

Roles and Responsibilities:
- Design, develop, and test software applications using Python and Django (a minimal sketch follows below).
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop high-quality, scalable, and efficient code.
- Troubleshoot and resolve technical issues efficiently.
- Participate in code reviews and contribute to improving overall code quality.
- Stay updated with industry trends and emerging technologies.

Job Requirements:
- Proficiency in the Python programming language.
- Experience with the Django framework.
- Strong understanding of software development principles and methodologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.

Additional Info: The company offers a dynamic and supportive work environment, with opportunities for professional growth and development.
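To illustrate the kind of Django work described above, here is a minimal sketch of a JSON endpoint wired into a URL configuration; the app name, view, and payload are hypothetical.

```python
# views.py in a hypothetical "catalog" app
from django.http import JsonResponse
from django.views.decorators.http import require_GET

@require_GET
def product_summary(request):
    """Return a small, hard-coded summary payload as a placeholder."""
    data = {"products": 42, "status": "ok"}
    return JsonResponse(data)

# urls.py in the same app
from django.urls import path

urlpatterns = [
    path("api/products/summary/", product_summary, name="product-summary"),
]
```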
Posted 1 week ago
3.0 - 5.0 years
10 - 14 Lacs
Noida
Work from Office
We are looking for a highly skilled AI & ML Engineer with 3 to 5 years of experience to join our team at Stanra Tech Solutions. The ideal candidate will have a strong background in artificial intelligence and machine learning, with excellent problem-solving skills.

Roles and Responsibilities:
- Design and develop AI and ML models to solve complex problems.
- Collaborate with cross-functional teams to integrate AI and ML solutions into existing systems.
- Develop and maintain large-scale data pipelines and architectures.
- Conduct research and stay updated on the latest trends and technologies in AI and ML.
- Work closely with stakeholders to understand business requirements and develop tailored solutions.
- Ensure high-quality code and adhere to best practices.

Job Requirements:
- Strong proficiency in programming languages such as Python, Java, or C++.
- Experience with deep learning frameworks like TensorFlow or PyTorch.
- Knowledge of computer vision, natural language processing, or reinforcement learning.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work in a fast-paced environment and meet deadlines.
Posted 1 week ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
You will be responsible for the deployment and maintenance of the group data science platform infrastructure, on which data science pipelines are deployed and scaled. To achieve this, you will collaborate with Data Scientists and Data Engineers from various business lines and the Global Technology Service infrastructure team (GTS).

Roles:
- Implement techniques and processes for supporting the development and scaling of data science pipelines.
- Industrialize inference, retraining, and monitoring of data science pipelines, ensuring their maintainability and compliance.
- Provide platform support to end users and be attentive to the needs and requirements they express.
- Anticipate needs and necessary developments for the platform.
- Work closely with Data Scientists, Data Engineers, and business stakeholders.
- Stay updated and demonstrate a keen interest in the MLOps domain.

Environment:
- Cloud and on-premise: Azure; Python; Kubernetes
- Integrated vendor solutions: Dataiku, Snowflake
- DB: PostgreSQL
- Distributed computing: Spark
- Big Data: Hadoop, S3/Scality, MapR
- Data science: scikit-learn, Transformers, MLflow, Kedro
- DevOps, CI/CD: JFrog, Harbor, GitHub Actions, Jenkins
- Monitoring: Elasticsearch/Kibana, Grafana, Zabbix
- Agile ceremonies: PI planning, Sprint, Sprint Review, Refinement, Retrospectives; ITIL framework

Technical Skills:
- Python: FastAPI, SQLAlchemy, NumPy, Pandas, scikit-learn, Transformers (see the FastAPI sketch below)
- Kubernetes, Docker
- Pytest
- CI/CD: Jenkins, Ansible, GitHub Actions, Harbor, Docker

Soft Skills:
- Client focus: demonstrate strong listening skills, understanding, and anticipation of user needs.
- Team spirit: organize collaboration and workshops to find the best solutions, and share expertise with colleagues to find the most suitable ones.
- Innovation: propose innovative ideas, solutions, or strategies, and think outside the box; prefer simplicity over complexity.
- Responsibility: take ownership, keep commitments, and respect deadlines.
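As a sketch of the platform-side tooling this role supports, here is a minimal FastAPI inference endpoint of the kind that might be deployed on such a platform; the model logic and feature names are hypothetical placeholders.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="inference-service")

class ScoringRequest(BaseModel):
    # Hypothetical feature vector for a simple scoring model
    features: list[float]

class ScoringResponse(BaseModel):
    score: float

@app.post("/score", response_model=ScoringResponse)
def score(req: ScoringRequest) -> ScoringResponse:
    # Placeholder logic standing in for a real model loaded from a registry
    value = sum(req.features) / max(len(req.features), 1)
    return ScoringResponse(score=value)

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```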
Posted 1 week ago
1.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Developing Image Processing/Computer Vision applications to be deployed in vision inspection. More specifically:
- Write OpenCV applications in C++ for dimensional and surface inspection (see the sketch below).
- Train DL models for surface inspection in cases where traditional OpenCV/C++ is not viable.

Additional Responsibilities:
- Developing and optimizing OpenCV applications for embedded devices (Raspberry Pi).
- Assist in developing DL models for the in-house DL cloud platform.

Educational Qualification: B.E/B.Tech in any specialization.
Skills: Python, C++, OpenCV, Linux, PyTorch/TensorFlow.

Who can apply:
- Are available for full-time employment with a one-year contract.
- Can start immediately.
- Have relevant skills and interests.

Perks:
- Certificate.
- Letter of recommendation.
- Job offer.
- Group health insurance.
- Incentives and bonus as per HR policy.
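To illustrate the kind of inspection logic described above, here is a minimal surface-defect sketch in Python with OpenCV (production code would use the equivalent C++ calls); the file name, threshold, and minimum area are hypothetical.

```python
import cv2

# Hypothetical input image of a part's surface
image = cv2.imread("part_surface.png", cv2.IMREAD_GRAYSCALE)
if image is None:
    raise FileNotFoundError("part_surface.png not found")

# Smooth, then highlight dark blemishes against the brighter surface
blurred = cv2.GaussianBlur(image, (5, 5), 0)
_, mask = cv2.threshold(blurred, 60, 255, cv2.THRESH_BINARY_INV)

# Find candidate defect regions and keep those above a minimum area
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
defects = [c for c in contours if cv2.contourArea(c) > 50.0]

print(f"Detected {len(defects)} candidate surface defects")
```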
Posted 1 week ago
0.0 - 3.0 years
3 - 5 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Overview:
We are looking for a curious, analytical, and technically skilled Data Science Engineer with 0-3 years of experience to join our growing data team. This role is ideal for recent graduates or junior professionals eager to work on real-world machine learning and data engineering challenges. You will help develop data-driven solutions, design models, and deploy scalable data pipelines that support business decisions and product innovation.

Key Responsibilities:
- Assist in designing and deploying machine learning models and predictive analytics solutions (see the sketch below).
- Build and maintain data pipelines using tools such as Airflow, Spark, or Pandas.
- Conduct data wrangling, cleansing, and feature engineering on large datasets.
- Collaborate with data scientists, analysts, and engineers to operationalize models in production.
- Develop dashboards, reports, or APIs to expose model insights to stakeholders.
- Continuously monitor model performance and data quality.
- Stay updated with new tools, technologies, and industry trends in AI and data science.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Engineering, or a related field.
- 0-3 years of hands-on experience in data science, machine learning, or data engineering (internships and academic projects welcome).
- Proficiency in Python and data science libraries (e.g., pandas, NumPy, scikit-learn, matplotlib).
- Familiarity with SQL and working with relational databases.
- Understanding of fundamental machine learning concepts and algorithms.
- Knowledge of version control systems (e.g., Git).
- Strong problem-solving skills and a willingness to learn.

Nice-to-Have:
- Exposure to ML frameworks like TensorFlow, PyTorch, or XGBoost.
- Experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with MLOps tools like MLflow, Kubeflow, or SageMaker.
- Understanding of big data tools (e.g., Spark, Hadoop).
- Experience working on data science projects or contributions on GitHub/Kaggle.

What We Offer:
- Real-world experience with data science in production environments
- Mentorship and professional development support
- Access to modern tools, technologies, and cloud platforms
- Competitive salary with performance incentives
- A collaborative and learning-focused culture
- Flexible work options (remote/hybrid)

How to Apply: Send your updated resume to careers@jasra.in
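As a small illustration of the modeling work described above, here is a minimal scikit-learn pipeline trained on a synthetic dataset; the data and parameters are placeholders, not a prescribed approach.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a real business dataset
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Scale features, then fit a simple classifier
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)

print(f"Hold-out accuracy: {pipeline.score(X_test, y_test):.3f}")
```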
Posted 1 week ago
6.0 - 11.0 years
20 - 30 Lacs
Bhopal, Hyderabad, Pune
Hybrid
Hello, greetings from NewVision Software!

We are hiring on an immediate basis for the role of Senior/Lead Python Developer + AWS | NewVision Software | Pune, Hyderabad & Bhopal location | Full-time. Professionals who can join us immediately or within 15 days are preferred. Please find the job details and description below.

NewVision Software, Pune HQ Office: 701 & 702, Pentagon Tower, P1, Magarpatta City, Hadapsar, Pune, Maharashtra - 411028, India
NewVision Software: The Hive Corporate Capital, Financial District, Nanakaramguda, Telangana - 500032
NewVision Software: IT Plaza, E-8, Bawadiya Kalan Main Rd, near Aura Mall, Gulmohar, Fortune Pride, Shahpura, Bhopal, Madhya Pradesh - 462039

Senior Python and AWS Developer

Role Overview: We are looking for a skilled Senior Python Developer with a strong background in AWS cloud services to join our team. The ideal candidate will be responsible for designing, developing, and maintaining robust backend systems, ensuring high performance and responsiveness to requests from the front end.

Responsibilities:
- Develop, test, and maintain scalable web applications using Python and Django.
- Design and manage relational databases with PostgreSQL, including schema design and optimization.
- Build RESTful APIs and integrate with third-party services as needed.
- Work with AWS services including EC2, EKS, ECR, S3, Glue, Step Functions, EventBridge Rules, Lambda, SQS, SNS, and RDS.
- Collaborate with front-end developers to deliver seamless end-to-end solutions.
- Write clean, efficient, and well-documented code following best practices.
- Implement security and data protection measures in applications.
- Optimize application performance and troubleshoot issues as they arise.
- Participate in code reviews, testing, and continuous integration processes.
- Stay current with the latest trends and advancements in Python, Django, and database technologies.
- Mentor junior Python developers.

Requirements:
- 6+ years of professional experience in Python development.
- Strong proficiency with the Django web framework.
- Experience working with PostgreSQL, including complex queries and performance tuning.
- Familiarity with RESTful API design and integration.
- Strong understanding of OOP, SOLID principles, and design patterns.
- Strong knowledge of Python multithreading and multiprocessing.
- Experience with AWS services: S3, Glue, Step Functions, EventBridge Rules, Lambda, SQS, SNS, IAM, Secrets Manager, KMS, and RDS.
- Understanding of version control systems (Git).
- Knowledge of security best practices and application deployment.
- Basic understanding of microservices architecture.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Nice to Have:
- Experience with Docker, Kubernetes, or other containerization tools.
- Front-end technologies (React).
- Experience with CI/CD pipelines and DevOps practices.
- Experience with infrastructure-as-code tools like Terraform.

Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Do share your resume with my email address: imran.basha@newvision-software.com

Please also share your experience details: Total Experience; Relevant Experience; years of experience in Python, AWS, PostgreSQL, REST API, and Django; Current CTC; Expected CTC; Notice / Serving (LWD); Any Offer in hand (LPA); Current Location; Preferred Location; Education.

Please share your resume and the above details for the hiring process to imran.basha@newvision-software.com
Posted 1 week ago
3.0 - 5.0 years
9 - 13 Lacs
Gurugram
Work from Office
Job Summary:
Synechron is seeking a detail-oriented Data Analyst to leverage advanced data analysis, visualization, and insights to support our business objectives. The ideal candidate will have a strong background in creating interactive dashboards, performing complex data manipulations using SQL and Python, and automating workflows to drive efficiency. Familiarity with cloud platforms such as AWS is a plus, enabling optimization of data storage and processing solutions. This role will enable data-driven decision-making across teams, contributing to strategic growth and operational excellence.

Software Requirements:
- Required: PowerBI (or equivalent visualization tools like Streamlit, Dash); SQL (for data extraction, manipulation, and querying); Python (for scripting, automation, and advanced analysis); data management tools compatible with cloud platforms (e.g., AWS S3, Redshift, or similar)
- Preferred: Cloud platform familiarity, especially AWS services related to data storage and processing; knowledge of other visualization platforms (Tableau, Looker); familiarity with source control systems (e.g., Git)

Overall Responsibilities:
- Develop, redesign, and maintain interactive dashboards and visualization tools to provide actionable insights.
- Perform complex data analysis, transformations, and validation using SQL and Python.
- Automate data workflows, reporting, and visualizations to streamline processes.
- Collaborate with business teams to understand data needs and translate them into effective visual and analytical solutions.
- Support data extraction, cleaning, and validation from various sources, ensuring data accuracy.
- Maintain and enhance understanding of cloud environments, especially AWS, to optimize data storage, processing pipelines, and scalability.
- Document technical procedures and contribute to best practices for data management and reporting.

Performance Outcomes:
- Timely, accurate, and insightful dashboards and reports.
- Increased automation reducing manual effort.
- Clear communication of insights and data-driven recommendations to stakeholders.

Technical Skills (By Category):
- Programming Languages - Essential: SQL, Python; Preferred: R, additional scripting languages
- Databases/Data Management - Essential: relational databases (SQL Server, MySQL, Oracle); Preferred: NoSQL databases like MongoDB, cloud data warehouses (AWS Redshift, Snowflake)
- Cloud Technologies - Essential: basic understanding of AWS cloud services (S3, EC2, RDS); Preferred: experience with cloud-native data solutions and deployment
- Frameworks and Libraries - Python: Pandas, NumPy, Matplotlib, Seaborn, Plotly, Streamlit, Dash; Visualization: PowerBI, Tableau (preferred)
- Development Tools and Methodologies - Version control: Git; automation tools for workflows and reporting; familiarity with Agile methodologies
- Security Protocols - Awareness of data security best practices and compliance standards in cloud environments

Experience Requirements:
- 3-5 years of experience in data analysis, visualization, or related data roles.
- Proven ability to deliver insightful dashboards, reports, and analysis.
- Experience working across teams and communicating complex insights clearly.
- Knowledge of cloud environments like AWS or other cloud providers is desirable.
- Experience in a business environment, not necessarily as a full-time developer, but as an analytical influencer.

Day-to-Day Activities:
- Collaborate with stakeholders to gather requirements and define data visualization strategies.
- Design and maintain dashboards using PowerBI, Streamlit, Dash, or similar tools.
- Extract, transform, and analyze data using SQL and Python scripts (see the sketch below).
- Automate recurring workflows and report generation to improve operational efficiencies.
- Troubleshoot data issues and derive insights to support decision-making.
- Monitor and optimize cloud data storage and processing pipelines.
- Present findings to business units, translating technical outputs into actionable recommendations.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field; a Master's degree is a plus.
- Relevant certifications (e.g., PowerBI, AWS Data Analytics) are advantageous.
- Demonstrated experience with data visualization and scripting tools.
- Continuous learning mindset to stay updated on new data analysis trends and cloud innovations.

Professional Competencies:
- Strong analytical and problem-solving skills.
- Effective communication, with the ability to explain complex insights clearly.
- Collaborative team player with stakeholder management skills.
- Adaptability to rapidly changing data or project environments.
- Innovative mindset to suggest and implement data-driven solutions.
- Organized, self-motivated, and capable of managing multiple priorities efficiently.
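As a flavour of the day-to-day analysis described above, here is a minimal Pandas sketch that aggregates a dataset and plots the result; the column names and data source are hypothetical (a real workflow might pull the frame from a warehouse via SQL).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical order-level extract, e.g. pulled from a warehouse via SQL
orders = pd.DataFrame({
    "region": ["North", "South", "North", "East", "South"],
    "revenue": [1200.0, 800.0, 950.0, 400.0, 1100.0],
})

# Aggregate revenue by region for a simple dashboard tile
summary = orders.groupby("region", as_index=False)["revenue"].sum()

ax = summary.plot.bar(x="region", y="revenue", legend=False)
ax.set_ylabel("Revenue")
ax.set_title("Revenue by region")
plt.tight_layout()
plt.savefig("revenue_by_region.png")
```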
Posted 1 week ago
6.0 - 9.0 years
15 - 25 Lacs
Pune, Chennai, Bengaluru
Hybrid
Hi Everyone,

Role: Data Scientist
Experience: 6-9 yrs
Work Mode: Hybrid
Work Location: Chennai/Bangalore/Pune
Notice Period: Immediate - 30 days
Skills and Experience: 5+ years with data science and ML exposure

Key Accountabilities & Responsibilities:
- Support the Data Science team with the development of advanced analytics/machine learning/artificial intelligence initiatives.
- Analyzing large and complex datasets to uncover trends and insights.
- Supporting the development of predictive models and machine learning workflows.
- Performing exploratory data analysis to guide product and business decisions.
- Collaborating with cross-functional teams, including product, marketing, and engineering.
- Assisting with the design and maintenance of data pipelines.
- Clearly documenting and communicating analytical findings to technical and non-technical stakeholders.

Basic Qualifications:
- Qualification in Data Science, Statistics, Computer Science, Mathematics, or a related field.
- Proficiency in Python and key data science libraries (e.g., pandas, NumPy, scikit-learn).
- Operational understanding of machine learning principles and statistical modeling.
- Experience with SQL for data querying.
- Strong communication skills and a collaborative mindset.

Preferred Qualifications:
- Exposure to cloud platforms such as AWS, GCP, or Azure.
- Familiarity with data visualization tools like Tableau, Power BI, or matplotlib.
- Participation in personal data science projects or online competitions (e.g., Kaggle).
- Understanding of version control systems like Git.

Kindly share the following details: Updated CV, Relevant Skills, Total Experience, Current Company, Current CTC, Expected CTC, Notice Period, Current Location, Preferred Location.
Posted 1 week ago
3.0 - 6.0 years
10 - 20 Lacs
Gurugram
Hybrid
Role & responsibilities:
- Highly focused individual with a self-driven attitude.
- Problem solving and logical thinking to automate and improve internal processes.
- Using various tools such as SQL and Python to manage the requirements of different data asset projects.
- Ability to be diligently involved in activities like data cleaning, retrieval, manipulation, analytics, and reporting.
- Using data science and statistical techniques to build machine learning models and deal with textual data (see the sketch below).
- Keep up-to-date knowledge of the industry and related markets.
- Ability to multitask, prioritize, and manage time efficiently.
- Understand the needs of the hiring organization or client in order to target solutions to their benefit.
- Advanced speaking and writing skills for effective communication.
- Ability to work in cross-functional teams, demonstrating a high level of commitment and coordination.
- Attention to detail and commitment to accuracy for the desired deliverable.
- Should demonstrate and develop a sense of ownership towards the assigned task.
- Ability to keep sensitive business information confidential.
- Contribute positively and extensively towards building the organizational reputation, brand, and operational excellence.

Preferred candidate profile:
- 3-6 years of relevant experience in data science.
- Advanced knowledge of statistics and basics of machine learning.
- Experienced in dealing with textual data and using natural language processing techniques.
- Ability to conduct analysis to extract actionable insights.
- Technical skills in Python (NumPy, Pandas, NLTK, transformers, spaCy), SQL, and other programming languages for dealing with large datasets.
- Experienced in data cleaning, manipulation, feature engineering, and building models.
- Experienced in the end-to-end development of a data science project.
- Strong interpersonal skills and extremely resourceful.
- Proven ability to complete assigned tasks according to the outlined scope and timeline.
- Good language, communication, and writing skills in English.
- Expertise in using tools like MS Office, PowerPoint, Excel, and Word.
- Graduate or post-graduate from a reputed college or university.
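To give a sense of the text-processing work mentioned above, here is a minimal spaCy sketch that extracts named entities from free text; it assumes the small English model has been downloaded separately, and the sample sentence is invented.

```python
import spacy

# Assumes: python -m spacy download en_core_web_sm  has been run beforehand
nlp = spacy.load("en_core_web_sm")

text = (
    "Acme Corp reported quarterly revenue of $12 million, "
    "driven by strong demand in Singapore and Germany."
)

doc = nlp(text)

# Print each detected entity with its label (ORG, MONEY, GPE, ...)
for ent in doc.ents:
    print(f"{ent.text:<20} {ent.label_}")
```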
Posted 1 week ago
5.0 - 8.0 years
12 - 13 Lacs
Bengaluru
Remote
5+ years of professional experience as a Python Developer. Strong proficiency in SQL. Experience with relational databases like PostgreSQL, MySQL, or SQL Server. Knowledge of REST APIs and JSON/XML data structures. Experience with version control systems.
Posted 1 week ago
2.0 - 3.0 years
5 - 9 Lacs
Kochi, Coimbatore, Thiruvananthapuram
Work from Office
Location: Kochi, Coimbatore, Trivandrum
Must have skills: Python/Scala, PySpark/PyTorch
Good to have skills: Redshift

Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries.

Roles and Responsibilities:
- Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals (see the sketch below).
- Solving complex data problems to deliver insights that help our business achieve its goals.
- Sourcing data (structured and unstructured) from various touchpoints and formatting and organizing it into an analyzable format.
- Creating data products for analytics team members to improve productivity.
- Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline.
- Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions.
- Preparing data to create a unified database and building tracking solutions ensuring data quality.
- Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills:
- Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript.
- Extensive experience in data analysis in big data (Apache Spark) environments, data libraries (e.g., Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience working on these technologies.
- Experience in one of the many BI tools such as Tableau, Power BI, or Looker.
- Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs.
- Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.

Additional Information:
- Experience working in cloud data warehouses like Redshift or Synapse.
- Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core - Data Engineer; Databricks Data Engineering.

Qualification/Experience: 3.5-5 years of experience is required.
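As a sketch of the pipeline work described above, here is a minimal PySpark job that reads raw CSV data, applies a simple transformation, and writes Parquet; the paths and column names are placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Placeholder input path; in practice this might be S3 or ADLS
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Basic cleanup and a couple of typed/derived columns
cleaned = (
    raw.dropna(subset=["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Aggregate to a curated, analytics-friendly table
daily_revenue = cleaned.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# Write the curated output as Parquet for downstream analytics
daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")

spark.stop()
```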
Posted 1 week ago
3.0 - 5.0 years
1 - 6 Lacs
Noida
Work from Office
Collaborate with teams to understand business needs, design and implement AI solutions, conduct thorough testing, optimize algorithms, stay updated with AI advancements, integrate technologies, and mentor the team to drive innovation.
Posted 1 week ago
3.0 - 8.0 years
1 - 6 Lacs
Pune
Work from Office
Return to Work Program for Python Professionals

Location: Offline (Baner, Pune)
Experience Required: 3+ years
Program Duration: 3 months
Program Type: Free Training + Job Assistance (not a job guarantee)
Note: Candidates should be ready to learn new technologies.

Restart your career in high-demand tech fields! If you've experienced a career gap, layoff, or lost a job due to unforeseen circumstances, VishvaVidya's Return to Work Program offers a unique platform to relaunch your tech career with confidence.

What We Offer:
- Free Technical Training: Upskill in Python, Generative AI, Data Science, and other relevant tools.
- Placement Assistance: Get connected with top hiring partners actively hiring returnees.
- Hands-on Learning: Work on real-world projects to bridge your experience gap.
- Mentorship & Confidence Building: Structured sessions to support your transition back to work.
- Zero Cost: The program is 100% free, fully sponsored by our hiring partners.

Eligibility:
- Minimum 3 years of prior experience in Python development
- Career break of 6 months to 7 years welcome
- Eagerness to upskill and return to the workforce
- Availability for offline sessions in Baner, Pune

Why Join VishvaVidya's Return to Work Program?
- Tailored for career restart seekers
- Trusted by top tech employers
- Industry-relevant curriculum curated by expert mentors
- Build portfolio-worthy projects and prepare for real-world job roles

Why Choose VishvaVidya? We believe in second chances and career growth for everyone. Our fully sponsored program equips you with the skills, confidence, and opportunities needed to successfully re-enter the workforce.

Apply today - your next career chapter starts here!
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As an AI Developer with 5-8 years of experience, you will be based in Pune with a hybrid working model and should be able to join immediately or within 15 days. Your primary responsibility will be to develop and maintain Python applications, focusing on API building, data processing, and transformation. You will use LangGraph to design and manage complex language model workflows (see the sketch below) and work with machine learning and text processing libraries to deploy agents.

Your must-have skills include proficiency in Python programming with a strong understanding of object-oriented programming concepts. You should have extensive experience with data manipulation libraries like Pandas and NumPy and write clean, efficient, and maintainable code. Additionally, you will develop and maintain real-time data pipelines and microservices to ensure seamless data flow and integration across systems. On the SQL side, you are expected to have a strong understanding of basic query syntax, including joins, WHERE, and GROUP BY clauses.

Good-to-have skills include practical experience in AI development applications, knowledge of parallel processing and multi-threading/multi-processing to optimize data fetching and execution times, familiarity with SQLAlchemy or similar libraries for data fetching, and experience with AWS cloud services such as EC2, EKS, Lambda, and Postgres.

If you are looking to work in a dynamic environment where you can apply your skills in Python, SQL, Pandas, NumPy, agentic AI development, CI/CD pipelines, AWS, and Generative AI, this role might be the perfect fit for you.
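As a rough sketch of the workflow design mentioned above (assuming the posting's "Lang Graph" refers to the LangGraph library), here is a minimal two-node graph; the state fields and node logic are hypothetical, and a real agent would call an LLM inside the nodes.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    question: str
    answer: str

def retrieve(state: AgentState) -> dict:
    # Placeholder for a retrieval step (vector search, SQL lookup, etc.)
    return {"answer": f"context for: {state['question']}"}

def respond(state: AgentState) -> dict:
    # Placeholder for an LLM call that produces the final answer
    return {"answer": state["answer"].upper()}

graph = StateGraph(AgentState)
graph.add_node("retrieve", retrieve)
graph.add_node("respond", respond)
graph.set_entry_point("retrieve")
graph.add_edge("retrieve", "respond")
graph.add_edge("respond", END)

app = graph.compile()
print(app.invoke({"question": "What is our refund policy?", "answer": ""}))
```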
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Software Test Engineer at Trading Technologies, you will play a crucial role in ensuring the quality and functionality of our cutting-edge trading applications. Your main responsibilities will include designing, developing, and executing test plans and test cases based on software requirements and technical design specifications. You will collaborate with the Development team to investigate and debug software issues, recommend product improvements to the Product Management team, and constantly enhance your skills alongside a team of testers.

Your expertise in testing multi-asset trade analytics applications, automated testing using Python or similar programming languages (see the sketch below), and experience with cloud-based systems like AWS will be invaluable in this role. Knowledge of trade analytics standards such as pre- and post-trade TCA, SEC and FINRA rule compliance, MiFID II, and PRIIPs analytics will be highly advantageous. Additionally, your understanding of performance and load testing for SQL queries and data pipelines will be essential.

At Trading Technologies, we offer a competitive benefits package to support your well-being and growth. You will have access to medical, dental, and vision coverage, generous paid time off, parental leave, professional development opportunities, and wellness perks. Our hybrid work model allows for a balance between in-office collaboration and remote work, fostering team cohesion, innovation, and mentorship opportunities. Join our forward-thinking and inclusive culture that values diversity and promotes collaborative teamwork.

Trading Technologies is a leading Software-as-a-Service (SaaS) technology platform provider in the global capital markets industry. Our TT platform connects to major international exchanges and liquidity venues, offering advanced tools for trade execution, order management, market data solutions, risk management, and more to a diverse client base. Join us in shaping the future of trading technology and delivering innovative solutions to market participants worldwide.
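As an illustration of the automated testing mentioned above, here is a minimal pytest sketch that parametrizes checks over a small validation helper; the function under test and its rules are hypothetical.

```python
import pytest

# Hypothetical helper such a test suite might target
def is_valid_order(order: dict) -> bool:
    """An order must have a non-empty symbol and a positive quantity."""
    return bool(order.get("symbol")) and order.get("quantity", 0) > 0

@pytest.mark.parametrize(
    "order, expected",
    [
        ({"symbol": "ESZ5", "quantity": 10}, True),
        ({"symbol": "", "quantity": 10}, False),
        ({"symbol": "ESZ5", "quantity": 0}, False),
        ({}, False),
    ],
)
def test_is_valid_order(order, expected):
    assert is_valid_order(order) is expected
```

Run with `pytest -q`; real suites would typically target deployed services or data pipelines rather than a local helper.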
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a data science expert, you will be responsible for developing strategies and solutions to address various problems using cutting-edge machine learning, deep learning, and GEN AI techniques. Your role will involve leading a team of data scientists to ensure timely and high-quality delivery of project outcomes. You will analyze large and complex datasets across different domains, perform exploratory data analysis, and select features to build and optimize classifiers and regressors.

Enhancing data collection procedures, ensuring data quality and accuracy, and presenting analytical results to technical and non-technical stakeholders will be key aspects of your job. You will create custom reports and presentations with strong data visualization skills to effectively communicate analytical conclusions to senior company officials and other stakeholders. Proficiency in data mining, EDA, feature selection, model building, and optimization using machine learning and deep learning techniques is essential.

Your primary skills should include a deep understanding of and hands-on experience with data science and machine learning techniques, algorithms for supervised and unsupervised problems, NLP, computer vision, and GEN AI. You should also have expertise in building deep learning models for text and image analytics using approaches such as ANNs, CNNs, LSTMs, transfer learning, and encoder-decoder architectures. Proficiency in programming languages such as Python and R, common data science tools like NumPy, Pandas, and Matplotlib, and frameworks like TensorFlow, Keras, PyTorch, and XGBoost is required. Experience with statistical inference, hypothesis testing, and cloud platforms like Azure/AWS, as well as deploying models in production, will be beneficial for this role. Excellent communication and interpersonal skills are necessary to convey complex analytical concepts to diverse stakeholders effectively.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
We are looking for an experienced Django Rest Framework Developer to contribute to the enhancement and development of backend API services for a large-scale ERP software platform built on a microservices architecture. Your proficiency in Python, Django, Django Rest Framework, and relational databases such as PostgreSQL will be crucial for this role. The primary responsibility involves designing and implementing efficient, scalable, and secure APIs while leveraging tools like Celery, Kafka, Redis, Django Channels, Pandas, and NumPy. An in-depth understanding of ERP systems is essential, as you will be working on modules that drive business-critical operations.

Key Responsibilities:
- Design, develop, and maintain backend APIs using Django Rest Framework for a large-scale ERP system.
- Architect and implement a microservices architecture to ensure decoupled, scalable, and efficient backend services.
- Integrate PostgreSQL databases for storing ERP data, focusing on data integrity and query optimization.
- Implement background tasks and scheduling using Celery and Celery Beat for managing asynchronous workflows (see the sketch after this listing).
- Leverage Kafka for messaging and event-driven architecture to enable reliable communication between microservices.
- Utilize Redis for caching, session management, and API performance optimization.
- Develop real-time communication features through Django Channels to handle WebSockets and async functionalities.
- Manage data pipelines and perform data transformations using Pandas and NumPy.
- Write clean, maintainable, and well-documented code following security and API design best practices.
- Collaborate with frontend teams, database administrators, and DevOps engineers for seamless deployment and integration of services.
- Troubleshoot and optimize API performance to enhance the efficiency of backend operations.
- Participate in code reviews, testing, and documentation to ensure high-quality software delivery.
- Stay updated on emerging technologies and industry trends relevant to ERP and backend development.

Required Skills & Qualifications:
- 3+ years of backend development experience using Django and Django Rest Framework.
- Strong Python proficiency and familiarity with microservices architecture.
- Extensive experience with PostgreSQL or other relational databases, including optimized query writing and database management.
- Experience in handling asynchronous tasks with Celery and Celery Beat.
- Familiarity with Kafka for building event-driven systems and inter-service communication.
- Expertise in Redis for caching, pub/sub messaging, and system performance enhancement.
- Hands-on experience with Django Channels for real-time communication and WebSocket management.
- Proficiency in Pandas and NumPy for data processing, manipulation, and analysis.
- Understanding of ERP systems and their modules to build relevant APIs efficiently.
- Knowledge of RESTful API design principles, security best practices, and scalability patterns.
- Proficiency in Docker and containerized deployments for both development and production environments.
- Experience with Git and collaborative development workflows.
- Strong problem-solving skills, debugging, and backend issue troubleshooting.
- Experience with CI/CD pipelines for automated testing and deployment.
- Familiarity with Kubernetes for managing containerized applications in production.
- Knowledge of GraphQL for building flexible APIs.
- Previous experience working on ERP software or other large-scale enterprise applications.
Job Type: Full-time
Location Type: In-person
Schedule: Fixed shift
Ability to commute/relocate: Gurugram, Haryana - reliably commute or planning to relocate before starting work (Required)
Application Question(s):
- Have you worked on ERPs before? (Yes/No)
- Have you led backend teams or managed end-to-end backend projects, from architecture to deployment? (Yes/No)
Experience:
- Python: 3 years (Required)
- Django: 3 years (Required)
Work Location: In person
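To illustrate the asynchronous-task pattern this Django Rest Framework role describes, here is a minimal sketch of a Celery task enqueued from a DRF view; the broker URL, task logic, and module layout are hypothetical.

```python
# tasks.py - hypothetical Celery app and task
from celery import Celery

celery_app = Celery("erp", broker="redis://localhost:6379/0")

@celery_app.task
def recalculate_inventory(warehouse_id: int) -> str:
    # Placeholder for a long-running ERP recalculation
    return f"inventory recalculated for warehouse {warehouse_id}"

# views.py - DRF endpoint that enqueues the task instead of blocking the request
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView

class RecalculateInventoryView(APIView):
    def post(self, request, warehouse_id: int):
        # .delay() pushes the job to the broker and returns immediately
        async_result = recalculate_inventory.delay(warehouse_id)
        return Response({"task_id": async_result.id},
                        status=status.HTTP_202_ACCEPTED)
```

Returning 202 with a task ID lets the client poll for completion, which keeps API latency low while Celery workers handle the heavy lifting.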
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
You have solid working experience with the Python-based Django and Flask frameworks, along with expertise in developing microservices-based designs and architectures. Your strong programming knowledge extends to JavaScript, HTML5, Python, RESTful APIs, and gRPC APIs. You have hands-on experience with object-oriented concepts in Python and are familiar with libraries like NumPy, Pandas, Open3D, OpenCV, and Matplotlib. Additionally, you possess knowledge of MySQL, Postgres, and MSSQL databases, as well as 3D geometry.

Your expertise also includes familiarity with SSO/OpenID Connect/OAuth authentication protocols, version control systems like GitHub/BitBucket/GitLab, and continuous integration and continuous deployment (CI/CD) pipelines. You have a basic understanding of image processing, data analysis, and data science, coupled with strong communication skills and the ability to think analytically from various perspectives. As a proactive team player, you are inclined towards providing new ideas, suggestions, solutions, and constructive analysis of your team members' ideas, and you thrive in a fast-paced, Agile software development environment. Good-to-have knowledge includes other programming languages like C and C++, the basics of machine learning, exposure to NoSQL databases, and cloud platforms like GCP/AWS/Azure.

In the area of Software Engineering, you apply scientific methods to analyze and solve software engineering problems, develop and apply software engineering practices and knowledge, and exercise original thought and judgement. You are responsible for supervising the technical and administrative work of other software engineers, while enhancing your skills and expertise within the software engineering discipline. Working collaboratively with other software engineers and stakeholders, you contribute positively to project performance and make informed decisions based on situational understanding.

With more than a year of relevant work experience, you possess a solid understanding of programming concepts, software design, and software development principles. You consistently deliver accurate and reliable results with minimal supervision, work on a variety of tasks and problems, and demonstrate the effective application of your skills and knowledge. By organizing your time efficiently to meet task deadlines, collaborating with team members to achieve common goals, and making decisions based on understanding rather than just rules, you have a direct and positive impact on project performance.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
indore, madhya pradesh
On-site
You should have 2-5 years of experience as a Python developer with a strong command of the Python programming language. Your knowledge should include Flask, Django, and Docker. Good knowledge of interacting with database systems (SQL, NoSQL) and web services (REST) is essential, as is an understanding of Machine Learning concepts.

You must possess expert knowledge of Python and related frameworks such as Django and Flask. Practical experience in Deep Learning using PyTorch or TensorFlow, with a focus on model fine-tuning, is a plus. Using Pandas and Python to manipulate and analyze large datasets to extract actionable insights is a key aspect of the role, and a deep understanding of multi-process architecture and the threading limitations of Python is necessary. A Bachelor's degree in computer science, computer engineering, or a related field is preferred.

If you meet the requirements mentioned above, please send your resume to hrd@5exceptions.com.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Data Analyst at our organization, you will play a vital role in analyzing our ecommerce clickstream and user behavior data to uncover actionable insights that drive business growth and inform our product roadmap. Leveraging your expertise in tools such as Adobe Analytics, GCP BigQuery, SQL, and Python, you will extract, analyze, and visualize complex datasets. Working closely with our Senior Data Analyst/Architect, you will contribute to the strategic direction of our product development efforts.

Your responsibilities will include analyzing ecommerce clickstream data to understand user journeys and optimize website performance, investigating user behavior patterns to enhance customer interactions, and utilizing SQL and Python to extract and manipulate large datasets from our data warehouse. Additionally, you will create insightful dashboards and reports using data visualization tools to effectively communicate findings to stakeholders across different teams.

Collaborating with the Senior Data Analyst/Architect, you will translate data-driven insights into actionable recommendations that contribute to the development and prioritization of our product roadmap. You will define and track key performance indicators (KPIs), conduct A/B testing analysis, and collaborate with cross-functional teams to provide relevant insights. Ensuring data quality and integrity, staying updated on industry trends, and continuously learning best practices in ecommerce analytics will also be key aspects of your role.

To qualify for this position, you should have a Bachelor's degree in a quantitative field and at least 2 years of experience as a Data Analyst, preferably with a focus on ecommerce analytics. Hands-on experience with Adobe Analytics, strong proficiency in SQL, and solid programming skills in Python are essential. Experience with cloud-based data warehouses, data visualization tools, and strong analytical and problem-solving skills are also required. Excellent communication and presentation skills, the ability to work independently and collaboratively in a fast-paced environment, and a passion for data science and ecommerce are important attributes for success in this role.

Key Skills: Adobe Analytics, GCP BigQuery, SQL, Python (Pandas, NumPy, Matplotlib, Seaborn), ecommerce analytics, user behavior analysis, clickstream data analysis, data visualization, data reporting, A/B testing analysis, product roadmap contribution, statistical analysis, data mining, communication (written and verbal), problem-solving, teamwork.

Mandatory Skills: Data Science, E-commerce.
Posted 1 week ago