5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Title: Senior Data Engineer – Data Quality, Ingestion & API Development
Mandatory skill set: Python, PySpark, AWS, Glue, Lambda, CI/CD
Total experience: 8+ years
Relevant experience: 8+ years
Work Location: Trivandrum / Kochi
Candidates from Kerala and Tamil Nadu who are ready to relocate to the above locations are preferred. Candidates must have experience in a lead Data Engineer role.

Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality and validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.

Key Responsibilities
• Data Ingestion Framework:
o Design & Development: Architect, develop, and maintain an end-to-end data ingestion framework that efficiently extracts, transforms, and loads data from diverse sources.
o Framework Optimization: Use AWS services such as AWS Glue, Lambda, EMR, ECS, EC2, and Step Functions to build highly scalable, resilient, and automated data pipelines.
• Data Quality & Validation:
o Validation Processes: Develop and implement automated data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of incoming data.
o Monitoring & Reporting: Establish comprehensive monitoring, logging, and alerting systems to proactively identify and resolve data quality issues.
• API Development:
o Design & Implementation: Architect and develop secure, high-performance APIs to enable seamless integration of data services with external applications and internal systems.
o Documentation & Best Practices: Create thorough API documentation and establish standards for API security, versioning, and performance optimization.
• Collaboration & Agile Practices:
o Cross-Functional Communication: Work closely with business stakeholders, data scientists, and operations teams to understand requirements and translate them into technical solutions.
o Agile Development: Participate in sprint planning, code reviews, and agile ceremonies, while contributing to continuous improvement initiatives and CI/CD pipeline development (using tools like GitLab).

Required Qualifications
• Experience & Technical Skills:
o Professional Background: At least 5 years of relevant experience in data engineering with a strong emphasis on analytical platform development.
o Programming Skills: Proficiency in Python and/or PySpark and SQL for developing ETL processes and handling large-scale data manipulation.
o AWS Expertise: Extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
o Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or Redshift.
o API Development: Proven experience in designing and implementing RESTful APIs and integrating them with external and internal systems.
o CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably GitLab) and Agile development methodologies.
• Soft Skills:
o Strong problem-solving abilities and attention to detail.
o Excellent communication and interpersonal skills with the ability to work independently and collaboratively.
o Capacity to quickly learn and adapt to new technologies and evolving business requirements.

Preferred Qualifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
• Experience with additional AWS services such as Kinesis, Firehose, and SQS.
• Familiarity with data lakehouse architectures and modern data quality frameworks.
• Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.

Interested candidates, please send your resume to: gigin.raj@greenbayit.com
Mobile: 8943011666
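The "automated data quality checks, validation routines, and error-handling mechanisms" responsibility above can be sketched as a small routine. This is an illustrative sketch only: the `validate` function and the `RULES` rule set are hypothetical names, not part of the posting, and a real AWS Glue/Lambda pipeline would apply such rules over Spark DataFrames rather than plain Python dicts.

```python
# Illustrative sketch of automated data quality checks with error handling.
# Plain Python dicts stand in for DataFrame rows for brevity.
from typing import Callable

# Hypothetical rule set: each rule maps a field to a predicate it must satisfy.
RULES: dict[str, Callable[[object], bool]] = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split rows into (valid, rejected); rejected rows carry the failed checks."""
    valid, rejected = [], []
    for row in rows:
        errors = [f for f, ok in RULES.items() if f not in row or not ok(row[f])]
        if errors:
            rejected.append({"row": row, "failed_checks": errors})
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate([
    {"id": 1, "email": "a@b.com", "amount": 10.0},
    {"id": -5, "email": "not-an-email", "amount": 3.0},
])
```

Rejected rows are kept with their failure reasons rather than dropped silently, which is what makes the downstream monitoring and alerting described in the posting possible.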
Posted 3 days ago
3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Company Profile
Our client is a global IT services company with offices in India and the United States that helps businesses with digital transformation. It provides IT collaborations and uses technology, innovation, and enterprise to have a positive impact on the world of business. With expertise in the fields of Data, IoT, AI, Cloud Infrastructure, and SAP, it accelerates digital transformation through key practice areas: IT staffing on demand, and innovation and growth with a focus on cost and problem solving.

Location & Work: New Delhi (On-Site), WFO
Employment Type: Full Time
Profile: AI/ML Engineer
Preferred Experience: 3-5 Years

The Role:
We are seeking a highly skilled AI/ML Engineer with strong expertise in traditional statistical modeling using R and end-to-end ML pipeline configuration on Databricks. The ideal candidate will play a key role in designing, developing, and deploying advanced machine learning models, optimizing performance, and ensuring scalability across large datasets on the Databricks platform.

Responsibilities:
• Design and implement traditional ML models using R (e.g., regression, classification, clustering, time-series).
• Develop and maintain scalable machine learning pipelines on Databricks.
• Configure and manage Databricks workspaces, clusters, and MLflow integrations for model versioning and deployment.
• Collaborate with data engineers, analysts, and domain experts to collect, clean, and prepare data.
• Optimize models for performance, interpretability, and business impact.
• Automate data workflows and model retraining pipelines using Databricks notebooks and job scheduling.
• Monitor model performance in production and implement enhancements as needed.
• Ensure model explainability, compliance, and reproducibility in production environments.

Must-Have Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Minimum 3 years of experience in machine learning and data science roles.
• Strong proficiency in R for statistical modeling and traditional ML techniques.
• Hands-on experience with Databricks: cluster configuration, workspace management, notebook workflows, and performance tuning.
• Experience with MLflow, Delta Lake, and PySpark (optional but preferred).
• Strong understanding of MLOps practices, model lifecycle management, and CI/CD for ML.
• Familiarity with cloud platforms such as Azure Databricks, AWS, or GCP.

Preferred Qualifications:
• Certification in Databricks or relevant ML/AI platforms is a plus.
• Excellent problem-solving and communication skills.

Application Method
Apply online on this portal or by email at careers@speedmart.co.in
Posted 3 days ago
3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Company Profile
Our client is a global IT services company with offices in India and the United States that helps businesses with digital transformation. It provides IT collaborations and uses technology, innovation, and enterprise to have a positive impact on the world of business. With expertise in the fields of Data, IoT, AI, Cloud Infrastructure, and SAP, it accelerates digital transformation through key practice areas: IT staffing on demand, and innovation and growth with a focus on cost and problem solving.

Location & Work: New Delhi (On-Site), WFO
Employment Type: Full Time
Profile: Platform Engineer
Preferred Experience: 3-5 Years

The Role:
We are looking for a highly skilled Platform Engineer to join our infrastructure and data platform team. This role will focus on the integration and support of Posit for data science workloads, managing R language environments, and leveraging Kubernetes to build scalable, reliable, and secure data science infrastructure.

Responsibilities:
• Integrate and manage the Posit Suite (Workbench, Connect, Package Manager) within containerized environments.
• Design and maintain scalable R environment integration (including versioning, dependency management, and environment isolation) for reproducible data science workflows.
• Deploy and orchestrate services using Kubernetes, including Helm-based Posit deployments.
• Automate provisioning, configuration, and scaling of infrastructure using IaC tools (Terraform, Ansible).
• Collaborate with Data Scientists to optimize R runtimes and streamline access to compute resources.
• Implement monitoring, alerting, and logging for Posit components and Kubernetes workloads.
• Ensure platform security and compliance, including authentication (e.g., LDAP, SSO), role-based access control (RBAC), and network policies.
• Support continuous improvement of DevOps pipelines for platform services.
Must-Have Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Minimum 3 years of experience in platform, DevOps, or infrastructure engineering.
• Hands-on experience with Posit (RStudio) products, including deployment, configuration, and user management.
• Proficiency in R integration practices in enterprise environments (e.g., dependency management, version control, reproducibility).
• Strong knowledge of Kubernetes, including Helm, pod security, and autoscaling.
• Experience with containerization tools (Docker, OCI images) and CI/CD pipelines.
• Familiarity with monitoring tools (Prometheus, Grafana) and centralized logging (ELK, Loki).
• Scripting experience in Bash, Python, or similar.

Preferred Qualifications
• Experience with cloud-native Posit deployments on AWS, GCP, or Azure.
• Familiarity with Shiny apps, RMarkdown, and their deployment through Posit Connect.
• Background in data science infrastructure, enabling reproducible workflows across R and Python.
• Exposure to JupyterHub or similar multi-user notebook environments.
• Knowledge of enterprise security controls, such as SSO, OAuth2, and network segmentation.

Application Method
Apply online on this portal or by email at careers@speedmart.co.in
Posted 3 days ago
0 years
0 Lacs
India
On-site
We're looking for an individual with:
• Prior experience building RESTful APIs.
• Experience with at least one backend API framework (preferably FastAPI).
• Hands-on experience with at least one modern ML/AI framework (PyTorch, TensorFlow).
• Experience integrating LLMs into applications (OpenAI, Anthropic, or open-source).
• Strong foundation in deep learning concepts and model training workflows.
• Database expertise: schema design, query optimization, both SQL and NoSQL.
• Experience with vector databases (Pinecone, Weaviate, Chroma) for RAG applications.
• Mathematical aptitude: comfortable with statistics, linear algebra, and algorithmic thinking.

Your daily work would include:
• Working on product features end-to-end (writing backend logic, writing APIs, connecting APIs in the front end, and building a basic functional UI).
• Working with RESTful APIs, SQL, and NoSQL databases.
• Training, fine-tuning, and deploying deep learning models for production use.
• Building robust LLM integration pipelines with proper error handling and fallback mechanisms.
• Implementing MLOps practices: model versioning, A/B testing, monitoring, and automated retraining.
• Working on multiple projects during the tenure across multiple domains.

Note: The working hours are flexible. We are a small team, and most of the work happens asynchronously. Culture-wise, we are looking for people who are excited to learn new things, think creatively, are capable of figuring things out on their own (most of the time), and have a "getting things done" kind of attitude.
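The "LLM integration pipelines with proper error handling and fallback mechanisms" bullet above might look like the following minimal sketch. The `call_with_fallback` helper, `ProviderError` exception, and the stub providers are hypothetical stand-ins, not a real vendor SDK; in practice the callables would wrap OpenAI, Anthropic, or open-source clients.

```python
# Minimal sketch of an LLM call chain with error handling and a fallback.
# Provider callables are stubs standing in for real SDK clients.
from typing import Callable

class ProviderError(Exception):
    """Raised by a provider when a completion attempt fails."""

def call_with_fallback(prompt: str,
                       providers: list[tuple[str, Callable[[str], str]]],
                       default: str = "Sorry, no model is available right now.") -> str:
    """Try each provider in order; fall back to a canned reply if all fail."""
    for name, complete in providers:
        try:
            return complete(prompt)
        except ProviderError:
            continue  # a real system would also log the failure and provider name
    return default

def flaky(prompt: str) -> str:
    raise ProviderError("rate limited")

def stable(prompt: str) -> str:
    return f"echo: {prompt}"

answer = call_with_fallback("hello", [("primary", flaky), ("backup", stable)])
```

Catching only `ProviderError` (rather than bare `except`) keeps genuine bugs visible while still degrading gracefully when a provider is down or rate limited.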
Posted 4 days ago
6.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior E-learning Storyline Developer
Company: Bell Immersive Technologies, driving transformation in corporate learning through immersive, interactive e-learning solutions.
Location: Remote (India-based candidates preferred)
Experience Required: 4 to 6 years in e-learning development, with strong hands-on experience in Articulate Storyline 360

Overall Role Objective
We're seeking a Senior E-learning Storyline Developer who blends creative design with technical e-learning development expertise. The ideal candidate will be responsible for creating engaging, interactive, and instructionally sound e-learning modules. You'll work closely with the Instructional Design team, Project Managers, and SMEs to bring training content to life for corporate learners across domains.

Key Responsibilities
• Content Development: Develop high-quality e-learning modules, assessments, and simulations aligned with learning objectives. Use multimedia elements such as graphics, animations, videos, voiceovers, and interactivities to create immersive content.
• Instructional Design Application: Collaborate with Instructional Designers to ensure sound pedagogy and logical flow in course structure. Apply adult learning principles and modern instructional frameworks to enhance learner engagement.
• Tools & Technology: Expertly use Articulate Storyline 360 (mandatory), Rise, and Adobe Creative Suite. Basic familiarity with tools like Camtasia, Vyond, and LMS platforms is an advantage. Stay updated with emerging trends in e-learning and recommend suitable tools/techniques.
• Collaboration & Stakeholder Communication: Work with SMEs, trainers, and project stakeholders to understand content needs and project scope. Provide input during storyboard reviews and adapt content for technical feasibility.
• Quality Assurance: Review, test, and troubleshoot courses for cross-browser/device compatibility, navigation issues, and SCORM/LMS compliance. Ensure consistency with design guidelines, accessibility standards, and user experience best practices.
• Project Ownership: Independently manage multiple e-learning development projects. Track timelines and ensure delivery within agreed schedules and scope. Communicate progress updates and blockers to project teams proactively.
• Feedback & Iteration: Integrate feedback from reviewers and learners to continually improve content. Support content versioning and change tracking for iterative delivery.

Required Skills & Qualifications
• 4-6 years of experience as an E-learning Developer with a strong focus on Articulate Storyline 360
• Proven ability to transform instructional content into high-impact interactive learning
• Ability to manage multiple projects and meet tight deadlines
• Excellent attention to detail, especially for user experience, visual alignment, and interactivity
• Strong communication and collaboration skills

Work Hours & Flexibility
Remote work setup with flexible hours
Posted 4 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years (Overall), 5+ years (Relevant)

🔧 Primary Skills
• Python
• Spark (PySpark)
• SQL
• Delta Lake

📌 Key Responsibilities & Skills
• Strong understanding of Spark core: RDDs, DataFrames, Datasets, Spark SQL, Spark Streaming
• Proficiency in Delta Lake features: time travel, schema evolution, data partitioning
• Experience designing and building data pipelines using Spark and Delta Lake
• Solid experience in Python/Scala/Java for Spark development
• Knowledge of data ingestion from files, APIs, and databases
• Familiarity with data validation and quality best practices
• Working knowledge of data warehouse concepts and data modeling
• Hands-on experience with Git for code versioning
• Exposure to CI/CD pipelines and containerization tools
• Nice to have: experience with ETL tools such as DataStage, Prophecy, Informatica, or Ab Initio
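Two of the Delta Lake features named in this posting, time travel (reading a table as of an earlier version) and schema evolution (new columns widening the schema on write), can be illustrated with a toy in-memory class. This is only a conceptual sketch: the `VersionedTable` name is invented for illustration, and Delta Lake itself implements these features with a transaction log over Parquet files, not an in-memory list.

```python
# Toy illustration of Delta Lake-style "time travel" and "schema evolution".
from typing import Optional

class VersionedTable:
    def __init__(self) -> None:
        self._versions: list[list[dict]] = [[]]  # version 0 is the empty table

    @property
    def schema(self) -> set:
        """Union of column names seen in the latest version."""
        keys: set = set()
        for row in self._versions[-1]:
            keys |= row.keys()
        return keys

    def append(self, rows: list[dict]) -> int:
        """Commit an append; unseen columns simply widen the schema."""
        snapshot = [dict(r) for r in self._versions[-1]] + [dict(r) for r in rows]
        self._versions.append(snapshot)
        return len(self._versions) - 1  # the new version number

    def read(self, version: Optional[int] = None) -> list:
        """Read the latest snapshot, or 'time travel' to an earlier version."""
        return self._versions[-1 if version is None else version]

t = VersionedTable()
v1 = t.append([{"id": 1}])                   # schema: {id}
v2 = t.append([{"id": 2, "country": "IN"}])  # schema evolves to {id, country}
```

Each commit producing a new immutable snapshot is the key idea: older versions stay readable, which is what makes reproducible pipelines and audits possible.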
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Haveli, Maharashtra, India
On-site
Tech Lead - .Net Core with React
Job Date: May 16, 2025
Job Requisition Id: 61036
Location: Pune, IN; Hyderabad, TG, IN; Indore, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future. We are looking to hire .Net Core professionals in the following areas:
Experience: 8-12 years
Job Description
Mandatory Technology Skills:
Should have 8 to 12 years of industry experience.
Proven experience as a .NET Lead; able to manage a team and resolve its issues.
Expert in ASP.NET Framework 4.8, MS .NET 5.0/6.0/7.0, C#, SQL Server, and design/architectural patterns (e.g., Model-View-Controller (MVC)), NodeJS, state management, unit testing, and authentication & authorization.
Good experience in databases and API development (RESTful, SOAP, web services, GraphQL, microservices).
Experience in UI technologies such as HTML5, CSS3, JavaScript, jQuery, and React.js.
Working knowledge of the Azure development/deployment environment, e.g., Azure Functions, App Services, Blob Storage, Queues, Event Hubs.
Working knowledge of containerized applications: Docker, Kubernetes, AKS, ACR, etc.
Good debugging skills and good knowledge of OOPS.
Proficient understanding of code versioning tools: TFS / Git / Azure DevOps.
Familiarity with architecture styles/APIs (REST, RPC).
Excellent troubleshooting and communication skills.
Write clean, scalable code using .NET programming languages.
Ability to understand and adhere to the application architecture and design.
Deployment knowledge: IIS, port mapping, routing.
Experience with Agile development, Scrum, or Extreme Programming methodologies.
Other Mandatory Aspects:
Early/immediate joiners are welcome.
Should be able to complete assigned tasks on time.
Personal Skills:
Good communication skills (articulation using verbal & non-verbal skills, clarity of thought).
Attention to detail.
Integrity and a stretch mindset.
Ownership of work and the ability to work independently.
Flexibility and a teamwork mindset.
Strong analytical thinking and problem-solving skills. Ensures quality and timely delivery.
Required Technical/Functional Competencies
Requirement Gathering and Analysis: Extract requirements for complex scenarios and prototype independently. Identify impacted modules/features/functionalities and provide high-level estimates. Develop a traceability matrix and identify transition requirements.
Application Design: Good knowledge of design principles and performance engineering concepts. Able to create UI/design and business-logic elements, navigation, screen flow, and layout based on applicable criteria and constraints. Identify and apply design standards following applicable criteria and constraints.
Architecture Tools and Frameworks: Familiarity with industry tools and frameworks; analyze and use them based on customer requirements. Work with SMEs to explore and implement new tools/frameworks.
Estimation and Resource Planning: Identify and assign the resources required to complete tasks. Use appropriate estimation models for medium-to-high-complexity scenarios. Track and report gaps between budgeted and actual spending.
Product/Technology Knowledge: Implement code or configure/customize products, drive adoption of industry standards and practices, and contribute to the development of reusable assets and innovative solutions. Analyze frameworks/tools and present recommendations, develop training and certification material, and demonstrate thought leadership through whitepapers and webinars.
Test Management: Create iteration and system-integration test plans. Develop and review test cases, conduct unit testing, define metrics, and support testing processes. Able to conduct RCA, verify system builds and test environments, and create business-scenario test cases/automation test scripts.
Customer Management: Use the latest technology, communicate effectively, demonstrate leadership, present technical offerings, and proactively suggest solutions.
Project Management: Working knowledge of project management processes, tools, and templates. Execute medium projects effectively; create/review milestone/metric reports, project status, and closure reports; create continuous quality-improvement plans; and provide inputs for organization-wide process assets.
Domain/Industry Knowledge: Apply industry standards and practices, independently creating complex business models in line with customer requirements. Analyze current-state and define to-be processes in collaboration with SMEs; present recommendations with tangible benefits. Drive process-improvement initiatives and ROI analysis through innovation.
Marketing: Basic knowledge of marketing; understand market trends and conduct market research. Source relevant market data and prepare reports. Write blogs and participate in external forums.
Pre-Sales: Good knowledge of the bid process and understanding of RFPs/RFIs. Prepare response documents for medium-scale bids. Work with the sales team to ensure successful closure of the sales process. Attend to customer requests for information on RFIs and assist the technical team with sales enquiries.
Required Behavioral Competencies
Accountability: Takes responsibility for and ensures the accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.
Certifications: Mandatory
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and an ethical corporate culture
Posted 4 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Profile Requirements
Experience as a full stack / back end developer with expertise in Python, Node.js, Java, or PHP.
Understanding of data analytics, web analytics, and key performance indicators.
Strong unit testing and debugging skills.
Good grasp of data structures and algorithms.
Expertise in AWS, GCP, Azure, or any cloud provider.
Proficient understanding of code versioning tools such as GitHub / GitLab.
Understanding of the fundamental design principles behind a scalable application.
Knowledge of JavaScript, jQuery, HTML, CSS, AngularJS, or ReactJS.
Should have worked on Django, Flask, Spring, Hibernate, or CodeIgniter.
Proficiency in Python or PHP software development on Linux or Windows platforms.
Should be able to work with minimal supervision.
Responsibilities and Duties
Write reusable, testable, and efficient code.
Implement security and data protection.
Design and implement data storage solutions such as databases.
Explore design implications and work towards an appropriate balance between functionality and performance.
Work with a cross-discipline team spanning user experience, visual design, project management, development, and testing.
Write white papers documenting the findings, learnings, and challenges encountered during projects.
Lead a team, resolve issues, and effectively manage employee discipline.
Play a key role in recruitment and campus drives.
Required Experience, Skills and Qualifications
Bachelors/Masters in Computer Science or Electronics.
Hands-on experience in the MEAN stack or in: data structures, algorithms, Python, Django, React.js.
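The "reusable, testable" requirement above is concrete enough to sketch: write small pure functions and pin their behavior with unit tests. A minimal example using Python's built-in `unittest` module (the `normalize_email` helper is a hypothetical function invented for illustration):

```python
import unittest

def normalize_email(raw: str) -> str:
    """Hypothetical helper: trim whitespace and lowercase an email
    so the same address always compares equal."""
    return raw.strip().lower()

class NormalizeEmailTest(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_email("  Alice@Example.COM "),
                         "alice@example.com")

    def test_idempotent(self):
        # Normalizing twice must give the same result as once,
        # so the helper is safe to call at any layer.
        once = normalize_email("Bob@example.com")
        self.assertEqual(normalize_email(once), once)

# Run the suite programmatically (instead of unittest.main(),
# which would exit the process).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeEmailTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Tests like these double as executable documentation of the function's contract, which is what makes the code reusable by the next developer.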
Posted 4 days ago
3.0 - 4.0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Job Title: Data Scientist - Computer Vision & Generative AI
Location: Mumbai
Experience Level: 3 to 4 years
Employment Type: Full-time
Industry: Renewable Energy / Solar Services
Job Overview
We are seeking a talented and motivated Data Scientist with a strong focus on computer vision, generative AI, and machine learning to join our growing team in the solar services sector. You will play a pivotal role in building AI-driven solutions that transform how solar infrastructure is analyzed, monitored, and optimized using image-based intelligence. From drone and satellite imagery to on-ground inspection photos, your work will enable intelligent automation, predictive analytics, and visual understanding in critical areas like fault detection, panel degradation, site monitoring, and more. If you're passionate about working at the cutting edge of AI for real-world sustainability impact, we'd love to hear from you.
Key Responsibilities
Design, develop, and deploy computer vision models for tasks such as object detection, classification, segmentation, and anomaly detection.
Work with generative AI techniques (e.g., GANs, diffusion models) to simulate environmental conditions, enhance datasets, or create synthetic training data.
Build ML pipelines for end-to-end model training, validation, and deployment using Python and modern ML frameworks.
Analyze drone, satellite, and on-site images to extract meaningful insights for solar panel performance, wear-and-tear detection, and layout optimization.
Collaborate with cross-functional teams (engineering, field ops, product) to understand business needs and translate them into scalable AI solutions.
Continuously experiment with the latest models, frameworks, and techniques to improve model performance and robustness.
Optimize image pipelines for performance, scalability, and edge/cloud deployment.
Key Requirements
3-4 years of hands-on experience in data science, with a strong portfolio of computer vision and ML projects.
Proven expertise in Python and common data science libraries: NumPy, Pandas, Scikit-learn, etc.
Proficiency with image-based AI frameworks: OpenCV, PyTorch or TensorFlow, Detectron2, YOLOv5/v8, MMDetection, etc.
Experience with generative AI models like GANs, Stable Diffusion, or ControlNet for image generation or augmentation.
Experience building and deploying ML models using MLflow, TorchServe, or TensorFlow Serving.
Familiarity with image annotation tools (e.g., CVAT, Labelbox) and data versioning tools (e.g., DVC).
Experience with cloud platforms (AWS, GCP, or Azure) for storage, training, or model deployment.
Experience with Docker, Git, and CI/CD pipelines for reproducible ML workflows.
Ability to write clean, modular code and a solid understanding of software engineering best practices in AI/ML projects.
Strong problem-solving skills, curiosity, and the ability to work independently in a fast-paced environment.
Bonus / Preferred Skills
Experience with remote sensing and working with satellite or drone imagery.
Exposure to MLOps practices and tools like Kubeflow, Airflow, or SageMaker Pipelines.
Knowledge of solar technologies, photovoltaic systems, or renewable energy is a plus.
Familiarity with edge computing for vision applications on IoT devices or drones.
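Object detection work of the kind described above (locating panel defects in imagery) is almost universally evaluated with intersection over union (IoU): how much a predicted bounding box overlaps the ground-truth box. A minimal, dependency-free sketch:

```python
def iou(box_a, box_b):
    """Intersection over Union for axis-aligned boxes given as
    (x1, y1, x2, y2). Scores how well a predicted defect box
    overlaps the annotated ground truth: 1.0 is a perfect match,
    0.0 means no overlap."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle; width/height clamp to zero when the
    # boxes are disjoint.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    # Union = sum of areas minus the doubly counted intersection.
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (0, 0, 2, 2)))  # 1.0
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```

Detection frameworks such as Detectron2 or the YOLO family use exactly this quantity to match predictions to labels and to threshold non-maximum suppression; a typical "correct detection" cutoff is IoU ≥ 0.5.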
Posted 4 days ago
2.0 years
0 Lacs
Greater Chennai Area
On-site
Why CDM Smith? Check out this video and find out why our team loves to work here! Join Us! CDM Smith – where amazing career journeys unfold. Imagine a place committed to offering an unmatched employee experience. Where you work on projects that are meaningful to you. Where you play an active part in shaping your career journey. Where your co-workers are invested in you and your success. Where you are encouraged and supported to do your very best and given the tools and resources to do so. Where it’s a priority that the company takes good care of you and your family. Our employees are the heart of our company. As an employer of choice, our goal is to provide a challenging, progressive and inclusive work environment which fosters personal leadership, career growth and development for every employee. We value passionate individuals who challenge the norm, deliver world-class solutions and bring diverse perspectives. Join our team, and together we will make a difference and change the world. Job Description CDM Smith is seeking an Artificial Intelligence/Machine Learning Engineer to join our Digital Engineering Solutions team. This individual will be part of the Data Technology group within the Digital Engineering Solutions team, helping to drive strategic Architecture, Engineering and Construction (AEC) initiatives using cutting-edge data technologies and analytics to deliver actionable business insights and robust solutions for AEC professionals and client outcomes. The Data Technology group will lead the firm in AEC-focused Business Intelligence and data services by providing architectural guidance, technological vision, and solution development. The Data Technology group will specifically utilize advanced analytics, data science, and AI/ML to give our business and our products a competitive advantage. It includes understanding and managing the data, how it interconnects, and architecting & engineering data for self-serve BI and BA opportunities. 
This position is for a person who has demonstrated excellence in AI/ML engineering capabilities, is experienced with data technology and processes, and enjoys framing a problem, shaping and creating solutions, and helping to lead and champion implementation. As a member of the Digital Engineering Solutions team, the Data Technology group will also engage in research and development and provide guidance and oversight to the AEC practices at CDM Smith, engaging in new product research, testing, and the incubation of data technology-related ideas that arise from around the company.
Key Responsibilities
Contributes to advanced analytics and applies artificial intelligence (AI) and machine learning (ML) techniques to address complex business challenges, particularly within the AEC domain.
Applies state-of-the-art algorithms and techniques such as deep learning, NLP, computer vision, and time-series analysis for domain-specific use cases.
Analyzes large datasets to identify patterns and trends.
Participates in the testing and validation of AI model accuracy and reliability to ensure models perform in line with business requirements and expectations.
Assists with AI/ML workflow optimization by implementing MLOps practices, including CI/CD pipelines, model retraining, and version control.
Collaborates with Data Engineers, Data Scientists, and other stakeholders to design and implement end-to-end AI/ML solutions.
Stays abreast of the latest developments and advancements, including new and emerging technologies, best practices, tools, and software applications, and how they could impact CDM Smith.
Assists with the development of documentation, standards, best practices, and workflows for data technology hardware/software in use across the business.
Performs other duties as required.
Skills and Abilities
Good understanding of the software development life cycle.
Basic experience building and deploying machine learning models using frameworks such as TensorFlow, PyTorch, or Scikit-learn.
Basic experience with cloud-based AI/ML services, particularly Microsoft Azure and Databricks.
Basic experience with programming languages (e.g., R, Python, Scala).
Knowledge of MLOps practices, including automated pipelines, model versioning, monitoring, and lifecycle management.
Knowledge of data privacy, security, and ethical AI principles, ensuring compliance with relevant standards.
Excellent problem-solving and critical thinking skills to identify and address technical challenges effectively.
Strong critical thinking skills to generate innovative solutions and improve business processes.
Ability to effectively communicate complex technical concepts to both technical and non-technical audiences.
Detail-oriented, with the ability to assist with executing highly complex or specialized projects.
Minimum Qualifications
Bachelor’s degree. 0-2 years of related experience. Equivalent additional related experience will be considered in lieu of a degree.
Amount of Travel Required: 0%
Background Check and Drug Testing Information
CDM Smith Inc. and its divisions and subsidiaries (hereafter collectively referred to as “CDM Smith”) reserve the right to require background checks including criminal, employment, education, licensure, etc., as well as credit and motor vehicle checks when applicable for certain positions. In addition, CDM Smith may conduct drug testing for designated positions. Background checks are conducted after an offer of employment has been made in the United States. The timing of when background checks will be conducted on candidates for positions outside the United States will vary based on country statutory law, but in no case will the background check precede an interview.
CDM Smith will conduct interviews of qualified individuals prior to requesting a criminal background check, and no job application submitted prior to such interview shall inquire into an applicant's criminal history. If this position is subject to a background check for any convictions related to its responsibilities and requirements, employment will be contingent upon successful completion of a background investigation including criminal history. Criminal history will not automatically disqualify a candidate. In addition, during employment, individuals may be required by CDM Smith or a CDM Smith client to successfully complete additional background checks, including motor vehicle record checks, as well as drug testing.
Agency Disclaimer
All vendors must have a signed CDM Smith Placement Agreement from the CDM Smith Recruitment Center Manager to receive payment for your placement. Verbal or written commitments from any other member of the CDM Smith staff will not be considered binding terms. All unsolicited resumes sent to CDM Smith and any resume submitted to any employee outside the CDM Smith Recruiting Center Team (RCT) will be considered property of CDM Smith. CDM Smith will not be held liable to pay a placement fee.
Business Unit: COR
Group: COR
Assignment Category: Fulltime-Regular
Employment Type: Regular
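The model-versioning MLOps practice named in this listing's responsibilities boils down to one invariant: every trained artifact gets a stable, reproducible identity so any past model can be retrieved for rollback or audit. A minimal content-addressed sketch of that idea (`ModelRegistry` is a hypothetical illustration, not the API of any real MLOps tool):

```python
import hashlib

class ModelRegistry:
    """Minimal sketch of model versioning: each registered artifact
    is identified by a hash of its bytes, so a retrained model gets
    a new version ID while every past version remains retrievable."""

    def __init__(self):
        self._store = {}    # digest -> (metadata, artifact bytes)
        self._history = []  # digests in registration order

    def register(self, artifact: bytes, metadata: dict) -> str:
        digest = hashlib.sha256(artifact).hexdigest()[:12]
        if digest not in self._store:
            # Identical bytes re-registered simply return the same
            # ID; the version history is not duplicated.
            self._store[digest] = (metadata, artifact)
            self._history.append(digest)
        return digest

    def latest(self) -> str:
        return self._history[-1]

    def metadata(self, digest: str) -> dict:
        return self._store[digest][0]

registry = ModelRegistry()
v1 = registry.register(b"weights-epoch-10", {"metric_auc": 0.91})
v2 = registry.register(b"weights-epoch-20", {"metric_auc": 0.93})
print(registry.latest() == v2)              # True
print(registry.metadata(v1)["metric_auc"])  # 0.91
```

Tools like MLflow's model registry layer run tracking, stage labels (staging/production), and artifact storage on top of essentially this mapping from version ID to artifact plus metadata.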
Posted 4 days ago
The versioning job market in India is currently thriving with numerous opportunities for skilled professionals. Versioning plays a crucial role in software development, ensuring that code changes are tracked, managed, and deployed efficiently. Job seekers in India looking to pursue a career in versioning can find a variety of roles across different industries.
The average salary range for versioning professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 3-5 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the field of versioning, a typical career path may include roles such as:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
Apart from expertise in versioning tools like Git, professionals in this field may also be expected to have knowledge and experience in:
- Continuous Integration/Continuous Deployment (CI/CD)
- DevOps practices
- Programming languages like Python, Java, or JavaScript
- Cloud computing platforms like AWS or Azure
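It helps in interviews to know what Git actually does under the hood: it tracks versions by content addressing, identifying every file snapshot ("blob") by the SHA-1 of a small `blob <size>\0` header followed by the file's bytes. A short sketch of that scheme in plain Python:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """Compute the ID Git assigns to a file's contents: SHA-1 over
    a "blob <size>\\0" header plus the raw bytes. Identical content
    always maps to the same ID, which is how Git stores thousands
    of versions without duplicating unchanged files."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches what `git hash-object` reports for the same bytes.
print(git_blob_hash(b"hello\n"))
```

Because the ID is derived purely from content, two commits that share an unchanged file share the same blob object, and any corruption of history changes the hashes downstream — the property that makes Git's history tamper-evident.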
As you navigate the versioning job market in India, remember to continuously upskill, practice your technical knowledge, and showcase your expertise confidently during interviews. With determination and preparation, you can excel in your versioning career and secure exciting opportunities in the industry. Good luck!