Riktam Technology Consulting focuses on providing solutions in software development, IT consulting, and technology strategy.
Chennai
INR 0.9 - 1.75 Lacs P.A.
Remote
Full Time
Job Title: Machine Learning Engineer
Location: Remote | Full-Time

We are seeking a Machine Learning Engineer to design and deploy scalable ML pipelines using Azure and Snowflake. You will collaborate with data scientists, productionize models, and manage full ML lifecycles in cloud environments.

Key Responsibilities:
- Build end-to-end ML pipelines: data ingestion, training, validation, deployment
- Collaborate with data scientists to integrate models into production
- Use Azure ML, Data Factory, and Snowflake for pipeline orchestration
- Optimize ML models for performance and scalability
- Monitor, maintain, and improve deployed models
- Write clean, maintainable Python code and follow best practices
- Review code and mentor junior team members

Must-Have Skills:
- Strong experience building and deploying ML pipelines
- Proficiency in Azure ML, ADF, and Snowflake
- Solid Python skills with ML libraries (scikit-learn, TensorFlow, etc.)
- Understanding of ML algorithms, evaluation, and tuning
- Familiarity with Git, CI/CD, and cloud-based workflows

Preferred Qualifications:
- Experience with MLOps tools and practices
- Knowledge of Docker, Kubernetes
- Excellent problem-solving and communication skills
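As a rough illustration of the end-to-end pipeline work described above (ingestion, training, validation, deployment), here is a minimal Python sketch assuming scikit-learn and joblib; the dataset, model choice, and output path are placeholders, not part of the role's actual stack:

# Illustrative only: a minimal train/validate/persist flow of the kind the role describes.
# Dataset, model, and output file are assumptions for the sketch.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
import joblib

X, y = load_breast_cancer(return_X_y=True)            # stand-in for ingested feature data
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),                       # preprocessing step
    ("model", LogisticRegression(max_iter=1000)),      # training step
])
pipeline.fit(X_train, y_train)

print("validation accuracy:", accuracy_score(y_val, pipeline.predict(X_val)))
joblib.dump(pipeline, "model.joblib")                  # artifact handed off to deployment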
Noida, Gurugram, Delhi / NCR
INR 8.5 - 18.0 Lacs P.A.
Work from Office
Full Time
Please attach your updated resume if you are interested in the role below.

Job Title: Associate / Data Analyst / Sr. Data Analyst
Function: Data & Analytics
Location: Gurgaon, DLF Cyber Hub
Experience Required: 3 to 5 years for Sr. Data Analyst

About the Role
We are seeking highly motivated individuals for the position of Associate Data Analyst / Data Analyst / Sr. Data Analyst to join our global Data & Analytics team. You will work on high-impact projects, collaborating with cross-functional departments such as Marketing, Merchandising, Technology, and Business Units to deliver analytical solutions that drive business performance. This is an excellent opportunity to work in a fast-paced, data-driven environment and make a measurable impact through insights and automation.

Key Responsibilities
Data & Insights
- Clean, process, and analyze large datasets to ensure data quality and consistency
- Identify trends and patterns to provide actionable business insights
- Create business requirement documents, use cases, user stories, and test cases for analytics projects
- Design and build impactful dashboards and data visualizations using BI tools
- Provide ad-hoc data analysis and reports for leadership

Operational Excellence
- Improve data quality through automation and validation tools
- Develop user-centric dashboards and analytical solutions
- Benchmark industry practices to create business performance improvement strategies

Stakeholder Management
- Collaborate with data engineers, consultants, and cross-functional teams throughout the analytics lifecycle
- Communicate insights and complex data concepts to non-technical stakeholders
- Prepare and maintain comprehensive documentation of analytics solutions
- Share key learnings across teams to enhance overall data maturity

Desired Candidate Profile
Education
- Bachelor's degree in Computer Science, Information Management, Statistics, or a related technical field

Technical Skills
- Strong knowledge of SQL, Python, and PySpark
- Experience with data visualization tools such as Power BI, Tableau, and Alteryx
- Understanding of statistical modeling and time series analysis
- Experience with relational (MySQL, MS SQL) and non-relational (MongoDB, DynamoDB) databases
- Familiarity with cloud platforms (Azure, AWS, GCP) for analytics

Soft Skills
- Strong analytical and problem-solving skills
- Excellent communication and stakeholder management
- Agile and innovative mindset
- Focus on delivery excellence
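For illustration of the clean-and-aggregate work listed under Data & Insights, a minimal pandas sketch; the CSV path and column names (order_id, region, revenue) are hypothetical:

# Illustrative sketch only: basic cleaning and aggregation feeding a dashboard extract.
# File names and columns are hypothetical assumptions.
import pandas as pd

df = pd.read_csv("orders.csv")                          # hypothetical raw extract
df = df.drop_duplicates(subset="order_id")              # basic data-quality step
df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce")
df = df.dropna(subset=["revenue", "region"])

summary = (
    df.groupby("region", as_index=False)["revenue"]
      .sum()
      .sort_values("revenue", ascending=False)
)
summary.to_csv("revenue_by_region.csv", index=False)    # output consumed by a BI tool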
New Delhi, Gurugram, Greater Noida
INR 22.5 - 35.0 Lacs P.A.
Work from Office
Full Time
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 7–9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
- Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
- Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
- Solid grasp of data governance, metadata tagging, and role-based access control
- Proven ability to mentor and grow engineers in a matrixed or global environment
- Strong verbal and written communication skills, with the ability to operate cross-functionally
- Certifications in Azure, Databricks, or Snowflake are a plus

Preferred Skills:
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
- Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools
- Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
- Hands-on experience with databases and tools such as Azure SQL DB, Snowflake, MySQL, Cosmos DB, Blob Storage, and Python/Unix shell scripting
- ADF, Databricks, and Azure certifications are a plus

Technologies We Use:
Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI

Responsibilities:
- Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
- Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines)
- Architect data models and reusable layers consumed by multiple downstream pods
- Guide platform-wide patterns such as parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
- Mentor and coach team members
- Partner with product and platform leaders to ensure engineering consistency and delivery excellence
- Act as an L3 escalation point for operational data issues impacting foundational pipelines
- Own engineering best practices, sprint planning, and quality across the Enablement pod
- Contribute to platform discussions and architectural decisions across regions
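To make the parameterization and pipeline-pattern expectations above concrete, a minimal PySpark sketch of a parameterized ingest-transform-write step; the arguments, paths, table name, and audit column are assumptions, not the team's actual framework:

# Illustrative sketch: parameterized ingest -> light transform -> write, with a simple audit column.
# Argument names, paths, and table names are assumptions for the example.
import argparse
from pyspark.sql import SparkSession, functions as F

parser = argparse.ArgumentParser()
parser.add_argument("--input-path", required=True)
parser.add_argument("--output-table", required=True)
args = parser.parse_args()

spark = SparkSession.builder.appName("enablement-ingest").getOrCreate()

df = spark.read.parquet(args.input_path)
cleaned = (
    df.dropDuplicates()
      .withColumn("load_date", F.current_date())        # simple auditability column
)
cleaned.write.mode("overwrite").saveAsTable(args.output_table)
spark.stop()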
Noida, Gurugram, Delhi / NCR
INR 22.5 - 27.5 Lacs P.A.
Work from Office
Full Time
Key Responsibilities
Analytics (Data & Insights):
- Analyze category and activity performance using advanced analytics methods
- Deliver actionable insights using transactional, financial, and customer data
- Design and measure experiments and pilots
- Lead projects in clustering, forecasting, and causal impact analysis
- Build intuitive dashboards to represent insights effectively

Operational Excellence:
- Improve data quality with automation tools
- Develop analytical solutions and dashboards using user-centric design
- Benchmark against industry standards and enhance business performance

Stakeholder Management:
- Collaborate with consultants, engineers, and cross-functional teams
- Communicate complex insights in an understandable format
- Create clear documentation linking business needs to data solutions
- Promote a data-driven culture and share learnings across teams

Qualifications & Experience
Education:
- Bachelor's degree in Finance, Mathematics, Statistics, Engineering, or a related analytical discipline

Experience:
- 7+ years in a data analytics or quantitative role
- Experience with Python, SQL, Spark, and handling large data volumes
- Experience leading projects or small teams
- Strong communication skills (English)

Behavioral Competencies:
- Delivery excellence
- Innovation and agility
- Business acumen
- Social intelligence

Technical Knowledge & Tools
- Retail, Supply Chain, Marketing, and Customer Analytics
- Statistical Modeling & Time Series Analysis
- Python, PySpark, R
- MySQL, Microsoft SQL Server
- Power BI
- Azure, AWS, GCP
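As a small illustration of the clustering work named above, a Python sketch using scikit-learn; the synthetic features and the choice of k are assumptions for the example only:

# Illustrative clustering sketch; data is synthetic and k=3 is an arbitrary assumption.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 4))                    # stand-in for transactional features

scaled = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

print("cluster sizes:", np.bincount(labels))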
Noida, Gurugram, Delhi / NCR
INR 20.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Job Requirements
Education:
- Bachelor's degree (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
- Master's degree preferred (MBA/MS/M.Tech in Computer Science or a related field)

Experience:
- 5–7 years in a Data Science / Advanced Analytics role

Behavioral Skills:
- Delivery Excellence
- Business Orientation
- Social Intelligence
- Innovation and Agility

Knowledge & Technical Skills:
- Functional analytics experience (Supply Chain, Marketing, Customer Analytics, etc.)
- Statistical modeling using tools such as R, Python, and KNIME
- Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
- Experience building and evaluating machine learning models
- MLOps tools and practices (MLflow, DVC, Docker, etc.)
- Strong Python programming (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
- Experience with big data technologies (AWS, Azure, GCP, Hadoop, Spark)
- Familiarity with relational (MySQL, SQL Server) and non-relational (MongoDB, DynamoDB) databases
- BI and reporting tools (Power BI, Tableau, Alteryx)
- Proficiency with Microsoft Office applications (especially Excel)

Roles & Responsibilities
Analytics & Strategy:
- Analyze large-scale structured and unstructured data to develop insights and machine learning models across various business domains
- Apply statistical and machine learning techniques to generate value from operational, financial, and customer data
- Recommend best-fit algorithms and models with clear justifications for business use
- Leverage cloud platforms for modeling and big data analysis; utilize data visualization tools to communicate results

Operational Excellence:
- Follow industry-standard coding practices and development lifecycles
- Formulate hypotheses, develop analytics frameworks, and bring structure to complex problems
- Collaborate with Data Engineering to maintain core data infrastructure and automate analytical processes

Stakeholder Engagement:
- Work cross-functionally with business stakeholders, engineers, and visualization experts to deliver impactful projects
- Communicate complex models and results to non-technical stakeholders in a clear and compelling way
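As an illustration of the A/B testing and hypothesis testing listed above, a short Python sketch; statsmodels is an assumed library choice and the conversion counts are made-up numbers:

# Illustrative A/B test readout; counts are invented and the two-proportion z-test
# (via statsmodels) is just one reasonable choice, not a prescribed tool.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]       # control, variant (hypothetical)
exposures = [10000, 10000]     # users exposed in each arm (hypothetical)

stat, p_value = proportions_ztest(conversions, exposures)
print(f"z = {stat:.2f}, p = {p_value:.4f}")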
Hyderabad
INR 1.0 - 1.75 Lacs P.A.
Remote
Full Time
Job Title: Senior Full Stack Developer
Job Location: Remote
Job Type: Contract (9 months)

Top Skills: Python, JavaScript, Python ML frameworks, AWS, Kubernetes (K8s). Familiarity with Rust/Svelte is a bonus.

Job Description:
We're seeking a mid-level to senior full-stack developer with strong expertise in Python and JavaScript (4-6 years of experience). The ideal candidate should be comfortable using modern AI tools to speed up development workflows.

Key job requirements include:
- Async-first Python development
- Strong JavaScript skills
- Solid understanding of Python ML frameworks
- Experience with AWS and Kubernetes (K8s)
- Ability to balance speed with long-term maintainability
- Familiarity with Rust/Svelte is a bonus

Interview Rounds:
1. AI Interview Round conducted by Zinterview
2. Technical Round 1
3. Technical Round with Client
4. HR Round

Thanks & Regards
Sachin Kumar Bhosle
Senior Talent Acquisition Lead
Riktam Technology Consulting Pvt. Ltd.
Tel. +91 9966970970 (India)
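To illustrate the async-first Python requirement above, a minimal asyncio sketch; the coroutine and names are placeholders standing in for real I/O-bound calls (HTTP, database), not part of the actual role:

# Illustrative async-first sketch; fetch() stands in for awaited network or DB calls.
import asyncio

async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)                          # placeholder for real awaited I/O
    return f"{name}: done"

async def main() -> None:
    results = await asyncio.gather(
        fetch("users", 0.2),
        fetch("orders", 0.1),
        fetch("metrics", 0.3),
    )
    print(results)

asyncio.run(main())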