7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are
Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips – the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world – like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future.

What We Offer
Location: Bangalore, IND
At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits.

You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible, while learning every day in a supportive, leading global company. Visit our Careers website to learn more about careers at Applied.

Technical Project/Program Management

About Applied
Applied Materials is the leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. Our expertise in modifying materials at atomic levels and on an industrial scale enables customers to transform possibilities into reality. At Applied Materials, our innovations make possible the technology shaping the future.

Key Experience
- Excellent communication and organizational skills are mandatory.
- Experience managing multiple, complex projects cross-functionally.
- Experience with the product design life cycle; reading and interpreting specifications and drawings; engineering change orders; materials; special processes; manufacturing processes; and engineering process and technology, preferably related to the semiconductor industry.
- Demonstrated ability to drive and track projects with aggressive schedules.
- Seasoned in project management basics, including requirements definition, scheduling, task tracking, risk management, and cross-functional communication.
- Experienced with project management tools, including Smartsheet, MS Teams, and SharePoint.
- Competent in Microsoft applications, including Word, Excel, PowerPoint and Outlook.
- Demonstrated ability to manage accountability without authority.
- Takes initiative and works autonomously.
- Able to accommodate meetings in overseas time zones when needed.
- Familiarity with ERP systems, including SAP.
- Familiarity with the semiconductor industry preferred.
- Experience in planning/purchasing activities preferred.

Qualifications:
- Bachelor's degree in a technical or related field required.
- Minimum 7+ years of relevant work experience.
- 7+ years in a project management role (preference given to those with program management experience).

Responsibilities:
- Use the Global Parts and Supplier Technology (GPS&T) solution portal, customer qual tracker, and transition dashboard to manage multiple complex projects.
- Collaborate with global and regional planning teams to determine and control parts supply to match customer qualification timelines.
- Collaborate with SSG and SBU to cut in GPS&T parts at the time of new tool shipment.
- Coordinate with Engineering, Purchasing, RVC, SMOD and SAM to manage the FAI and golden sample shipment process.
- Lead efforts to automate tasks for enhanced efficiency and productivity in project execution.
- Develop requirements and collaborate with the business intelligence team to generate reports and dashboards.
- Analyze large datasets to derive insights and recommendations.
- Provide technical input to multifunctional team members to achieve project goals.
- Maintain data accuracy and integrity in the GPS&T portal and qual tracker.

Applied Materials is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans and Individuals with Disabilities.

Additional Information
Time Type: Full time
Employee Type: Assignee / Regular
Travel: Yes, 10% of the Time
Relocation Eligible: Yes

Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 1 month ago
0 years
0 Lacs
Haryana
On-site
Assistant Manager - Data Engineer
Date: Jun 20, 2025
Location: HR, IN, 122009
Requisition ID: 16338

Description:
About Firstsource
Firstsource Solutions Limited, an RP-Sanjiv Goenka Group company (NSE: FSL, BSE: 532809, Reuters: FISO.BO, Bloomberg: FSOL:IN), is a specialized global business process services partner, providing transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, Retail, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, South Africa, and the Philippines, we make it happen for our clients, solving their biggest challenges with hyper-focused, domain-centered teams and cutting-edge tech, data, and analytics. Our real-world practitioners work collaboratively to deliver future-focused outcomes.

Key Responsibilities:
The Data Engineering Professional:
- Supports the capture, management, storage, and utilization of structured and unstructured data from internal and external sources, turning business needs into the data that supports strategic decision making.
- Supports the design, coding, unit testing and deployment of data processes for ingestion, transformation or curation of big data, while keeping data security and privacy in mind.
- Supports the development of architecture and design patterns to process and store high-volume data sets.
- Supports the delivery of a range of projects driving new solutions to large, open-ended problems.
- Delivers and generates business-relevant, actionable insights in support of the data engineering specialism.
- Follows best practices and delivers in an Agile process that consistently delivers a quality product for the organization.
- Supports and assists in the execution of initiatives focused on the development of data and analytic infrastructure for product development.
- Supports the implementation of ways to improve working processes within the area of data engineering responsibility.
- Owns contribution to team and data governance processes, policies and regulations. Follows best practices and agile methodology, owning sprint goals and participating in sprint activities and governance.

Key Skills:
- Data Analysis / Data Preparation - Expert
- Dataset Creation / Data Visualisation - Expert
- Data Quality Management - Advanced
- Data Engineering - Advanced
- Programming / Scripting - Intermediate
- Data Storytelling - Intermediate
- Business Analysis / Requirements Analysis - Intermediate
- Data Dashboards - Foundation
- Business Intelligence Reporting - Foundation
- Database Systems - Foundation
- Agile Methodologies / Decision Support - Foundation

Technical Skills:
- Cloud - GCP - Expert
- Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert
- Data warehousing solutions - Advanced
- ETL tools - Advanced
- Data APIs - Advanced
- Python, Java, Scala, etc. - Intermediate
- Basic understanding of distributed systems - Foundation
- Basic knowledge of algorithms and optimal data structures for analytics - Foundation
- Soft skills and time management - Foundation

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
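As a rough, non-authoritative illustration of the GCP/BigQuery skill set graded above, here is a minimal Python sketch of an ingest-then-curate step; the project, bucket, dataset, and table names are all hypothetical.

```python
# Minimal sketch: load raw CSV files into BigQuery, then curate a derived table.
# Assumes google-cloud-bigquery is installed and application-default credentials
# are configured; every project/dataset/table name here is hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Ingest: append raw files from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/events_*.csv",
    "example-project.staging.events_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # Block until the load completes.

# Curate: materialize a deduplicated, typed table for downstream analytics.
curate_sql = """
CREATE OR REPLACE TABLE `example-project.curated.events` AS
SELECT DISTINCT event_id, user_id, TIMESTAMP(event_ts) AS event_ts
FROM `example-project.staging.events_raw`
WHERE event_id IS NOT NULL
"""
client.query(curate_sql).result()
print("Ingestion and curation complete.")
```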
Posted 1 month ago
1.5 - 2.0 years
3 - 6 Lacs
Gurgaon
On-site
1. Dataiku experience of at least 1.5-2 years; good working knowledge of creating and handling partitioned datasets in Dataiku.
2. Strong Python skills, with data handling using pandas and numpy (both are must-haves and need to be known in depth) and the basics of regex.
3. Able to work on GCP BigQuery and use Terraform as the base for managing code changes.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth – one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
0 years
3 - 6 Lacs
Vadodara
Remote
We're reinventing the market research industry. Let's reinvent it together.

At Numerator, we believe tomorrow's success starts with today's market intelligence. We empower the world's leading brands and retailers with unmatched insights into consumer behavior and the influencers that drive it.

As a Custom Group Associate on Numerator's Consulting Team, you will be responsible for curating highly specialized retail data tailored to client needs, primarily based in the U.S. and Canada. Your expertise in brands, categories, and data sources will help shape customized datasets that drive impactful business insights. We seek individuals with a keen analytical mindset, strong attention to detail, and a collaborative spirit. While training is provided, prior experience with data sources and familiarity with Custom Data work responsibilities is highly valued. Adaptability and problem-solving skills are essential for success in this role.

What You'll Do:
- Work on high-profile projects, ensuring accuracy, granularity, and timely dataset delivery.
- Conduct market research to verify product attributions and enhance data accuracy.
- Apply advanced technical skills, such as item tagging and conditional logic grouping.
- Partner with cross-functional teams (Consulting, Engineering, Data Services) to refine processes and address client-specific data needs.
- Develop expertise in data sources to improve dataset precision and efficiency.
- Manage multiple projects with strong organizational skills, initiative, and attention to detail.
- Communicate clearly and effectively in both written and verbal formats.
- Proficiency in Binder is an advantage.

If you are a detail-oriented problem solver who thrives in a data-driven environment, this role offers the opportunity to shape customized insights that help Numerator's clients make smarter business decisions.

There is strength in numbers - we are the Numerati. Numerator is 2,000 employees strong. We have the confidence to be real and embrace what makes each Numerati unique. Our diverse experiences, ideas and backgrounds fuel our innovation. Being part of the Numerati means that we'll take care of you! From our Recharge Days, maximum flexibility policy, wellness resources for employees and their families, development opportunities and much more, we're always finding ways to better support, celebrate and accelerate our team.

What You Can Expect from Your Intern Experience
You'll play an active role as a member of a dynamic team of supportive and highly motivated employees and leaders. From day one, you'll be set up for success with your NuIntern buddy, who will be a key partner throughout your internship. Numerator's onboarding program will introduce you to your new colleagues, immerse you into our culture, and provide you with resources to thrive. Expect to make an impact on real projects, business challenges, clients, and our global teams. Interact and engage with colleagues in person at our cool headquarters, designed to further inspire innovation, creativity, and collaboration. You'll also have the opportunity to participate in local events, Hack Days, networking, and workshops.
Internship dates: June 9th to August 8th, 2025

Skills Desired
- Bachelor's degree (any field) with a passion for marketing, research, and account management or growth
- Proficient in Excel and PowerPoint (or equivalent)
- Previous experience with SQL is a plus
- Excellent oral and written communication skills
- Analytical problem-solving skills and strategic thinking
- Experience in FMCG or working with large data sets is a plus
- Flexible, can-do spirit

We are an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by the law. While some roles can be remote, Numerator is only able to hire in many, but not all, states and provinces. In certain cases, if you are not located in a specific area where Numerator is able to hire, you may not be eligible for employment.
Posted 1 month ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Firstsource
Firstsource Solutions Limited, an RP-Sanjiv Goenka Group company (NSE: FSL, BSE: 532809, Reuters: FISO.BO, Bloomberg: FSOL:IN), is a specialized global business process services partner, providing transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, Retail, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, South Africa, and the Philippines, we make it happen for our clients, solving their biggest challenges with hyper-focused, domain-centered teams and cutting-edge tech, data, and analytics. Our real-world practitioners work collaboratively to deliver future-focused outcomes.

Key Responsibilities
The Data Engineering Professional:
- Supports the capture, management, storage, and utilization of structured and unstructured data from internal and external sources, turning business needs into the data that supports strategic decision making.
- Supports the design, coding, unit testing and deployment of data processes for ingestion, transformation or curation of big data, while keeping data security and privacy in mind.
- Supports the development of architecture and design patterns to process and store high-volume data sets.
- Supports the delivery of a range of projects driving new solutions to large, open-ended problems.
- Delivers and generates business-relevant, actionable insights in support of the data engineering specialism.
- Follows best practices and delivers in an Agile process that consistently delivers a quality product for the organization.
- Supports and assists in the execution of initiatives focused on the development of data and analytic infrastructure for product development.
- Supports the implementation of ways to improve working processes within the area of data engineering responsibility.
- Owns contribution to team and data governance processes, policies and regulations. Follows best practices and agile methodology, owning sprint goals and participating in sprint activities and governance.

Key Skills
- Data Analysis / Data Preparation - Expert
- Dataset Creation / Data Visualisation - Expert
- Data Quality Management - Advanced
- Data Engineering - Advanced
- Programming / Scripting - Intermediate
- Data Storytelling - Intermediate
- Business Analysis / Requirements Analysis - Intermediate
- Data Dashboards - Foundation
- Business Intelligence Reporting - Foundation
- Database Systems - Foundation
- Agile Methodologies / Decision Support - Foundation

Technical Skills
- Cloud - GCP - Expert
- Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert
- Data warehousing solutions - Advanced
- ETL tools - Advanced
- Data APIs - Advanced
- Python, Java, Scala, etc. - Intermediate
- Basic understanding of distributed systems - Foundation
- Basic knowledge of algorithms and optimal data structures for analytics - Foundation
- Soft skills and time management - Foundation

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
Posted 1 month ago
3.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You'll Do
This position reports to the Director, Portfolio Strategy & Analytics and is part of the COE function providing financial analysis, data analytics, reporting, and transaction support on key GRE projects and initiatives.

Job Responsibilities
- Execute all GRE reporting, dashboards and other reporting tools such as Power BI to track key real estate portfolio metrics and monitor performance.
- Conduct monthly quality checks aligned to location governance to ensure data accuracy and integrity.
- Maintain and update the real estate database (CoStar) to ensure timely and accurate entry of non-lease data; assign and manage location IDs for newly acquired or established properties.
- Assist with the design and implementation of a set of standardized GRE templates and tools for stakeholder discussions and presentations, as well as with the preparation of playbooks on best practices and standard operating procedures for all GRE-related activities.
- Support sustainability reporting for the real estate portfolio for ESG disclosures and compliance initiatives.
- Administer and maintain the GRE MS Teams site as the centralized repository and key distribution channel for all key real estate data, analytics, playbooks, tools, and related materials.
- Administer and maintain the Real Estate inbox and calendar as the key centralized communication channel with internal and external GRE stakeholders.
- Manage vendor or landlord payments and any other property/lease-specific issues with timely coordination and communication among internal stakeholders.
- Assist GRE Managers and key internal customers with preparation and completion of support for Capital Appropriation Requests.
- Work on ad-hoc special projects as assigned.

Qualifications
- Bachelor's degree in business, finance or a related field required.
- 3-4 years of proven experience in corporate real estate, finance or data analysis.

Skills
- Knowledge of real estate practices; familiarity with real estate finance, accounting, and legal concepts.
- Skilled in MS Excel, with the ability to analyze and synthesize large amounts of raw data and turn them into meaningful analysis.
- Tech-forward, digital mindset.
- Experienced in PowerPoint and other data presentation tools such as Power BI.
- Strong analytical, reasoning, organization, and problem-solving skills.
- Ability to multi-task and work well under deadlines.
- Strong written, verbal and communication skills.
- Flexible, adaptable, and able to deal with ambiguity and change.
Posted 1 month ago
1.5 - 2.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
- Dataiku experience of at least 1.5-2 years; good working knowledge of creating and handling partitioned datasets in Dataiku.
- Strong Python skills, with data handling using pandas and numpy (both are must-haves and need to be known in depth) and the basics of regex.
- Able to work on GCP BigQuery and use Terraform as the base for managing code changes.
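As a small illustration of the pandas/numpy/regex depth this role asks for, here is a minimal cleaning-step sketch; the column names, regex pattern, and threshold are hypothetical.

```python
# Minimal sketch of the pandas/numpy/regex skills listed above: clean a raw
# extract, normalize an ID column with a regex, and derive a flag without loops.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "order_id": ["A-001", "a 002", "A-003", None],
    "amount": ["10.5", "20", "bad", "7.25"],
})

# Regex normalization: uppercase, collapse whitespace/underscores to a hyphen.
df["order_id"] = (
    df["order_id"]
    .fillna("UNKNOWN")
    .str.upper()
    .str.replace(r"[\s_]+", "-", regex=True)
)

# Coerce amounts to numeric; invalid strings become NaN, then impute the median.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df["amount"] = df["amount"].fillna(df["amount"].median())

# numpy vectorized derivation: flag large orders (threshold is illustrative).
df["is_large"] = np.where(df["amount"] > 15, 1, 0)

print(df)
```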
Posted 1 month ago
0 years
0 Lacs
Vadodara, Gujarat, India
On-site
We're reinventing the market research industry. Let's reinvent it together.

At Numerator, we believe tomorrow's success starts with today's market intelligence. We empower the world's leading brands and retailers with unmatched insights into consumer behavior and the influencers that drive it.

As a Custom Group Associate on Numerator's Consulting Team, you will be responsible for curating highly specialized retail data tailored to client needs, primarily based in the U.S. and Canada. Your expertise in brands, categories, and data sources will help shape customized datasets that drive impactful business insights. We seek individuals with a keen analytical mindset, strong attention to detail, and a collaborative spirit. While training is provided, prior experience with data sources and familiarity with Custom Data work responsibilities is highly valued. Adaptability and problem-solving skills are essential for success in this role.

What You'll Do:
- Work on high-profile projects, ensuring accuracy, granularity, and timely dataset delivery.
- Conduct market research to verify product attributions and enhance data accuracy.
- Apply advanced technical skills, such as item tagging and conditional logic grouping.
- Partner with cross-functional teams (Consulting, Engineering, Data Services) to refine processes and address client-specific data needs.
- Develop expertise in data sources to improve dataset precision and efficiency.
- Manage multiple projects with strong organizational skills, initiative, and attention to detail.
- Communicate clearly and effectively in both written and verbal formats.
- Proficiency in Binder is an advantage.

If you are a detail-oriented problem solver who thrives in a data-driven environment, this role offers the opportunity to shape customized insights that help Numerator's clients make smarter business decisions.

What You'll Bring to Numerator
Skills Desired
- Bachelor's degree (any field) with a passion for marketing, research, and account management or growth
- Proficient in Excel and PowerPoint (or equivalent)
- Previous experience with SQL is a plus
- Excellent oral and written communication skills
- Analytical problem-solving skills and strategic thinking
- Experience in FMCG or working with large data sets is a plus
- Flexible, can-do spirit
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Title: Backup Admin L2+
Location: Noida

Required
- Good hands-on experience with NetBackup, Backup Exec and Barracuda tools.
- NetBackup media/master server and client upgrades; Backup Exec server and client upgrades.
- NetBackup and Backup Exec backup failure troubleshooting.
- Expertise in backup performance tuning and understanding of the tuning parameters.
- Server/client installation on different platforms (Unix/Windows/Linux).
- Create and manage datasets, schedules and retention policies.
- Good knowledge of DR and restoring backups based on requirements.
- Configuring events and notifications.
- Understanding of duplication and replication of backups via SLP in NetBackup.
- Knowledge of housekeeping to keep disk pool utilization below threshold.
- Ensure integration and compatibility among various backup tools.
- Monitor backup environments to ensure the health and performance of backup jobs.
- Generate regular reports on backup status, capacity planning, and SLA adherence.

Documentation and Compliance:
- Maintain detailed documentation of backup configurations, processes, and procedures.
- Ensure compliance with data protection regulations and company policies.

Technical Support and Troubleshooting:
- Provide L2 support for complex backup and recovery issues.
- Work closely with L1 and L2 support teams to resolve escalated issues.

Training and Mentorship:
- Mentor junior team members and provide training on backup and recovery best practices.

Experience: Minimum of 5+ years of experience in backup and recovery, with a focus on Veritas NetBackup. Proficiency in Backup Exec, NetBackup, and Barracuda backup tools.

Certifications: Relevant certifications (e.g., Veritas Certified Specialist, VMware Certified Professional) are preferred.
Posted 1 month ago
7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information
Number of Positions: 1
Industry: IT Services
Date Opened: 06/20/2025
Job Type: Permanent
Work Experience: 7+ Years
Required Skills: Azure, Azure DevOps, +4
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600096
Location: Chennai

About Us
CloudifyOps is a company with DevOps and Cloud in our DNA. CloudifyOps enables businesses to become more agile and innovative through a comprehensive portfolio of services that addresses hybrid IT transformation, Cloud transformation, and end-to-end DevOps workflows. We are a proud Advanced Partner of Amazon Web Services and have deep expertise in Microsoft Azure and Google Cloud Platform solutions. We are passionate about what we do. The novelty and the excitement of helping our customers accomplish their goals drives us to become excellent at what we do.

Job Description
Lead Azure DevOps Engineer
We are looking for a Lead Azure DevOps Engineer with deep hands-on experience in cloud infrastructure, DevOps best practices, and application integration on Microsoft Azure. The ideal candidate will have a strong background in Terraform, ARM, and Bicep, and development experience in TypeScript, JavaScript, and .NET. This individual will play a key role in enhancing our Data Ingestion & Storage system, supporting integration with internal platforms, and leading DevOps-related initiatives.

What will you do:
- Lead DevOps strategy and delivery on Azure for data-related infrastructure and platform integration.
- Design and implement Infrastructure-as-Code (IaC) using Terraform, ARM templates, and Bicep.
- Build a Computer System Validated (CSV) interface to Roche OneArchive to support the migration of large-scale datasets (several TBs).
- Develop and automate data consistency checks post-ingestion to ensure data integrity and compliance (see the sketch after this listing).
- Manage data deletion workflows in accordance with established Standard Operating Procedures (SOPs).
- Design and support a lightweight integration with OneDBM (Caspian platform), primarily an Azure coordination piece with AWS collaboration.
- Develop scripts and utilities using TypeScript, JavaScript, and .NET/C# to support automation and integration.
- Collaborate with cross-functional teams including development, data engineering, and AWS infrastructure teams.
- Provide leadership and mentoring to DevOps and cloud engineering peers.

Requirements
What we are looking for:
- 8+ years of experience working with Microsoft Azure, including production-level architecture, automation, and deployment.
- Deep experience with Azure DevOps (pipelines, repos, artifacts, and release management).
- Strong experience with Infrastructure-as-Code (IaC) using Terraform, ARM templates, and Bicep.
- Programming/scripting expertise in TypeScript, JavaScript, and .NET (C#).
- Hands-on experience with CSV-compliant systems in regulated industries (e.g., life sciences, pharma).
- Familiarity with data handling, especially large-dataset migration, validation, and automated deletion protocols.
- Kubernetes expertise is a must-have; knowledge of Service Mesh, tracing, Helm charts, etc. would be an added advantage.
- Understanding of hybrid cloud setups, with some exposure to AWS helpful for platform integration.
- Good presentation and communication skills.
- Finally, and most importantly, the ability to lead and grow your team by example.
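The post-ingestion consistency checks mentioned above could take many forms; as one hedged sketch, the Python below compares row counts and an order-independent checksum between a source extract and its migrated copy. The file paths and key column are hypothetical, and a real CSV-validated (regulated) implementation would also add audit logging.

```python
# Minimal sketch of an automated post-ingestion consistency check: compare row
# counts and a content checksum between a source dataset and its migrated copy.
# Paths and the key column are hypothetical.
import csv
import hashlib

def dataset_fingerprint(path: str, key_column: str) -> tuple[int, str]:
    """Return (row_count, order-independent checksum) for a CSV dataset."""
    row_count = 0
    digest = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            row_count += 1
            # XOR of per-row hashes is order-independent, so the check
            # tolerates rows arriving in a different order after migration.
            row_hash = hashlib.sha256(row[key_column].encode()).hexdigest()
            digest ^= int(row_hash, 16)
    return row_count, f"{digest:064x}"

source = dataset_fingerprint("source_extract.csv", key_column="record_id")
target = dataset_fingerprint("migrated_copy.csv", key_column="record_id")

if source == target:
    print("PASS: counts and checksums match")
else:
    print(f"FAIL: source={source[0]} rows, target={target[0]} rows")
```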
Posted 1 month ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role
As a Senior Software Engineer on the Cloud-Lake team, you will play a critical role in driving Uber's batch data infrastructure to the cloud. You'll be responsible for building scalable, reliable systems that automate dataset replication, orchestrate workload migrations, and ensure data integrity and performance across hybrid environments. You will collaborate with infra, platform, and product teams to migrate hundreds of PBs of data and thousands of pipelines, minimizing customer impact and ensuring strong observability and resilience during the transition. This role is central to delivering on Uber's long-term cost, performance, and scalability goals.

What the Candidate Will Do
- Lead design and development of critical migration components like dataset replication, workload redirection, and metadata reconciliation.
- Own key modules such as state tracking, observability tooling, rollback workflows, or migration planners.
- Collaborate with infra, data platform, and product teams to define migration strategies, create scalable solutions, and align on delivery timelines.
- Proactively identify gaps in current migration tooling, propose improvements, and drive execution.
- Work closely with stakeholders to ensure seamless migration of workloads, accurate lineage mapping, and minimal customer disruption.
- Take ownership of production reliability, implement alerting for silent failures, and drive initiatives for automatic anomaly detection.
- Represent the team in architecture reviews, technical deep-dives, and operational postmortems.

Basic Qualifications
- 8+ years of software engineering experience, including backend development in Java, Go, or Python.
- Strong understanding of distributed systems, data processing frameworks (e.g., Spark, Hive, Presto), and cloud-native services (e.g., GCS, S3, BigQuery).
- Proven experience designing and operating fault-tolerant, scalable systems in production.
- Proficiency with batch job orchestration tools (e.g., Airflow, Piper) and monitoring/observability best practices.
- Experience working with large-scale data systems, including large-scale upgrades, storage optimisations and handling consistency/availability challenges.
- Strong debugging skills, ownership mindset, and the ability to work across team boundaries.

Preferred Qualifications
- Bachelor's (or Master's) in Computer Science.
- Experience leading projects that span multiple teams and domains.
- Prior exposure to cloud migration initiatives or hybrid cloud/on-prem transitions.
- Knowledge of metadata management, data lineage, and data governance systems.
- Experience in building internal platforms or tooling to improve engineering productivity and reduce operational burden.
- Strong communication skills and a history of mentoring or guiding junior engineers.
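As a hedged sketch of what a dataset-replication integrity check might look like (not Uber's actual tooling), the PySpark below compares per-partition row counts and aggregates between an on-prem copy and a cloud replica; the paths and column names are hypothetical.

```python
# Minimal sketch of a replication integrity check: compare row counts and a
# numeric aggregate per grouping key between a source and its cloud replica.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("replication-check").getOrCreate()

onprem = spark.read.parquet("hdfs:///warehouse/trips/ds=2025-06-01")
cloud = spark.read.parquet("gs://example-lake/trips/ds=2025-06-01")

def summarize(df):
    # One row per city with a count and a sum, to detect both missing rows
    # and silently corrupted values.
    return df.groupBy("city_id").agg(
        F.count("*").alias("rows"),
        F.sum("fare_amount").alias("fare_sum"),
    )

# Summary rows present on one side but not the other indicate divergence.
mismatches = summarize(onprem).exceptAll(summarize(cloud))

if mismatches.count() == 0:
    print("Replica is consistent with source.")
else:
    mismatches.show(truncate=False)
```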
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS! TCS is hiring for Azure Data Engineer.

Technical Skill Set: PySpark, Azure Data Factory, Azure Databricks.
Desired Experience Range: 5-8 years

Required Competencies:
1) Strong design and data solutioning skills.
2) Hands-on PySpark experience with complex transformations and large dataset handling (a short sketch follows this listing).
3) Good command of and hands-on experience in Python.
4) Azure skills:
   a. Must have working experience in Azure Data Lake, Azure Data Factory, Azure Databricks, and Azure SQL Databases.
   b. Azure DevOps.
5) Database skills:
   a. Experience with any one of Oracle, Postgres, or SQL Server.
   b. Oracle PL/SQL or T-SQL experience.
6) Data modelling.
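As a small sketch of the "complex transformations and large dataset handling" competency above, here is a minimal PySpark example: a windowed CDC-style dedup followed by an aggregation written to partitioned Parquet. The storage paths and column names are hypothetical.

```python
# Minimal sketch: deduplicate to the latest record per key, aggregate, and
# write partitioned Parquet to limit small files on large datasets.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("example-transform").getOrCreate()
orders = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders")

# Keep only the latest record per order_id (a common CDC-style dedup pattern).
latest = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
deduped = (
    orders.withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Aggregate, then repartition by the partition column before writing so each
# output directory holds a small number of well-sized files.
daily = deduped.groupBy("order_date", "region").agg(
    F.sum("amount").alias("revenue"),
    F.countDistinct("customer_id").alias("customers"),
)
(
    daily.repartition("order_date")
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/daily_revenue")
)
```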
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS! TCS is hiring for Azure Data Engineer.

Technical Skill Set: PySpark, Azure Data Factory, Azure Databricks.
Desired Experience Range: 4-8 years

Required Competencies:
1) Strong design and data solutioning skills.
2) Hands-on PySpark experience with complex transformations and large dataset handling.
3) Good command of and hands-on experience in Python, including the following concepts, packages, and tools:
   a. Object-oriented and functional programming
   b. NumPy, Pandas, Matplotlib, requests, pytest
   c. Jupyter, PyCharm and IDLE
   d. Conda and virtual environments
4) Must have working experience with Hive, HBase or similar.
5) Azure skills:
   a. Must have working experience in Azure Data Lake, Azure Data Factory, Azure Databricks, and Azure SQL Databases.
   b. Azure DevOps.
   c. Azure AD integration, service principals, pass-through login, etc.
   d. Networking: VNet, private links, service connections, etc.
   e. Integrations: Event Grid, Service Bus, etc.
6) Database skills:
   a. Experience with any one of Oracle, Postgres, or SQL Server.
   b. Oracle PL/SQL or T-SQL experience.
7) Data modelling.
Posted 1 month ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Sprinklr was founded in 2009 to solve a big problem: enterprise-size complexity dividing brands from the customers they serve. The idea was to unify silos, technology, and teams across large, complex companies. What started in social expanded into a single AI platform to reach, engage, and listen to customers on more than 30 digital channels. Today, Sprinklr has an AI-based platform for four product suites: Sprinklr Service, Sprinklr Social, Sprinklr Marketing and Sprinklr Insights. And as customer-facing teams, markets, and geographies work together, brands benefit from a unified digital edge. At Sprinklr, the culture is built around pushing the limits of possibility. Social media today has changed the world, with each of us playing a role in the future. The Company strives to delight customers by going above and beyond as a trusted social guide and helping reimagine enterprise software for the better, with a core belief in the Sprinklr Way.

Job Title: Director of Machine Learning Research
Location: Gurugram

Responsibilities:
- Set research direction: Define and execute a forward-looking ML research agenda aligned with company strategy, business objectives and technological innovation.
- Drive key research in areas relevant to Sprinklr: self-learning AI agents, auto-evaluation of AI agents, taxonomy discovery and quality, end-to-end voice models, multi-linguality, multi-modality, domain-specific fine-tuning and alignment, etc.
- Advance the state of the art: Guide research initiatives in areas such as deep learning, generative models, NLP, and reinforcement learning. Encourage and support filing patents and publications in top-tier venues (NeurIPS, ICML, ACL, CVPR, IEEE, etc.).
- Bridge research and product: Collaborate with engineering and product teams to transition research into real-world applications. Champion best practices in experimentation, reproducibility, and scalability.
- Lead and grow the team: Mentor and manage a high-performing team of ML researchers and engineers. Foster a culture of curiosity, rigor, and excellence.
- Act as a thought leader: Stay ahead of emerging trends and shape the company's position in the global AI landscape.
- Evangelism: Serve as a subject matter expert internally and externally. Represent the company in academic and industry events, talks, and panels.

Qualifications:
- Deep expertise in modern ML techniques, especially large-scale learning, generative models, or foundational models.
- Experience with synthetic dataset generation for production, and with quantization.
- PhD in Computer Science, Machine Learning, or a related field.
- 8+ years of experience in ML/AI, including 4+ years in a leadership role.
- Proven track record of impactful research contributions, including publications, patents, or open-source work.
- Strong knowledge of cloud platform technologies and MLOps tools such as CUDA, K8s, Docker, PyTorch, TensorRT, etc.
- Strong leadership and communication skills, with experience managing senior researchers and cross-functional collaboration.
- Ability to align research investments with business strategy and measurable outcomes.
Posted 1 month ago
0 years
6 - 9 Lacs
Noida
On-site
Position General Duties and Tasks:
- Participate in research, design, implementation, and optimization of machine learning models.
- Help AI product managers and business stakeholders understand the potential and limitations of AI when planning new products.
- Understanding of Revenue Cycle Management processes like claims filing and adjudication.
- Hands-on experience in Python.
- Build data ingestion and data transformation platforms.
- Identify transfer learning opportunities and new training datasets.
- Build AI models from scratch and help product managers and stakeholders understand results.
- Analyse the ML algorithms that could be used to solve a given problem and rank them by their success probability.
- Explore and visualize data to gain an understanding of it, then identify differences in data distribution that could affect performance when deploying the model in the real world.
- Verify data quality, and/or ensure it via data cleaning.
- Supervise the data acquisition process if more data is needed.
- Define validation strategies.
- Define the pre-processing or feature engineering to be done on a given dataset.
- Train models and tune their hyperparameters (see the sketch after this listing).
- Analyse the errors of the model and design strategies to overcome them.
- Deploy models to production.
- Create APIs and help business customers put the results of your AI models into operation.

Education: Bachelor's in computer science or similar.

Skills:
- Hands-on programming experience working on enterprise products.
- Demonstrated proficiency in multiple programming languages, with a strong foundation in a statistical platform such as Python, R, SAS, or MATLAB.
- Knowledge of deep learning, machine learning, and artificial intelligence.
- Experience in building AI models using classification and clustering algorithms.
- Expertise in visualizing and manipulating big datasets.
- Strong in MS SQL.
- Acumen to take a complex problem, break it down into workable pieces, and code a solution.
- Excellent verbal and written communication skills.
- Ability to work in and define a fast-paced, team-focused environment.
- Proven record of delivering and completing assigned projects and initiatives.
- Ability to deploy large-scale solutions to an enterprise estate.
- Strong interpersonal skills.
- Understanding of Revenue Cycle Management processes like claims filing and adjudication is a plus.
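As an illustration of the train-tune-validate loop described above, here is a minimal scikit-learn sketch on a synthetic dataset; the model choice and parameter grid are illustrative, not prescriptive.

```python
# Minimal sketch: define a validation strategy (5-fold CV), tune
# hyperparameters, and inspect held-out errors before considering deployment.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Grid search over an illustrative parameter grid with 5-fold cross-validation.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [5, 10, None]},
    cv=5,
    scoring="f1",
    n_jobs=-1,
)
search.fit(X_train, y_train)

# Error analysis on held-out data drives the next iteration of features/models.
print("Best params:", search.best_params_)
print(classification_report(y_test, search.best_estimator_.predict(X_test)))
```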
Posted 1 month ago
0 years
0 Lacs
Jaipur, Rajasthan, India
Remote
Job Title: Power BI SME
- Years of experience required: 10 years
- Project experience: must have delivered 3-5 projects independently
- Number of resources required: 1
- Duration: 3-6 months+
- Location: Remote

Mandatory Skillset and Responsibilities:
- Lead the design, development, and deployment of Power BI dashboards and reports to visualize KPIs, trends, and business insights.
- Collaborate with business users, analysts, and data engineers to gather requirements, perform data analysis, and understand reporting needs.
- Develop data models using Power BI (star/snowflake schemas, normalized structures) and ensure performance optimization.
- Create DAX measures and calculated columns for advanced analytics.
- Handle Power BI Service administration, including workspace management, dataset refresh schedules, row-level security (RLS), and deployment pipelines.
- Act as a technical advisor for Power BI best practices and help establish governance frameworks.
- Collaborate with cross-functional teams to integrate data from multiple sources, including SQL Server, Azure, Excel, SharePoint, and APIs.
- Conduct code reviews and UAT sessions, and provide end-user training/documentation.
- Stay up to date with new Power BI features and industry trends to bring innovation to client engagements.
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
ECI is the leading global provider of managed services, cybersecurity, and business transformation for mid-market financial services organizations across the globe. From its unmatched range of services, ECI provides stability, security and improved business performance, freeing clients from technology concerns and enabling them to focus on running their businesses. More than 1,000 customers worldwide with over $3 trillion of assets under management put their trust in ECI.

At ECI, we believe success is driven by passion and purpose. Our passion for technology is only surpassed by our commitment to empowering our employees around the world.

The Opportunity:
ECI has an exciting opportunity for an experienced Data Analyst. The Data Analyst will be responsible for providing analytics to a wide range of functional teams at ECI by overseeing the development and deployment of Power BI reports and underlying datasets. This person will lead projects to bring and enhance visibility into key enterprise metrics and drive action based on the resulting insights. The successful candidate should be skilled at asking probing questions for clarity and be able to explain difficult concepts to non-technical audiences. This role requires expert proficiency in writing complex SQL queries that include KPIs critical to company objectives. This is an onsite role.

What you will do:
- Design, develop and maintain dashboards and interactive reports using Power BI.
- Interface directly with stakeholders to efficiently define requirements and support their business objectives.
- Leverage SQL expertise to produce datasets that align with business objectives.
- Configure the reporting library to run on automated refresh schedules using gateways.
- Define and design new systems by analyzing ETL processes.
- Convert business requirements into technical specifications to determine level of effort.
- Use filters and graphs to provide a better understanding of report data.
- Coordinate and assist with user acceptance testing, conduct demos, participate in stakeholder reviews, and assume the role of post-release subject matter expert.
- Be accountable and adhere to a high standard of ensuring accurate, consistent and reliable data; monitor dataset refresh schedules and ensure no failures.
- Maintain SharePoint knowledge base documentation related to solution requirements, data flow diagrams, business glossary and data lineage.
- Create scheduled jobs that create snapshots required for trending analysis.
- Collaborate with the Reporting product owner and business analyst to help define the roadmap.

Who you are:
- Minimum 5+ years writing complex queries in SQL.
- Minimum 5+ years building reporting solutions using Power BI.
- Skilled in developing a comprehensive portfolio of reporting tools that encompass a broad spectrum of metrics.
- Experience applying statistical techniques to understand and quantify the relationships between metrics, helping differentiate between correlation and causation.
- Experience navigating large datasets.
- Proven experience in writing queries and constructing reports that highlight key performance indicators (KPIs) for the company (a small illustration follows this listing).
- Demonstrated ability to perform in a fast-paced environment with a focus on a superior customer experience.
- Proactive approach to workload, with strong attention to detail and the ability to self-QA completed work.
- Proficient in working both independently and as part of a team.
- Experience with reporting from ServiceNow, Salesforce, or accounting databases is a plus.

Education/Experience:
- College diploma or university degree in the field of computer science, information science, management information systems, business administration or a related field (or equal years of experience).
- Data analysis: Demonstrated ability to analyze extensive datasets and derive significant insights.
- Statistics and databases: Firm understanding of statistics and databases to be able to work with data effectively.
- Communication: Strong communication skills to be able to explain complex data insights to non-technical stakeholders.
- Collaboration: Ability to work effectively with cross-functional teams and collaborate with other departments to achieve common goals.
- Change management: Demonstrated expertise in adhering to change management and version control best practices, with proficiency in deploying reports from development to production environments.
- Dedicated to innovation of existing operational processes.
- Ability to effectively prioritize and negotiate tradeoffs with stakeholders.
- Proficiency with ETL methodologies and syncing data across disparate systems.

ECI's culture is all about connection - connection with our clients, our technology and most importantly with each other. In addition to working with an amazing team around the world, ECI offers a competitive compensation package and so much more! If you believe you'd be a great fit and are ready for your best job ever, we'd like to hear from you!

Love Your Job, Share Your Technology Passion, Create Your Future Here!
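As a small illustration of the KPI reporting this role centers on, here is a hedged pandas sketch; in practice the same logic would typically live in a SQL query feeding a Power BI dataset. The column names and SLA threshold are hypothetical.

```python
# Minimal sketch: compute a monthly SLA-adherence KPI from a tickets dataset.
import pandas as pd

tickets = pd.DataFrame({
    "opened": pd.to_datetime(["2025-01-03", "2025-01-10", "2025-02-02", "2025-02-20"]),
    "resolved": pd.to_datetime(["2025-01-04", "2025-01-15", "2025-02-03", "2025-02-27"]),
})

# KPI: share of tickets resolved within a 3-day SLA, trended by month.
tickets["within_sla"] = (tickets["resolved"] - tickets["opened"]).dt.days <= 3
kpi = (
    tickets.groupby(tickets["opened"].dt.to_period("M"))["within_sla"]
    .mean()
    .mul(100)
    .rename("sla_adherence_pct")
)
print(kpi)
```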
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position General Duties and Tasks:
- Participate in research, design, implementation, and optimization of machine learning models.
- Help AI product managers and business stakeholders understand the potential and limitations of AI when planning new products.
- Understanding of Revenue Cycle Management processes like claims filing and adjudication.
- Hands-on experience in Python.
- Build data ingestion and data transformation platforms.
- Identify transfer learning opportunities and new training datasets.
- Build AI models from scratch and help product managers and stakeholders understand results.
- Analyse the ML algorithms that could be used to solve a given problem and rank them by their success probability.
- Explore and visualize data to gain an understanding of it, then identify differences in data distribution that could affect performance when deploying the model in the real world.
- Verify data quality, and/or ensure it via data cleaning.
- Supervise the data acquisition process if more data is needed.
- Define validation strategies.
- Define the pre-processing or feature engineering to be done on a given dataset.
- Train models and tune their hyperparameters.
- Analyse the errors of the model and design strategies to overcome them.
- Deploy models to production.
- Create APIs and help business customers put the results of your AI models into operation.

Education: Bachelor's in computer science or similar.

Skills:
- Hands-on programming experience working on enterprise products.
- Demonstrated proficiency in multiple programming languages, with a strong foundation in a statistical platform such as Python, R, SAS, or MATLAB.
- Knowledge of deep learning, machine learning, and artificial intelligence.
- Experience in building AI models using classification and clustering algorithms.
- Expertise in visualizing and manipulating big datasets.
- Strong in MS SQL.
- Acumen to take a complex problem, break it down into workable pieces, and code a solution.
- Excellent verbal and written communication skills.
- Ability to work in and define a fast-paced, team-focused environment.
- Proven record of delivering and completing assigned projects and initiatives.
- Ability to deploy large-scale solutions to an enterprise estate.
- Strong interpersonal skills.
- Understanding of Revenue Cycle Management processes like claims filing and adjudication is a plus.
Posted 1 month ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Job Title: Senior Azure Engineer (Azure Platform Operations & Automation)
Experience: 5-7 years
Location: Onsite/Remote (Noida)
Reports To: Technical Manager / Architect
Budget: Max. 12 LPA

Responsibilities:
- Manage and troubleshoot ADF and Databricks workflows, ensuring triggers, linked services, parameters, and pipelines function correctly end-to-end.
- Investigate and resolve complex job failures; debug Spark jobs, and analyze notebook execution graphs and logs.
- Lead performance optimization for ADF pipelines, partitioning strategies, and ADLS data formats (e.g., Parquet tuning; see the sketch after this listing).
- Execute and automate data pipeline deployment using Azure DevOps, ARM templates, PowerShell scripts, and Git repositories.
- Govern data lifecycle rules and partition retention, and enforce consistency across raw/curated zones in ADLS.
- Monitor resource consumption (clusters, storage, pipelines) and advise on cost-saving measures (auto-scaling, tiering, concurrency).
- Prepare RCA for P1/P2 incidents and support change deployment validation, rollback strategy, and UAT coordination.
- Review Power BI refresh bottlenecks; support the L1 Power BI developer with dataset tuning and refresh scheduling improvements.
- Validate SOPs and support documentation prepared by L1s, and drive process improvement via automation or standardization.

Required Skills:
- Expert in Azure Data Factory, Databricks (PySpark), Azure Data Lake Storage, Synapse.
- Proficient in Python, PySpark, SQL/Spark SQL, and JSON configurations.
- Familiar with Azure DevOps, Git for version control, and CI/CD automation.
- Hands-on with monitoring (Azure Monitor), diagnostics, and cost governance.
- Strong understanding of data security practices, IAM, RBAC, and audit trail enforcement.
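As one hedged sketch of the "Parquet tuning" responsibility above, the PySpark below compacts a curated zone so each date partition holds a bounded number of files; the paths and target file count are hypothetical and would normally be derived from measured partition sizes.

```python
# Minimal sketch: compact small Parquet files per date partition, which speeds
# reads and keeps file-listing metadata small. Paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parquet-compaction").getOrCreate()

df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/sales")

# Hash-partition by sale_date into a small number of tasks so each output
# directory gets a few well-sized files instead of thousands of tiny ones;
# the target count would normally come from partition size / ~128 MB.
(
    df.repartition(8, "sale_date")
    .write.mode("overwrite")
    .partitionBy("sale_date")
    .option("compression", "snappy")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales_compacted")
)
```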
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Overview:
Attero Recycling Private Limited is a NASA-recognized metal extraction company and end-to-end recycler of Li-ion batteries and e-waste, headquartered in Noida with a manufacturing facility in Roorkee, Uttarakhand. Attero is among a handful of elite organizations globally with the capability to extract pure metals like lithium, cobalt, titanium, nickel, manganese, graphite, gold, copper, and palladium from end-of-life electronics and lithium-ion batteries. The company is now in the process of global expansion, setting up operations in India, Europe, and North America. Given the pace at which the company wants to grow, it expects employees to go beyond their defined roles to accomplish results, cooperate and collaborate with other team members, and be willing to apply innovation and new ideas and take calculated risks like an entrepreneur.

Position: Data Analyst
Location: Noida
Experience: 3-5 years of data analysis experience in a product-based company

Job Summary:
As a Data Analyst, you will play a pivotal role in analyzing data from various departments. Your work will involve collecting, analyzing, and interpreting data to optimize processes, inform decision-making, and drive the profitability of our operations.

Key Responsibilities:
- Data collection and validation: Gather and meticulously process data from diverse sources, ensuring its quality and accuracy.
- Data integration: Combine data from disparate sources to construct a comprehensive dataset for in-depth analysis.
- Data analysis: Scrutinize extensive datasets to uncover trends, patterns, and valuable insights. Employ statistical analysis and data mining techniques to extract meaningful information.
- Market analysis: Examine purchase and sales data to detect trends, pricing patterns, and market opportunities.
- Quantitative assessment: Conduct quantitative analyses to evaluate profitability, identify cost-saving measures, and forecast demand.
- Decision support: Provide critical support to various departments, particularly in feedstock sourcing and metal sales, to facilitate well-informed decision-making.
- Inventory management: Monitor and oversee inventory levels, ensuring precision and timely reporting.
- Inventory optimization: Develop strategies to enhance inventory turnover and minimize wastage (see the sketch at the end of this listing).
- Pricing strategy: Collaborate with sourcing and sales teams to formulate and refine pricing strategies based on thorough market analysis and competitor benchmarking.
- Sales process enhancement: Identify opportunities for improving the sales process, including customer segmentation and targeting.
- Campaign evaluation: Assess the effectiveness of sales campaigns and initiatives.
- Data visualization: Generate visual reports, charts, and dashboards to effectively communicate data-driven insights to management and stakeholders.
- Data governance: Ensure data accuracy, consistency, and adherence to data governance policies.
- Data validation: Implement data validation and cleansing processes as necessary.
- Reporting: Prepare routine reports and ad-hoc analyses to support decision-making processes throughout the organization.

Qualifications:
- Bachelor's degree in a relevant field (e.g., Data Science, Computer Science); B.Tech is a must. A Master's degree is a plus.
- Strong analytical and quantitative skills, with proficiency in data analysis tools such as Python, Excel, SQL, and data visualization tools.
- Excellent problem-solving skills and attention to detail.
- Experience in the metals industry is a plus.
- Familiarity with inventory management principles.
- Excellent communication and presentation skills.
- Attention to detail and the ability to work independently.
- Knowledge of data governance and privacy regulations (e.g., GDPR) is beneficial.

Benefits:
- Competitive salary and comprehensive benefits package.
- Opportunities for professional development and advancement within the company.
- Collaborative and inclusive work environment that values diversity and innovation.
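As a small illustration of the inventory analysis described above, here is a hedged pandas sketch computing inventory turnover (COGS divided by average inventory) per material; all figures and column names are made up.

```python
# Minimal sketch: inventory turnover per material from monthly snapshots.
import pandas as pd

monthly = pd.DataFrame({
    "material": ["Cobalt", "Cobalt", "Lithium", "Lithium"],
    "month": ["2025-01", "2025-02", "2025-01", "2025-02"],
    "cogs": [120.0, 140.0, 80.0, 95.0],           # cost of goods sold (lakh INR)
    "inventory_value": [60.0, 70.0, 50.0, 45.0],  # month-end inventory (lakh INR)
})

turnover = monthly.groupby("material").agg(
    total_cogs=("cogs", "sum"),
    avg_inventory=("inventory_value", "mean"),
)
# Higher turnover means inventory converts to sales faster; low values flag
# slow-moving stock that may need pricing or sourcing action.
turnover["turnover_ratio"] = turnover["total_cogs"] / turnover["avg_inventory"]
print(turnover)
```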
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Client Type: US Client
Location: Remote

About the Role
We're creating a new certification: Google AI Ecosystem Architect (Gemini & DeepMind) - Subject Matter Expert. This course is designed for technical learners who want to understand and apply the capabilities of Google's Gemini models and DeepMind technologies to build powerful, multimodal AI applications.

We're looking for a Subject Matter Expert (SME) who can help shape this course from the ground up. You'll work closely with a team of learning experience designers, writers, and other collaborators to ensure the course is technically accurate, industry-relevant, and instructionally sound.

Responsibilities
As the SME, you'll partner with learning experience designers and content developers to:
- Translate real-world Gemini and DeepMind applications into accessible, hands-on learning for technical professionals.
- Guide the creation of labs and projects that allow learners to build pipelines for image-text fusion, deploy Gemini APIs, and experiment with DeepMind's reinforcement learning libraries.
- Contribute technical depth across activities, from high-level course structure down to example code, diagrams, voiceover scripts, and data pipelines.
- Ensure all content reflects current, accurate usage of Google's multimodal tools and services.
- Be available during U.S. business hours to support project milestones, reviews, and content feedback.

This role is an excellent fit for professionals with deep experience in AI/ML, Google Cloud, and a strong familiarity with multimodal systems and the DeepMind ecosystem.

Essential Tools & Platforms
A successful SME in this role will demonstrate fluency and hands-on experience with the following:

Google Cloud Platform (GCP)
- Vertex AI (particularly Gemini integration, model tuning, and multimodal deployment)
- Cloud Functions, Cloud Run (for inference endpoints)
- BigQuery and Cloud Storage (for handling large image-text datasets)
- AI Platform Notebooks or Colab Pro

Google DeepMind Technologies
- JAX and Haiku (for neural network modeling and research-grade experimentation)
- DeepMind Control Suite or DeepMind Lab (for reinforcement learning demonstrations)
- RLax or TF-Agents (for building and modifying RL pipelines)

AI/ML & Multimodal Tooling
- Gemini APIs and SDKs (image-text fusion, prompt engineering, output formatting)
- TensorFlow 2.x and PyTorch (for model interoperability)
- Label Studio, Cloud Vision API (for annotation and image-text preprocessing)

Data Science & MLOps
- DVC or MLflow (for dataset and model versioning)
- Apache Beam or Dataflow (for processing multimodal input streams)
- TensorBoard or Weights & Biases (for visualization)

Content Authoring & Collaboration
- GitHub or Cloud Source Repositories
- Google Docs, Sheets, Slides
- Screen recording tools like Loom or OBS Studio

Required skills and experience:
- Demonstrated hands-on experience building, deploying, and maintaining sophisticated AI-powered applications using Gemini APIs/SDKs within the Google Cloud ecosystem, especially in Firebase Studio and VS Code.
- Proficiency in designing and implementing agent-like application patterns, including multi-turn conversational flows, state management, and complex prompting strategies (e.g., Chain-of-Thought, few-shot, zero-shot).
- Experience integrating Gemini with Google Cloud services (Firestore, Cloud Functions, App Hosting) and external APIs for robust, production-ready solutions.
- Proven ability to engineer applications that process, integrate, and generate content across multiple modalities (text, images, audio, video, code) using Gemini's native multimodal capabilities.
- Skilled in building and orchestrating pipelines for multimodal data handling, synchronization, and complex interaction patterns within application logic.
- Experience designing and implementing production-grade RAG systems, including integration with vector databases (e.g., Pinecone, ChromaDB) and engineering data pipelines for indexing and retrieval (a minimal sketch of the retrieval step follows this listing).
- Ability to manage agent state, memory, and persistence for multi-turn and long-running interactions.
- Proficiency leveraging AI-assisted coding features in Firebase Studio (chat, inline code, command execution) and using App Prototyping agents or frameworks like Genkit for rapid prototyping and structuring agentic logic.
- Strong command of modern development workflows, including Git/GitHub, code reviews, and collaborative development practices.
- Experience designing scalable, fault-tolerant deployment architectures for multimodal and agentic AI applications using Firebase App Hosting, Cloud Run, or similar serverless/cloud platforms.
- Advanced MLOps skills, including monitoring, logging, alerting, and versioning for generative AI systems and agents.
- Deep understanding of security best practices: prompt injection mitigation (across modalities), secure API key management, authentication/authorization, and data privacy.
- Demonstrated ability to engineer for responsible AI, including bias detection, fairness, transparency, and implementation of safety mechanisms in agentic and multimodal applications.
- Experience addressing ethical challenges in the deployment and operation of advanced AI systems.
- Proven success designing, reviewing, and delivering advanced, project-based curriculum and hands-on labs for experienced software developers and engineers.
- Ability to translate complex engineering concepts (RAG, multimodal integration, agentic patterns, MLOps, security, responsible AI) into clear, actionable learning materials and real-world projects.
- 5+ years of professional experience in AI-powered application development, with a focus on generative and multimodal AI.
- Strong programming skills in Python and JavaScript/TypeScript; experience with modern frameworks and cloud-native development.
- Bachelor's or Master's degree in Computer Science, Data Engineering, AI, or a related technical field.
- Ability to explain advanced technical concepts (e.g., fusion transformers, multimodal embeddings, RAG workflows) to learners in an accessible way.
- Strong programming experience in Python and experience deploying machine learning pipelines.
- Ability to work independently, take ownership of deliverables, and collaborate closely with designers and project managers.

Preferred:
- Experience with Google DeepMind tools (JAX, Haiku, RLax, DeepMind Control Suite/Lab) and reinforcement learning pipelines.
- Familiarity with open data formats (Delta, Parquet, Iceberg) and scalable data engineering practices.
- Prior contributions to open-source AI projects or technical community engagement.
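As a minimal, non-authoritative sketch of the retrieval step in a RAG system, the numpy code below ranks document chunks by cosine similarity to a query embedding. The embed() function is a hypothetical placeholder for a real embedding API (such as a Gemini embedding endpoint); here it returns random vectors purely so the example runs.

```python
# Minimal sketch of RAG retrieval: embed chunks, rank by cosine similarity,
# and assemble the top matches into prompt context.
import numpy as np

rng = np.random.default_rng(0)

def embed(text: str) -> np.ndarray:
    # Hypothetical placeholder: a real system would call an embedding model.
    return rng.standard_normal(768)

chunks = ["refund policy...", "shipping times...", "warranty terms..."]
index = np.stack([embed(c) for c in chunks])  # shape: (n_chunks, dim)

query_vec = embed("how long do refunds take?")
# Cosine similarity between the query and every chunk, fully vectorized.
sims = index @ query_vec / (
    np.linalg.norm(index, axis=1) * np.linalg.norm(query_vec)
)
top_k = np.argsort(sims)[::-1][:2]
context = "\n".join(chunks[i] for i in top_k)  # goes into the LLM prompt
print(context)
```

In a production system the index side would live in a vector database (e.g., Pinecone or ChromaDB, as the listing mentions), with the same similarity ranking performed server-side.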
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Senior Azure Engineer (Azure Platform Operations & Automation)
Experience: 5-7 years
Location: Onsite (Noida)
Reports To: Technical Manager / Architect

Responsibilities
Manage and troubleshoot ADF and Databricks workflows, ensuring triggers, linked services, parameters, and pipelines function correctly end-to-end.
Investigate and resolve complex job failures; debug Spark jobs and analyze notebook execution graphs and logs.
Lead performance optimization for ADF pipelines, partitioning strategies, and ADLS data formats (e.g., Parquet tuning; see the sketch after this posting).
Execute and automate data pipeline deployment using Azure DevOps, ARM templates, PowerShell scripts, and Git repositories.
Govern data lifecycle rules and partition retention, and enforce consistency across raw/curated zones in ADLS.
Monitor resource consumption (clusters, storage, pipelines) and advise on cost-saving measures (auto-scaling, tiering, concurrency).
Prepare RCAs for P1/P2 incidents and support change deployment validation, rollback strategy, and UAT coordination.
Review Power BI refresh bottlenecks; support the L1 Power BI developer with dataset tuning and refresh scheduling improvements.
Validate SOPs and support documentation prepared by L1s, and drive process improvement via automation or standardization.

Required Skills
Expert in Azure Data Factory, Databricks (PySpark), Azure Data Lake Storage, and Synapse.
Proficient in Python, PySpark, SQL/Spark SQL, and JSON configurations.
Familiar with Azure DevOps, Git for version control, and CI/CD automation.
Hands-on with monitoring (Azure Monitor), diagnostics, and cost governance.
Strong understanding of data security practices, IAM, RBAC, and audit trail enforcement.
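As a concrete instance of the Parquet-tuning work referenced above, the sketch below compacts a raw ADLS zone into a partitioned curated zone. It is a minimal example under stated assumptions: the abfss:// paths, storage account, and load_date partition column are hypothetical placeholders, and on Databricks the spark session already exists, so the builder lines would be dropped.

```python
# Minimal sketch: compact raw JSON into partitioned Parquet in a curated zone.
# Paths, storage account, and the load_date column are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

RAW = "abfss://raw@examplelake.dfs.core.windows.net/sales/"          # placeholder
CURATED = "abfss://curated@examplelake.dfs.core.windows.net/sales/"  # placeholder

df = spark.read.json(RAW)

(
    df.repartition("load_date")          # align file layout with the partition key
      .write.mode("overwrite")
      .partitionBy("load_date")          # enables partition pruning downstream
      .option("compression", "snappy")   # common Parquet default on ADLS
      .parquet(CURATED)
)
```

Partitioning on the column most queries filter by is the main lever here; file-size targets and zone-specific retention rules would be layered on per the governance duties above.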
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Power BI Developer
Experience: 3-5 years
Location: Onsite (Noida)
Reports To: Senior Azure Engineer / Technical Manager / Architect

Responsibilities
Develop Power BI dashboards/reports and monitor dataset refreshes (scheduled/manual) and gateway status using Power BI Service logs and alerts (a scripted check follows this posting).
Identify failures in dashboard visuals, data models, or refresh schedules and initiate first-level remediation (e.g., gateway restart, re-publishing).
Collaborate with Azure Engineers to validate backend refresh (ADF/Databricks) issues affecting Power BI performance.
Conduct daily dashboard validation checks after refresh cycles for critical reports.
Respond to report-related SRs (new access, broken visuals, workspace moves); log and track issues and resolve them within SLA.
Maintain refresh logs and summary dashboards for support traceability and performance reporting.
Assist with visual enhancements and dataset changes in collaboration with the L2/Architect for larger CRs.
Document report issues and corrective actions, contributing to the Power BI support knowledge base.

Required Skills
Proficient in DAX, Power Query (M), and Power BI Service features (gateways, workspaces, sharing).
Strong SQL querying and data modeling experience.
Exposure to Azure SQL, Synapse, or Databricks as backend sources is a plus.
Familiarity with row-level security, role-based sharing, and Power BI governance best practices.
Basic understanding of ticketing tools (e.g., ServiceNow).
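The refresh-monitoring duty above can be scripted against the Power BI REST API. A minimal sketch that lists recent refreshes for one dataset: the workspace and dataset IDs are placeholders, and acquiring the Azure AD bearer token (e.g., via MSAL) is assumed to happen elsewhere.

```python
# Minimal sketch: check recent Power BI dataset refreshes via the REST API.
# GROUP_ID / DATASET_ID are placeholders; the bearer token is assumed to be
# acquired separately (e.g., with MSAL) and exported as PBI_TOKEN.
import os

import requests

GROUP_ID = "00000000-0000-0000-0000-000000000000"    # placeholder workspace ID
DATASET_ID = "11111111-1111-1111-1111-111111111111"  # placeholder dataset ID

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes?$top=5"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {os.environ['PBI_TOKEN']}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    # status is typically Completed, Failed, or Unknown (still in progress).
    print(refresh["startTime"], refresh.get("endTime", "-"), refresh["status"])
```

A support rotation would typically wrap this in a scheduled job that raises a ServiceNow ticket on any Failed status, which is how the SLA tracking described above stays auditable.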
Posted 1 month ago
3.0 - 4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Azure Engineer (Azure Data Lake Operations)
Experience: 3-4 years
Location: Onsite (Noida)
Shift: Rotational shifts, including weekend support on a need basis
Reports To: Technical Manager / Architect

Responsibilities
Perform daily/monthly monitoring of ADF pipelines, Databricks notebooks, and ADLS health using Azure Monitor, Log Analytics, and ServiceNow (a query sketch follows this posting).
Query data, perform basic troubleshooting of dataset issues, and validate pipelines (in ADF/Databricks) using SQL/PySpark.
Execute and track daily job health checks, validate schedule adherence, and ensure successful data ingestion runs.
Triage alerts and incidents from automation tools and manual tickets; perform first-level diagnostics and routing.
Monitor ADF/Databricks refreshes and failure alerts; re-run failed jobs, validate trigger configurations, and raise SRs or incidents when issues are beyond scope.
Perform manual file availability checks and escalate delays to application or business stakeholders.
Maintain operational logs and checklists for daily activities with consistent timestamping and status remarks.
Acknowledge and act on alerts within SLA; provide inputs to L2 for RCA or escalation-worthy cases.
Raise and track service requests in ServiceNow and maintain traceability until closure.

Required Skills
Proficiency in Azure Data Factory, Azure Data Lake Storage, Azure Monitor, and Log Analytics.
Strong SQL knowledge, including writing complex queries for data validation and job verification.
Working knowledge of ServiceNow or an equivalent ticketing system.
Exposure to Databricks and PySpark is preferred.
Good understanding of Azure CLI/PowerShell is a plus.
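The daily pipeline-health checks above are often automated with a Kusto query against Log Analytics. A minimal sketch, assuming ADF diagnostic logs flow into the ADFPipelineRun table, the azure-monitor-query and azure-identity packages, and a placeholder workspace ID:

```python
# Minimal sketch: pull the last 24h of failed ADF pipeline runs from Log Analytics.
# Assumes ADF diagnostics are routed to the workspace; WORKSPACE_ID is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

KQL = """
ADFPipelineRun
| where Status == 'Failed'
| project TimeGenerated, PipelineName, RunId, FailureType
| order by TimeGenerated desc
"""

result = client.query_workspace(WORKSPACE_ID, KQL, timespan=timedelta(days=1))
for table in result.tables:
    for row in table.rows:
        print(list(row))
```

Run on a schedule, the output feeds the daily checklist and gives L1 a consistent, timestamped record for the escalations described above.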
Posted 1 month ago
1.0 - 5.0 years
0 - 2 Lacs
Salem
Work from Office
Job Title: AI / ML Engineer
Company: Mukesh Buildtech (Stealth Startup)
Work Location: Salem, Tamil Nadu

About Us
Mukesh Buildtech is an innovative stealth startup focused on a new marketplace. We are building a cutting-edge platform that leverages advanced technologies to provide unparalleled user experiences. Join our dynamic team and be a part of our exciting journey from the ground up. If you're interested, please send your updated CV to sumathi@mukeshassociates.com.

Role Overview
We are seeking a talented and passionate AI/ML Engineer to join our founding team. In this role, you will be responsible for developing and deploying state-of-the-art computer vision models that will be integral to our marketplace platform. You will work closely with our engineering and product teams to create innovative solutions that enhance user interactions and streamline operations.

Key Responsibilities
Develop and deploy advanced computer vision models to enhance platform capabilities.
Develop and deploy ML models in production; experience with generative, video, or multimodal models is a plus.
Research, prototype, and implement new AI features using cutting-edge computer vision techniques.
Optimize model performance for speed, accuracy, and scalability.
Collaborate with cross-functional teams to translate customer requirements into AI features.
Contribute to the technical vision and architecture of our AI systems.
Stay up to date with the latest advancements in computer vision research and apply them to our products.

Qualifications & Experience
Minimum 3 years of experience in image recognition with YOLO11, OpenCV, PyTorch, TensorFlow, Python, and dataset preparation (an inference sketch follows this posting).
Experience with at least one product-identification web project.
BE in CS, IT, ECE, or EEE, or MCA.
Experience in Machine Learning (ML), with a particular emphasis on Computer Vision (CV).
Comprehensive knowledge and hands-on experience with fine-tuning approaches and training models.
Experience with TensorFlow, PyTorch, or Keras.
Proficiency in Python is essential, especially with libraries such as NumPy, OpenCV, and scikit-image.
Experience with databases (SQL and NoSQL) for managing datasets.
Understanding of APIs (REST, GraphQL) for connecting computer vision models with frontend or other backend services.
Knowledge of cloud services (AWS, Azure, Google Cloud) for deploying and managing models.
Experience with Docker and Kubernetes for containerization and deployment.
Understanding of software development best practices, version control (Git), and CI/CD pipelines.

Why Join Us
Be a part of a pioneering team at the forefront of marketplace innovation.
Work in a collaborative and dynamic environment with opportunities for growth.
Contribute to a product that will have a significant impact on the industry.
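For the product-identification work this role centers on, a minimal YOLO11 inference loop with the ultralytics package looks like the sketch below. The weights file and image path are placeholders, and a real marketplace system would fine-tune on its own product categories rather than rely on the pretrained COCO classes.

```python
# Minimal sketch: detect objects in an image with a pretrained YOLO11 model.
# "yolo11n.pt" (COCO-pretrained) and the image path are placeholders; a
# marketplace system would fine-tune on its own product categories.
from ultralytics import YOLO

model = YOLO("yolo11n.pt")                     # downloads weights on first use
results = model.predict("product_shelf.jpg", conf=0.4)

for result in results:
    for box in result.boxes:
        label = result.names[int(box.cls)]     # class name for this detection
        x1, y1, x2, y2 = box.xyxy[0].tolist()  # bounding box corners in pixels
        print(f"{label}: conf={float(box.conf):.2f}, "
              f"box=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```

Serving this behind a REST endpoint in a Docker container, as the qualifications list suggests, is the usual next step once detections are reliable.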
Posted 1 month ago