2.0 - 4.0 years
2 - 3 Lacs
gurugram
Work from Office
Job Description: Technician – Façade Access Equipment
Position: Technician
Department: Operations & Maintenance – Façade Access Systems
Location: Gurgaon, Noida (Delhi-NCR)
Reporting To: Maintenance Manager

Role Overview
The Technician will be responsible for the operation, inspection, troubleshooting, and maintenance of façade access equipment (BMUs, cradles, hoists, and related electrical and mechanical systems). The role demands good technical knowledge of electrical panels, safety devices, hoisting systems, and control assemblies to ensure equipment reliability, compliance with safety standards, and uninterrupted building façade operations.

Key Responsibilities
- Perform routine inspection, servicing, and preventive maintenance of façade access equipment, BMU cradles, winches/hoists, and associated control systems.
- Troubleshoot electrical and mechanical issues in:
  - Contactors, transformers, ELCB/MCB/RCCB, phase preventers, relays, push buttons, selector switches, NO/NC elements, limit switches, indicators, and industrial sockets.
  - PLC- and VFD-based control systems.
  - Hoists (winches), reeled drum assemblies, and drive mechanisms.
- Conduct panel wiring, safety device testing, and recalibration of system components.
- Ensure proper functioning of emergency controls, limit switches, and interlocks as per safety standards.
- Maintain accurate service reports, defect identification, and rectification records.
- Support installation, commissioning, and modification of façade access equipment.
- Ensure compliance with safety procedures, PPE usage, and statutory guidelines during operations.
- Assist in root cause analysis and corrective action planning in case of failures or incidents.
- Coordinate with supervisors and escalate issues requiring higher-level intervention.

Required Technical Knowledge & Skills
- Strong understanding of electrical components: contactors, transformers, ELCB/MCB/RCCB, phase preventers, relays; switches (push, emergency, selector), NO/NC elements, limit switches, indicators, and industrial sockets.
- Knowledge of PLC and VFD programming basics and fault diagnosis.
- Proficiency in panel wiring, electrical troubleshooting, and circuit testing.
- Hands-on experience with hoists, winches, reeled drum assemblies, and mechanical drive systems.
- Ability to read and interpret wiring diagrams, technical manuals, and schematic drawings.
- Awareness of safety protocols for façade access systems and working at heights.

Qualifications & Experience
- ITI / Diploma in Electrical, Electronics, or Mechanical Engineering.
- 2–5 years of experience in maintenance of façade access systems / BMU cradles / hoists / industrial electrical equipment.
- Experience with high-rise building façade equipment preferred.
- Familiarity with preventive maintenance schedules, service reporting, and DLP/O&M requirements.

Key Competencies
- Strong troubleshooting and analytical skills.
- Safety-conscious with high attention to detail.
- Ability to work independently and in a team.
- Willingness to work at heights and in flexible shifts.

Remuneration / Work Days / Leave Policy
- Salary: As per market standards and based on interview performance.
- Conveyance: Site travel expenses will be provided in addition to salary.
- Work Schedule: 6 days a week, 09:00 AM to 06:00 PM. The weekly off may not always fall on Sunday and can be assigned on other days as per requirement.
- Flexibility: Candidates should be willing to visit sites outside Delhi-NCR and to work extended hours or night shifts when required.
Posted 3 days ago
1.0 - 3.0 years
0 Lacs
india
On-site
DESCRIPTION
Are you excited about the digital media revolution and passionate about designing and delivering advanced analytics that directly influence the product decisions of Amazon's digital businesses? Do you see yourself as a champion of innovating on behalf of the customer by turning data insights into action?

The Amazon Digital Acceleration (DA) org is looking for an analytical and technically skilled data engineer to join our team. In this role, you will play a critical part in developing foundational analytical datasets spanning orders, subscriptions, discovery, promotions, pricing and royalties. Our mission is to enable digital clients to easily innovate with data on behalf of customers and make product and customer decisions faster. An ideal individual is someone who has deep data engineering skills around ETL, data modeling, database architecture and big data solutions. This individual should have strong business judgement and excellent written and verbal communication skills.

Key job responsibilities
1. Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (Datanet, Cradle, QuickSight, etc.).
2. Improve existing solutions / build new solutions to improve scale, quality, IMR efficiency, data availability, consistency & compliance.
3. Partner with Software Developers, Business Intelligence Engineers, MLEs, Scientists, and Product Managers to develop scalable and maintainable data pipelines on both structured and unstructured (text-based) data.
4. Drive operational excellence strongly within the team and build automation and mechanisms to reduce operations.

About the team
The MIDAS team operates within Amazon's Digital Analytics (DA) engineering organization, building analytics and data engineering solutions that support cross-digital teams. Our platform delivers a wide range of capabilities, including metadata discovery, data lineage, customer segmentation, compliance automation, AI-driven data access through generative AI and LLMs, and advanced data quality monitoring. Today, more than 100 Amazon business and technology teams rely on MIDAS, with over 20,000 monthly active users leveraging our mission-critical tools to drive data-driven decisions at Amazon scale.

BASIC QUALIFICATIONS
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

PREFERRED QUALIFICATIONS
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
india
On-site
DESCRIPTION
Are you excited about the digital media revolution and passionate about designing and delivering advanced analytics that directly influence the product decisions of Amazon's digital businesses? Do you see yourself as a champion of innovating on behalf of the customer by turning data insights into action?

The Amazon Digital Acceleration (DA) org is looking for an analytical and technically skilled data engineer to join our team. In this role, you will play a critical part in developing foundational analytical datasets spanning orders, subscriptions, discovery, promotions, pricing and royalties. Our mission is to enable digital clients to easily innovate with data on behalf of customers and make product and customer decisions faster. An ideal individual is someone who has deep data engineering skills around ETL, data modeling, database architecture and big data solutions. This individual should have strong business judgement and excellent written and verbal communication skills.

Key job responsibilities
1. Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (Datanet, Cradle, QuickSight, etc.).
2. Improve existing solutions / build new solutions to improve scale, quality, IMR efficiency, data availability, consistency & compliance.
3. Partner with Software Developers, Business Intelligence Engineers, MLEs, Scientists, and Product Managers to develop scalable and maintainable data pipelines on both structured and unstructured (text-based) data.
4. Drive operational excellence strongly within the team and build automation and mechanisms to reduce operations.

About the team
The MIDAS team operates within Amazon's Digital Analytics (DA) engineering organization, building analytics and data engineering solutions that support cross-digital teams. Our platform delivers a wide range of capabilities, including metadata discovery, data lineage, customer segmentation, compliance automation, AI-driven data access through generative AI and LLMs, and advanced data quality monitoring. Today, more than 100 Amazon business and technology teams rely on MIDAS, with over 20,000 monthly active users leveraging our mission-critical tools to drive data-driven decisions at Amazon scale.

BASIC QUALIFICATIONS
- Bachelor's degree
- 3+ years of data engineering experience
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data modeling, warehousing and building ETL pipelines
- Experience working on and delivering end-to-end projects independently
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS

PREFERRED QUALIFICATIONS
- 5+ years of data engineering experience
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Knowledge of Engineering and Operational Excellence using standard methodologies

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 1 week ago
7.0 - 9.0 years
0 Lacs
india
On-site
DESCRIPTION
In this role as a Sr TPM - IES AI, you will work with IES teams to help identify the right AI solutions, partner with WW BDT and AWS teams to build the solutions, and implement them across all emerging stores including IN. You will own this charter as a STL, partner with multiple data teams (BI, DE, DS) across IES, build scalable mechanisms, work with business orgs, and inspect and drive adoption of AI solutions. You will also work with the CoBRA team's engineers and business stakeholders to help scale Yoda's analytics solutions for IES marketplaces.

Key job responsibilities
- You will work independently in a high-performing environment with all levels of leadership across IES and exercise sound judgment where clear guidelines may not exist.
- You will communicate complex issues clearly and effectively to leadership to propose solutions and fully inform risk decisions.
- You will own and drive the complete product lifecycle from idea conception through implementation and wide-scale deployment to eventual deprecation.
- You will define strategy and build and execute road maps for the programs you own.
- You will be responsible for diving deep into technical reporting and data systems, understanding them well and staying connected to the details.
- You will define metrics and performance indicators and establish mechanisms to drive them.

About the team
CoBRA is the Central BI Reporting and Analytics org for IN stores and the AI partner for International emerging stores. The CoBRA team's mission is to empower Category and Seller orgs, including Brand, Account, marketing and product/program teams, with self-service products using AI (Yoda and Bedrock agents), build actionable insights (QuickSight Q, custom agents, Q Business), and help them make faster and smarter decisions using science solutions across the Amazon flywheel on all inputs (Selection, Pricing and Speed).

BASIC QUALIFICATIONS
- 7+ years of working directly with engineering teams like BI, DE, DS and SDE
- 2+ years of experience in a technical product or program management capacity
- Experience managing programs across cross-functional teams, building processes and coordinating release schedules
- Strong proficiency in one or more programming languages (e.g., SQL, Java, Python, C++)
- Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, PowerBI)
- Experience with building back-end aggregated tables/pipelines using ANDES, AWS Cradle, S3

PREFERRED QUALIFICATIONS
- Master's degree or advanced technical degree
- 5+ years of working experience in one of the data-related job families like BI and DE
- Experience defining KPIs/SLAs used to drive multi-million-dollar businesses and reporting to senior leadership
- Understanding of Agile methodologies and the software development life cycle
- Knowledge of building AI tools, AWS Bedrock agents, LLM/foundational models

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 2 weeks ago
1.0 - 5.0 years
2 - 8 Lacs
Noida
Work from Office
Job Title: CFD Engineer
Experience Required: 1 to 5 years
Qualification: B./M. degree in Mechanical, Thermal, Aerospace, or related engineering disciplines
Office Location: Noida
Work Mode: On-Site & Full Time
Working Days: 5 days
Benefits: Annual bonus, health insurance
Posted 2 months ago
3.0 - 5.0 years
0 - 2 Lacs
Chennai
Work from Office
Job Description:
- Detail design of chassis sub-systems and chassis parts
- Knowledge in BOM, GD&T, DFMEA, DVVP & DFA
- Customer interaction
- Chassis frame with Cradle experience
Posted 2 months ago
Accenture
64580 Jobs | Dublin
Wipro
25801 Jobs | Bengaluru
Accenture in India
21267 Jobs | Dublin 2
EY
19320 Jobs | London
Uplers
13908 Jobs | Ahmedabad
Bajaj Finserv
13382 Jobs |
IBM
13114 Jobs | Armonk
Accenture services Pvt Ltd
12227 Jobs |
Amazon
12149 Jobs | Seattle,WA
Oracle
11546 Jobs | Redwood City