0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position Summary: We are looking for a skilled Node.js Developer with strong JavaScript expertise and a solid grasp of RDBMS and NoSQL databases to build high-performance APIs, write efficient and reusable code, implement robust security measures, and optimize data storage with caching. The ideal candidate has experience designing scalable, high-availability applications and integrating multiple data sources for seamless performance. A good understanding of front-end technologies is a plus.

Key Responsibilities:
- Manage the interchange of data between the server and the users
- Develop all server-side logic, define and maintain the central database, and ensure high performance and responsiveness to requests from the front end
- Integrate the front-end elements built by co-workers into the application; a basic understanding of front-end technologies is therefore necessary
- Integrate user-facing elements developed by front-end developers with server-side logic
- Write reusable, testable, and efficient code
- Design and implement low-latency, high-availability, high-performance applications
- Implement security and data protection
- Design and implement data storage solutions using RDBMS, NoSQL, and caching databases as required
- Integrate data storage solutions

Requirements:
- Strong proficiency with JavaScript
- Knowledge of Node.js core and frameworks such as Express, StrongLoop, Koa, and hapi
- Understanding of the nature of asynchronous programming and its quirks and workarounds
- Good understanding of server-side CSS pre-processors
- Basic understanding of front-end technologies such as HTML5 and CSS3
- Understanding of accessibility and security compliance
- Integration of multiple data sources and databases into one system
- Understanding of the fundamental design principles behind a scalable application
- Understanding of the differences between delivery platforms, such as mobile vs. desktop, and optimizing output for each platform
- Creating database schemas that represent and support business processes; hands-on expertise in MySQL/PostgreSQL, MongoDB, Redis, etc.
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Good to have: experience with Elasticsearch, RabbitMQ, Jenkins, and APM tools
Posted 1 day ago
5.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Job Information
Target Date: 30/06/2025
Date Opened: 16/06/2025
Industry: IT Services
Job Type: Full time
Work Experience: 5+ years
City: Ahmedabad
Province: Gujarat
Country: India
Postal Code: 380051

Job Description
Responsibilities:
- Contribute to the development and evolution of our React Native design system, ensuring consistency, accessibility, and performance
- Architect and implement reusable component libraries that scale across multiple applications
- Design, develop, and maintain user-friendly and visually appealing mobile applications using React Native
- Document design system usage, guidelines, and best practices for the engineering team
- Collaborate closely with UI/UX designers to translate visual design systems into coded components
- Establish component versioning strategies and migration paths for existing implementations
- Stay up to date with the latest React Native technologies and design system methodologies
- Mentor and train junior engineers on design system implementation and best practices

Competencies:
- Proven expertise in building and maintaining design systems in React Native
- Deep understanding of component-based architecture and atomic design principles
- Experience leading mobile development teams and design system initiatives
- Strong understanding of JavaScript, TypeScript, HTML, and CSS/styled-components
- Experience with state management solutions like Redux, MobX, or Context API
- Expertise in developing responsive, cross-platform UI components that maintain consistency
- Knowledge of accessibility standards and how to implement them in React Native
- Experience with component documentation tools (Storybook, Styleguidist, etc.)

Requirements
Skills:
- Ability to bridge design and development, translating visual systems into code
- Strong systems thinking and architecture planning
- Excellent communication skills for advocating design system adoption
- Attention to visual detail and UI consistency across platforms
- Ability to work under pressure and meet deadlines
- Passion for creating scalable, maintainable component libraries

Qualifications:
- Bachelor's degree in Computer Science, Design, or a related field
- 3+ years of experience in React Native development
- Demonstrable portfolio of design system work or component libraries
- Strong understanding of software and UI design principles and best practices

Good to Have:
- Successful implementation of at least one production design system in React Native
- Experience developing mobile applications for iOS and Android with shared component libraries
- Experience with design tokens, theming systems, and maintaining visual consistency
- Experience with continuous integration for component testing and documentation
- Hands-on experience building, documenting, and evolving component libraries
Posted 1 day ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years overall, 5+ years relevant

🔧 Primary Skills: Python, Spark (PySpark), SQL, Delta Lake

📌 Key Responsibilities & Skills:
- Strong understanding of Spark core: RDDs, DataFrames, Datasets, Spark SQL, Spark Streaming
- Proficiency with Delta Lake features: time travel, schema evolution, data partitioning
- Experience designing and building data pipelines using Spark and Delta Lake
- Solid experience in Python, Scala, or Java for Spark development
- Knowledge of data ingestion from files, APIs, and databases
- Familiarity with data validation and data quality best practices
- Working knowledge of data warehouse concepts and data modeling
- Hands-on experience with Git for code versioning
- Exposure to CI/CD pipelines and containerization tools
- Nice to have: experience with ETL tools such as DataStage, Prophecy, Informatica, or Ab Initio
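For illustration only (this sketch is not part of the posting): a minimal PySpark example of the Delta Lake features the role highlights, namely partitioned writes, schema evolution via mergeSchema, and time-travel reads. The table path and sample rows are hypothetical, and the delta-spark package is assumed to be installed and configured.

```python
from pyspark.sql import SparkSession

# Spark session configured for Delta Lake (assumes delta-spark is on the classpath).
spark = (
    SparkSession.builder.appName("delta-pipeline-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

TABLE_PATH = "/tmp/events_delta"  # hypothetical table location

# Ingest one batch, partitioned by date; mergeSchema lets later batches add
# new columns without manual DDL (schema evolution).
batch = spark.createDataFrame(
    [(1, "click", "2024-01-01")],
    ["user_id", "event", "event_date"],
)
(batch.write.format("delta")
      .mode("append")
      .option("mergeSchema", "true")
      .partitionBy("event_date")
      .save(TABLE_PATH))

# Time travel: read the table as it looked at an earlier version.
first_version = spark.read.format("delta").option("versionAsOf", 0).load(TABLE_PATH)
first_version.show()
```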
Posted 1 day ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Job Summary
The customer is seeking a highly skilled Data Engineer with FME expertise who is a local resident of Gurugram, Bengaluru, or Nagpur.

Job Responsibilities
1. Data Integration & Transformation
- FME (Safe Software): build ETL pipelines to read from Idox/CCF and transform data to the given schema
- FME custom transformers: create reusable rule logic for validations and fixes
- Python (in FME or standalone): write custom data-fix logic, date parsers, and validation scripts
- Data profiling tools: understand completeness, accuracy, and consistency in batches

2. Spatial Data Handling
- PostgreSQL/PostGIS: store and query spatial data; support dashboard analytics
- GeoPackage, GML, GeoJSON, Shapefile: understand source file formats for ingest/export
- Geometry validators and fixers: fix overlaps, slivers, and invalid polygons using FME or SQL
- Coordinate systems (e.g., EPSG:27700): ensure correct projections and alignment with target systems

3. Automation & Data Workflow Orchestration
- FME Server / FME Cloud: automate batch runs and monitor ETL pipelines
- CI/CD, cron jobs, Python scheduling: trigger ingestion and dashboard refreshes on file upload
- Audit trail and logging: log data issues, rule hits, and processing history

4. Dashboard Integration Support
- SQL for views and aggregations: support dashboards showing issue counts, trends, and maps
- Power BI / Grafana / Superset (optional): assist in exposing dashboard metrics
- Metadata management: tag each batch with status, record counts, and processing stage

5. Collaborative & Communication Skills
- Interpreting validation reports: communicate dashboard findings to Ops and Analysts
- Business rule translation: convert requirements into FME transformers or SQL rules
- Working with LA and HMLR specs: map internal formats to official schemas accurately

Essential Skills
- Build and maintain FME workflows to transform source data to target data specs
- Validate textual and spatial fields using logic embedded in FME or SQL
- Support issue triaging and reporting via dashboards
- Collaborate with the data provider, Analysts, and Ops for continuous improvement
- ETL / Integration: FME, Talend (optional), Python
- Spatial DB: PostGIS, Oracle Spatial
- GIS Tools: QGIS, ArcGIS
- Scripting: Python, SQL
- Validation: FME Testers, AttributeValidator, custom SQL views
- Format Support: CSV, JSON, GPKG, XML, Shapefiles
- Coordination: Jira, Confluence, Git (for rule versioning)

Background Check Required: no criminal record

Other Requirements
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.)
- Work Location: onsite in Gurugram, Bengaluru, or Nagpur
- Only local candidates should apply
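By way of illustration (not taken from the posting): a minimal Python sketch of the "geometry validators and fixers" step, flagging an invalid polygon and repairing a simple self-intersection before it would be loaded into PostGIS. The bow-tie coordinates are a hypothetical example, and the shapely library is assumed to be available.

```python
from shapely.geometry import shape
from shapely.validation import explain_validity

# Hypothetical GeoJSON-style geometry: a self-intersecting "bow-tie" polygon.
feature = {
    "type": "Polygon",
    "coordinates": [[[0, 0], [2, 2], [2, 0], [0, 2], [0, 0]]],
}

geom = shape(feature)
if not geom.is_valid:
    # Report why the geometry fails validation (e.g., a self-intersection point).
    print("Invalid geometry:", explain_validity(geom))
    # buffer(0) is a common repair for simple self-intersections; the result is
    # a valid (Multi)Polygon that could then be written to PostGIS.
    fixed = geom.buffer(0)
    print("Repaired:", fixed.is_valid, fixed.wkt)
```

In a real pipeline, equivalent logic could sit inside an FME PythonCaller transformer or as a post-load SQL step (for example, PostGIS ST_MakeValid); the snippet only shows the validation idea.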
Posted 1 day ago
The versioning job market in India is currently thriving with numerous opportunities for skilled professionals. Versioning plays a crucial role in software development, ensuring that code changes are tracked, managed, and deployed efficiently. Job seekers in India looking to pursue a career in versioning can find a variety of roles across different industries.
The average salary range for versioning professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 3-5 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the field of versioning, a typical career path may include roles such as:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
Apart from expertise in versioning tools like Git, professionals in this field may also be expected to have knowledge and experience in:
- Continuous Integration / Continuous Deployment (CI/CD)
- DevOps practices
- Programming languages like Python, Java, or JavaScript
- Cloud computing platforms like AWS or Azure
As you navigate the versioning job market in India, remember to continuously upskill, keep your technical skills sharp through practice, and showcase your expertise confidently during interviews. With determination and preparation, you can excel in your versioning career and secure exciting opportunities in the industry. Good luck!