MyHiringPartner.ai

4 Job openings at MyHiringPartner.ai
Data Engineer – MHP-2528 | Chennai, Tamil Nadu, India | 0 years | Salary not disclosed | On-site | Full Time

Job description

Role Overview
We are seeking an experienced Data Engineer with strong expertise in SQL, Python, PySpark, Airflow, Trino, and Hive to design, develop, and optimize our data pipelines. The role involves working with large flat-file datasets, orchestrating workflows with Airflow, performing data transformations with Spark, and loading the final layers into Snowflake for analytics and reporting.

Key Responsibilities
- Data Pipeline Development: Build, maintain, and optimize data pipelines using Airflow for orchestration and scheduling.
- Data Ingestion & Transformation: Work with flat files (CSV, JSON, mainframe-based files) and ensure accurate ingestion and transformation.
- Spark-based Processing: Use PySpark for large-scale data processing, implementing custom UDFs (user-defined functions) where needed.
- SQL Development: Create, optimize, and maintain SQL scripts for data manipulation, reporting, and analytics.
- Snowflake Data Integration: Load and manage the final processed data layers in Snowflake.
- Data Quality & Metrics: Implement checks for file size limits and data consistency, and track daily metrics.
- Collaboration & Requirements Gathering: Work with business and technical teams to understand requirements and deliver efficient data solutions.

Required Skills & Experience
- Proficiency in SQL (query optimization, joins, indexing).
- Strong Python programming skills, including writing reusable functions.
- Hands-on experience with PySpark (adding columns, transformations, cache usage, UDFs).
- Proficiency in Airflow for workflow orchestration.
- Familiarity with the Trino and Hive query engines.
- Experience with flat-file formats (CSV, JSON, mainframe-based files) and data-parsing strategies.
- Understanding of data normalization, unique constraints, and caching strategies.
- Experience working with Snowflake or other cloud data warehouses.

Preferred Qualifications
- Knowledge of performance tuning in Spark and SQL.
- Understanding of data governance and security best practices.
- Experience with large-file processing (including maximum-size handling).

Java Developer – Data Engineering – MHP-2532 | Chennai, Tamil Nadu, India | 9 years | Salary not disclosed | On-site | Full Time

Position Summary
The Software Engineer will assist in designing, developing, and deploying solutions as part of a strategic modernization effort. The candidate will join a team of engineers responsible for optimizing and transforming our data architecture, infrastructure, operations, and related functions. This team will work with developers, architects, business/data analysts, and data scientists on data initiatives and will ensure optimal data solutions.

Essential Job Functions:
- Leverage modern data management toolsets and coding methods to design, build, implement, and optimize data solutions of all types, including support for OLTP and API/microservices, data warehouses, data lakes, ODS, streaming data, and analytics/BI visualizations.
- Transform legacy data structures and processes into modern, capable, and secure solutions in a hybrid-cloud setup.
- Apply data engineering and design best practices to architect solutions, using a deep understanding of various data formats and database design approaches.
- Design and implement Data as a Service (DaaS) for consumption and various manipulations throughout the data ecosystem, applying the "contract first" design principle and using APIs, microservices, microbatch, ELT pipelines, and other methods.
- Build best-practice-driven data ingestion; develop tooling to increase scale, accuracy, and automation in the data pipeline and to integrate with decisioning, AI/NLP, and consuming systems.
- Ensure data security by designing controls and protection strategies.
- Enable application performance and modernization by creating the appropriate data capabilities to match.
- Identify bottlenecks and issues and provide solutions to mitigate and address them.
- Cover most code with unit tests as a matter of habit.
- Estimate effort and ensure that work is completed on time.
- Perform code reviews and merges.

Position Requirements:
- Strong experience coding in Java, JavaScript (Node.js), and Python.
- Strong knowledge of design patterns for software and data engineering.
- Experience with on-prem and hybrid-cloud infrastructure, including service and cost optimization.
- Experience with production and analytics data, both batch and real-time/streaming.
- Experience with RDBMS (e.g., Oracle, SQL Server) as well as modern relational and unstructured data sources (such as NoSQL), including cloud services (AWS/GCP/Azure); hands-on experience with these tools is strongly preferred.
- Knowledge of tools such as the Hadoop stack, Airflow, Kafka, NiFi, PostgreSQL, Oracle, SQL Server, Elasticsearch (ELK); JSON, Parquet, Avro, and other data storage formats; Tableau, Superset, and other visualization tools; Apache Atlas and other data-centric Apache packages.

Preferred:
- BS or advanced degree in Computer Science or a related field.
- 9+ years of experience in software development.
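The "contract first" principle mentioned in the job functions above can be illustrated with a minimal Python sketch: the data contract (schema) is fixed before any service code, and every incoming payload is validated against it before entering the data ecosystem. The record type and field names below are invented for illustration and are not taken from the posting.

```python
import json
from dataclasses import dataclass, fields

# Hypothetical contract for a customer record -- the fields are
# illustrative assumptions, not part of the job description.
@dataclass(frozen=True)
class CustomerRecord:
    customer_id: int
    name: str
    email: str

def validate_payload(raw: str) -> CustomerRecord:
    """Parse a JSON payload and enforce the contract before serving it.

    'Contract first' here means the schema is defined up front and
    producers must conform: unknown or missing fields are rejected
    rather than silently passed through.
    """
    data = json.loads(raw)
    expected = {f.name for f in fields(CustomerRecord)}
    missing = expected - data.keys()
    extra = data.keys() - expected
    if missing or extra:
        raise ValueError(f"contract violation: missing={missing}, extra={extra}")
    # Basic type enforcement against the annotated field types.
    for f in fields(CustomerRecord):
        if isinstance(f.type, type) and not isinstance(data[f.name], f.type):
            raise ValueError(f"field {f.name!r} is not a {f.type.__name__}")
    return CustomerRecord(**data)
```

In practice the same idea scales up via formal schema artifacts (e.g., Avro schemas or OpenAPI specs, both of which appear in the requirements list) rather than hand-rolled checks; the sketch only shows the principle.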

OneStream Developer – MHP-2555 | Hyderabad, Telangana, India | 3–5 years | Salary not disclosed | On-site | Full Time

Job Title: OneStream Developer
Location: Hybrid (Hyderabad/Bangalore/Pune)
Employment Type: Full-time with Mavlra Technologies Pvt Ltd

Position Summary:
We are seeking an experienced OneStream Developer to design, develop, and support financial consolidation, planning, and reporting solutions using OneStream software. The ideal candidate has robust experience with OneStream XF, strong technical skills, and enough familiarity with financial processes to translate business requirements into technical solutions effectively.

Key Responsibilities:
- Design, develop, and implement OneStream XF solutions, including financial consolidations, budgeting, forecasting, and reporting.
- Configure OneStream software, including Cube Views, Dashboards, Workflow Profiles, Business Rules, and XF MarketPlace solutions.
- Collaborate with finance and IT teams to gather business requirements and develop functional and technical specifications.
- Maintain and optimize existing OneStream applications for performance and scalability.
- Develop and execute comprehensive testing procedures to ensure system accuracy and stability.
- Provide user support, training, and documentation.
- Monitor and troubleshoot technical issues and recommend solutions.

Qualifications:
- Bachelor's degree in Information Systems, Computer Science, Finance, or a related field.
- Minimum 3–5 years of hands-on experience in OneStream XF development.
- Proven track record of developing and deploying OneStream solutions, including complex consolidation, budgeting, and forecasting models.
- Proficient in VB.NET, SQL, and Excel.
- Experience with financial systems and processes (e.g., consolidation, financial reporting, FP&A).
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and within cross-functional teams.

Preferred Skills:
- OneStream XF certification.
- Experience with ERP integrations.
- Knowledge of cloud infrastructure, data integration, and API development.
- Familiarity with Agile methodologies.