
Imriel Technology Solutions

3 Job openings at Imriel Technology Solutions
Junior System & Network Administrator | Vadodara | 3 - 6 years | INR 0.5 - 3.0 Lacs P.A. | Work from Office | Full Time

We at IMRIEL (An Allata Company) are looking for a System & Network Administrator to assist in managing and maintaining our IT infrastructure. The ideal candidate will have a basic understanding of networking, system administration, and IT support. This role involves monitoring system performance, troubleshooting network issues, ensuring security compliance, and providing end-user support.

Experience: 3 to 6 years.
Location: Vadodara

What you'll be doing:
• Assisting in installing, configuring, and maintaining servers, workstations, and operating systems (Windows/macOS).
• Monitoring system performance and troubleshooting hardware/software issues.
• Supporting LAN, WAN, VPN, and Wi-Fi configurations.
• Providing technical support for employees regarding hardware, software, and network issues.
• Assisting with security compliance, firewall management, and endpoint protection.
• Performing basic system backups, updates, and patch management.
• Documenting IT procedures, asset inventories, and troubleshooting steps.
• Collaborating with senior IT administrators and network engineers on process improvements.

What you need:
Basic Skills:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Certifications (preferred but not mandatory): CCNA, MCSA, CompTIA Network+, CompTIA Security+.
• Understanding of networking fundamentals (TCP/IP, DNS, DHCP, VLANs, firewalls); see the sketch at the end of this listing.
• Familiarity with operating systems (Windows, macOS) and troubleshooting.
• Basic knowledge of Active Directory, Office 365, and cloud services.
• Hands-on experience with hardware support (laptops, desktops, servers, printers, routers, switches).
• Ability to diagnose and resolve technical issues in a timely manner.

Responsibilities:
• Assist in user account management and system access controls.
• Monitor and troubleshoot server and workstation performance.
• Perform regular updates, patching, and backup management.
• Support the team in network configuration, troubleshooting, and security.
• Assist in setting up and maintaining VPNs, LAN/WAN connections, and firewall rules.
• Provide helpdesk support and resolve employee technical issues.
• Ensure smooth hardware and software deployment.
• Ensure adherence to IT security policies and best practices.
• Support antivirus management and security monitoring.
• Maintain accurate records of IT assets, incidents, and resolutions.
• Create internal knowledge base articles for self-help resources.

Good to know:
• Certifications such as CCNA, MCSA, CompTIA Network+, or CompTIA Security+.
• Knowledge of cloud platforms (AWS, Azure, Google Cloud).
• Experience using IT service management (ITSM) tools (e.g., Jira, ServiceNow, Freshservice).

Must Haves:
• Bachelor's degree in Computer Science, IT, or a related field.
• Ability to troubleshoot and resolve issues efficiently.
• Strong foundation in IT networking, system administration, and cybersecurity principles.
• Proactive and detail-oriented mindset with the ability to multitask.
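
To put the networking fundamentals above in concrete terms, here is a minimal, hypothetical diagnostic sketch of the kind a junior administrator might script: it resolves host names over DNS and probes a few TCP ports. The host names and ports are placeholders, not Imriel infrastructure.

    # Hypothetical network triage helper: DNS resolution plus TCP port probes.
    # All hosts and ports below are illustrative placeholders.
    import socket

    HOSTS = ["intranet.example.com", "mail.example.com"]  # placeholder hosts
    PORTS = [53, 80, 443]  # DNS, HTTP, HTTPS

    def resolve(host):
        """Return the IPv4 address for a host name, or None if DNS fails."""
        try:
            return socket.gethostbyname(host)
        except socket.gaierror:
            return None

    def port_open(ip, port, timeout=2.0):
        """Return True if a TCP connection to ip:port succeeds within the timeout."""
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                return True
        except OSError:
            return False

    for host in HOSTS:
        ip = resolve(host)
        if ip is None:
            print(host, "-> DNS resolution failed")
        else:
            reachable = [p for p in PORTS if port_open(ip, p)]
            print(host, "->", ip, "open TCP ports:", reachable)

In practice a script like this would feed a monitoring or ticketing workflow rather than print to the console.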

Data Engineer (Snowflake) | Vadodara | 3 - 6 years | INR 5.0 - 8.0 Lacs P.A. | Work from Office | Full Time

Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices. Allata also empowers clients to unlock data value through analytics and visualization and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

If you are a smart and passionate team player, then this Senior/Data Engineer [Snowflake] opportunity is for you! We at IMRIEL (An Allata Company) are looking for a Senior/Data Engineer to implement methods that improve data reliability and quality. You will combine raw information from different sources to create consistent and machine-readable formats, and you will develop and test architectures that enable data extraction and transformation for predictive or prescriptive modeling. Resourcefulness is a necessary skill in this role. If you truly love gaining new technical knowledge and can add more awesomeness to the team, you are eligible!

What you'll be doing:
• Architecting, developing, and maintaining scalable, efficient, and fault-tolerant data pipelines to process, clean, and integrate data from diverse sources using Python and PySpark.
• Designing and implementing modern Data Warehouse and Data Lake solutions on cloud platforms such as Azure or AWS to support complex analytical and operational workloads.
• Building and automating ETL/ELT workflows using tools such as Snowflake, Azure Data Factory, or similar platforms, ensuring optimal performance and scalability.
• Leveraging DBT (Data Build Tool) to define, document, and execute data transformation and modeling workflows.
• Writing optimized SQL queries for data retrieval, aggregation, and transformation to support downstream analytics applications.

What you need:
Basic Skills:
• Advanced skills in Python and PySpark for high-performance distributed data processing (see the sketch after this list).
• Proficiency in creating data pipelines with orchestration frameworks like Apache Airflow or Azure Data Factory.
• Strong experience with Snowflake, SQL Data Warehouse, and Data Lake architectures.
• Ability to write, optimize, and troubleshoot complex SQL queries and stored procedures.
• Deep understanding of building and managing ETL/ELT workflows using tools such as DBT, Snowflake, or Azure Data Factory.
• Hands-on experience with cloud platforms such as Azure or AWS, including services like S3, Lambda, Glue, or Azure Blob Storage.
• Proficiency in designing and implementing data models, including star and snowflake schemas.
• Familiarity with distributed processing systems and concepts such as Spark, Hadoop, or Databricks.
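
As a concrete (and purely illustrative) reference point for the PySpark skills listed above, here is a minimal cleaning-and-loading sketch. The paths, column names, and schema are assumptions made for the example, not project specifics.

    # Hypothetical PySpark cleaning step: ingest raw CSVs, standardize them,
    # and write partitioned Parquet for downstream warehouse loading.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_cleaning").getOrCreate()

    # Paths and column names below are placeholders.
    raw = spark.read.option("header", True).csv("/data/raw/orders/")

    clean = (
        raw.dropDuplicates(["order_id"])                        # de-duplicate on the business key
           .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize types
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))
           .withColumn("order_date", F.to_date("order_ts"))     # partition column
    )

    clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")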
Responsibilities:
• Develop robust, efficient, and reusable pipelines to process and transform large-scale datasets using Python and PySpark.
• Design pipeline workflows for batch and real-time data processing using orchestration tools like Apache Airflow or Azure Data Factory.
• Implement automated data ingestion frameworks to extract data from structured, semi-structured, and unstructured sources such as APIs, FTP, and data streams.
• Architect and optimize scalable Data Warehouse and Data Lake solutions using Snowflake, Azure Data Lake, or AWS S3.
• Implement partitioning, bucketing, and indexing strategies for efficient querying and data storage management.
• Develop ETL/ELT pipelines using tools like Azure Data Factory or Snowflake to handle complex data transformations and business logic.
• Integrate DBT (Data Build Tool) to automate data transformations, ensuring modularity and testability.
• Ensure pipelines are optimized for cost-efficiency and high performance, leveraging features such as pushdown optimization and parallel processing.
• Write, optimize, and troubleshoot complex SQL queries for data manipulation, aggregation, and reporting.
• Design and implement dimensional and normalized data models (e.g., star and snowflake schemas) for analytics use cases.
• Deploy and manage data workflows on cloud platforms (Azure or AWS) using services like AWS Glue, Azure Synapse Analytics, or Databricks.
• Monitor resource usage and costs, implementing cost-saving measures such as data lifecycle management and auto-scaling.
• Implement data quality frameworks to validate, clean, and enrich datasets.
• Build self-healing mechanisms to minimize downtime and ensure the reliability of critical pipelines.
• Optimize distributed data processing workflows for Spark by tuning configurations such as executor memory and partitioning.
• Conduct profiling and debugging of data workflows to identify and resolve bottlenecks.
• Collaborate with data analysts, data scientists, and stakeholders to define requirements and deliver usable datasets.
• Maintain clear and comprehensive documentation for pipelines, workflows, and architectural decisions.
• Conduct code reviews to ensure best practices in coding and performance optimization.

Good To Have:
• Experience with real-time data processing frameworks such as Kafka or Kinesis.
• Certifications in Snowflake.
• Cloud certifications (Azure, AWS, GCP).
• Knowledge of data visualization platforms such as Power BI, Tableau, or Looker for integration purposes.

Personal Attributes:
• Ability to identify, troubleshoot, and resolve complex data issues effectively.
• Strong teamwork and communication skills, and the intellectual curiosity to work collaboratively and effectively with cross-functional teams.
• Commitment to delivering high-quality, accurate, and reliable data products and solutions.
• Willingness to embrace new tools, technologies, and methodologies.
• Innovative thinker with a proactive approach to overcoming challenges.

At Allata, we value differences. Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability, or any other legally protected category. This policy applies to all terms and conditions of employment, including but not limited to recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

Sr./Data Engineer | Vadodara | 5 - 8 years | INR 0.5 - 3.0 Lacs P.A. | Work from Office | Full Time

If you are a smart and passionate team player, then this Senior Data Engineer [Snowflake] opportunity is for you! We at IMRIEL (An Allata Company) are looking for a Senior Data Engineer to implement methods that improve data reliability and quality. You will combine raw information from different sources to create consistent and machine-readable formats, and you will develop and test architectures that enable data extraction and transformation for predictive or prescriptive modeling. Resourcefulness is a necessary skill in this role. If you truly love gaining new technical knowledge and can add more awesomeness to the team, you are eligible!

Experience: 5 to 8 years.
Location: Vadodara & Pune

What you'll be doing:
• Lead the development and implementation of data solutions using the latest tools and technologies, with a focus on Snowflake, SQL, and data modeling.
• Design and develop complex data models, including dimensional modeling, to support reporting and analytics (a sketch follows the skills list below).
• Collaborate closely with Software Developers, Data Engineers, Testers, and other cross-functional teams.
• Diagnose and resolve data-related issues promptly.
• Select and implement the appropriate tools and frameworks for development and deployment.
• Identify and implement improvements in existing data patterns and development methodologies.
• Lead Data Migration & DevOps processes, working closely with Software Developers & Data Engineers.
• Automate data processes to enhance efficiency.
• Ensure high-quality delivery by adhering to best practices in data engineering.
• Stay updated with industry trends and emerging technologies to keep the team ahead of the curve.

What you need:
Basic Skills:
• 5 to 8 years of experience in designing and developing data solutions using Snowflake, SQL, and data modeling techniques.
• Expertise in data modeling methods and tools.
• Proficiency in SQL and Python for data extraction, transformation, and loading.
• Hands-on experience with data warehousing, ETL processes, and data governance.
• Experience with relational databases (SQL Server, Oracle, PostgreSQL, MySQL) and unstructured data systems (MongoDB, Cassandra).
• Experience with any of the cloud data platforms, such as Azure (Data Factory, Databricks, Synapse, Data Lake) or AWS (Lambda, Glue, Redshift).
• Knowledge of data mining techniques and database management.
• Skill in optimizing the cost of data services and infrastructure.
• Ability to perform continuous monitoring, automation, and refinement of data engineering solutions.
• Highly collaborative, with the ability to work effectively in team environments involving both technical and business stakeholders.
• Excellent communication, problem-solving, and customer service skills, with the ability to translate technical details into non-technical information.
• Strong analytical skills and meticulous attention to detail.
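
As a rough, hypothetical illustration of the dimensional modeling mentioned above, the sketch below creates a small star schema (one fact table, two dimensions) through the Snowflake Python connector. Table names, columns, and connection parameters are assumptions for the example only.

    # Hypothetical star schema: fact_sales referencing dim_customer and dim_date.
    # Names and credentials are placeholders; Snowflake accepts but does not
    # enforce the PRIMARY KEY / REFERENCES constraints, which here document the model.
    import snowflake.connector

    DDL = [
        """CREATE TABLE IF NOT EXISTS dim_customer (
               customer_key INTEGER AUTOINCREMENT PRIMARY KEY,
               customer_id  VARCHAR,
               region       VARCHAR
           )""",
        """CREATE TABLE IF NOT EXISTS dim_date (
               date_key  INTEGER PRIMARY KEY,  -- e.g., 20250131
               full_date DATE,
               fiscal_quarter VARCHAR
           )""",
        """CREATE TABLE IF NOT EXISTS fact_sales (
               customer_key INTEGER REFERENCES dim_customer (customer_key),
               date_key     INTEGER REFERENCES dim_date (date_key),
               amount       NUMBER(12, 2)
           )""",
    ]

    conn = snowflake.connector.connect(
        account="my_account",  # placeholder connection parameters
        user="my_user",
        password="my_password",
        database="ANALYTICS",
        schema="MARTS",
    )
    try:
        cur = conn.cursor()
        for statement in DDL:
            cur.execute(statement)
    finally:
        conn.close()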
Responsibilities:
• Design, develop, and maintain scalable data architectures using Snowflake and SQL.
• Lead the development of ETL pipelines and ensure data quality and consistency.
• Optimize data models for performance and scalability.
• Collaborate with project stakeholders to define and implement data architecture requirements.
• Manage the migration of data from legacy systems to modern solutions.
• Implement data quality processes for use with MDM, BI solutions, and data warehouses (see the sketch at the end of this listing).
• Develop strategies for data migration, database optimization, and architectural solutions for Analytics and Big Data projects.
• Ensure the performance, security, and accessibility of data systems.
• Troubleshoot and resolve complex technical issues in OLAP/OLTP/DW, Analytics, and Big Data environments.
• Mentor other data engineers and promote a culture of continuous improvement.
• Create and maintain technical documentation, manuals, and IT policies.

Good to know/have:
• Certifications in Snowflake.
• Experience with Data Mesh architectures.
• Cloud certifications (e.g., Azure, AWS, Google Cloud).
• Infrastructure as Code (IaC) methodologies and tools.

Personal Attributes:
• A passion for continuous improvement in both technology and process.
• Strong interpersonal, problem-solving, and organizational skills.
• Ability to work independently and manage multiple tasks and projects simultaneously.
• Attention to detail and a commitment to delivering high-quality work.
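
Finally, as a minimal, hypothetical sketch of the data quality processes referenced under Responsibilities above: a few post-load row-count rules that fail loudly when violated. The table and column names reuse the illustrative star schema from the earlier sketch and are assumptions, not project specifics.

    # Hypothetical post-load data quality checks; a non-zero count is a violation.
    import snowflake.connector

    CHECKS = {
        "no_null_customer_keys": "SELECT COUNT(*) FROM fact_sales WHERE customer_key IS NULL",
        "no_negative_amounts":   "SELECT COUNT(*) FROM fact_sales WHERE amount < 0",
        "no_orphan_customers": """
            SELECT COUNT(*)
            FROM fact_sales f
            LEFT JOIN dim_customer d ON f.customer_key = d.customer_key
            WHERE d.customer_key IS NULL
        """,
    }

    def run_checks(conn):
        """Run each rule; raise if any rule has offending rows."""
        cur = conn.cursor()
        failures = []
        for name, sql in CHECKS.items():
            count = cur.execute(sql).fetchone()[0]
            if count:
                failures.append("%s: %d offending rows" % (name, count))
        if failures:
            raise RuntimeError("Data quality checks failed: " + "; ".join(failures))

    # Usage, with the same placeholder connection parameters as the earlier sketch:
    # run_checks(snowflake.connector.connect(account="my_account", user="my_user", ...))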