Allata is a fast-growing technology strategy and data development consulting firm delivering scalable solutions to our enterprise clients. Our mission is to inspire our clients to achieve their most strategic goals through uncompromised delivery, active listening, and personal accountability. We are a group who thrive in fast-paced environments, working on complex problems, continually learning, and working alongside colleagues to be better together.

We are seeking a skilled Data Engineer with strong expertise in Databricks and Delta Live Tables to guide and drive the evolution of our clients' data ecosystems. The ideal candidate will show strong technical leadership and own hands-on execution, leading the design, migration, and implementation of robust data solutions while mentoring team members and collaborating with stakeholders to achieve enterprise-wide analytics goals.

Key Responsibilities
~ Collaborate in defining the overall architecture of the solution; design, build, and maintain reusable data products using Databricks, Delta Live Tables (DLT), PySpark, and SQL (a minimal DLT sketch follows this posting). Migrate existing data pipelines to modern frameworks and ensure scalability and efficiency.
~ Develop the data infrastructure, pipeline architecture, and integration solutions while actively contributing to hands-on implementation.
~ Build and maintain scalable, efficient data processing pipelines and solutions for data-driven applications.
~ Monitor and ensure adherence to data security, privacy regulations, and compliance standards.
~ Troubleshoot and resolve complex data-related challenges and incidents in a timely manner.
~ Stay at the forefront of emerging trends and technologies in data engineering and advocate for their integration when relevant.

Required Skills & Qualifications
~ Proven expertise in Databricks, Delta Live Tables, SQL, and PySpark for processing and managing large data volumes.
~ Strong experience in designing and implementing dimensional models and medallion architecture.
~ Strong experience in designing and migrating existing Databricks workspaces and models to Unity Catalog-enabled workspaces.
~ Strong experience creating and managing group Access Control Lists (ACLs) and compute and governance policies in Databricks Unity Catalog.
~ Hands-on experience with modern data pipeline tools (e.g., AWS Glue, Azure Data Factory) and cloud platforms (e.g., Databricks).
~ Knowledge of cloud data lakes (e.g., Databricks Delta Lake, Azure Storage, and/or AWS S3).
~ Demonstrated experience applying DevOps principles, using version control and CI/CD for IaC and codebase deployments (e.g., Azure DevOps, Git), to data engineering projects.
~ Strong experience with batch and streaming data processing techniques and file compaction strategies.

Nice-to-Have Skills
~ Familiarity with architectural best practices for building data lakes.
~ Hands-on experience with additional Azure services, including message queues, Service Bus, cloud storage, Virtual Cloud, serverless compute, CloudSQL, and OOP languages and frameworks.
~ Experience with BI tools (e.g., Power BI, Tableau) and deploying data models.
~ Experience with Databricks Unity Catalog, i.e., configuring and managing data governance and access controls in a Delta Lake environment.
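For context on the Delta Live Tables work described above, here is a minimal, hedged sketch of a medallion-style (bronze/silver) DLT pipeline in PySpark. The table names, source path, and data-quality rule are hypothetical placeholders for illustration, not details from the posting.

```python
# Minimal Delta Live Tables sketch (illustrative only; paths and names are hypothetical).
# Runs inside a Databricks DLT pipeline, where the `dlt` module and `spark` are available.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw orders ingested as-is from cloud storage.")
def orders_bronze():
    # Auto Loader incrementally picks up new files from the landing path.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")  # hypothetical source path
    )

@dlt.table(comment="Silver: cleaned orders with basic quality enforcement.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    # Read the bronze table as a stream, standardize timestamps, and deduplicate.
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .dropDuplicates(["order_id"])
    )
```

The expectation decorator drops rows violating the rule while letting the pipeline record the violation counts, which is one common way DLT handles data quality in a medallion flow.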
Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices. Allata also empowers clients to unlock data value through analytics and visualization and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

We at IMRIEL (An Allata Company) are looking for a Junior System & Network Administrator to assist in managing and maintaining our IT infrastructure. The ideal candidate will have a basic understanding of networking, system administration, and IT support. This role involves monitoring system performance, troubleshooting network issues, ensuring security compliance, and providing end-user support.

Experience: 1 to 3 years.
Location: Vadodara

What you'll be doing:
- Assisting in installing, configuring, and maintaining servers, workstations, and operating systems (Windows/macOS).
- Monitoring system performance and troubleshooting hardware/software issues.
- Supporting LAN, WAN, VPN, and Wi-Fi configurations.
- Providing technical support for employees regarding hardware, software, and network issues.
- Assisting with security compliance, firewall management, and endpoint protection.
- Performing basic system backups, updates, and patch management.
- Documenting IT procedures, asset inventories, and troubleshooting steps.
- Collaborating with senior IT administrators and network engineers on process improvements.

What you need:
Basic Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certifications (preferred but not mandatory): CCNA, MCSA, CompTIA Network+, CompTIA Security+.
- Understanding of networking fundamentals (TCP/IP, DNS, DHCP, VLANs, firewalls).
- Familiarity with operating systems (Windows, macOS) and troubleshooting.
- Basic knowledge of Active Directory, Office 365, and cloud services.
- Hands-on experience with hardware support (laptops, desktops, servers, printers, routers, switches).
- Ability to diagnose and resolve technical issues in a timely manner (a small diagnostic sketch follows the lists below).

Responsibilities:
- Assist in user account management and system access controls.
- Monitor and troubleshoot server and workstation performance.
- Perform regular updates, patching, and backup management.
- Support the team in network configuration, troubleshooting, and security.
- Assist in setting up and maintaining VPNs, LAN/WAN connections, and firewall rules.
- Provide helpdesk support and resolve employee technical issues.
- Ensure smooth hardware and software deployment.
- Ensure adherence to IT security policies and best practices.
- Support antivirus management and security monitoring.
- Maintain accurate records of IT assets, incidents, and resolutions.
- Create internal knowledge base articles for self-help resources.

Good to know:
- Certifications such as CCNA, MCSA, CompTIA Network+, or CompTIA Security+.
- Knowledge of cloud platforms (AWS, Azure, Google Cloud).
- Experience using IT service management (ITSM) tools (e.g., Jira, ServiceNow, Freshservice).
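As a small illustration of the first-line network diagnosis this role involves, here is a hedged Python sketch that separates DNS-resolution failures from TCP-reachability failures when a service appears down. The hostname and port are hypothetical examples, not systems named in the posting.

```python
# Quick first-line network check: DNS resolution, then TCP port reachability.
# Hostname and port below are illustrative placeholders.
import socket

def check_host(host: str, port: int, timeout: float = 3.0) -> None:
    # Step 1: DNS resolution (distinguishes name-server problems from routing problems).
    try:
        ip = socket.gethostbyname(host)
        print(f"DNS OK: {host} -> {ip}")
    except socket.gaierror as exc:
        print(f"DNS FAILED for {host}: {exc}")
        return

    # Step 2: TCP connect to the service port (distinguishes firewall/service problems).
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            print(f"TCP OK: {host}:{port} reachable")
    except OSError as exc:
        print(f"TCP FAILED for {host}:{port}: {exc}")

if __name__ == "__main__":
    check_host("intranet.example.com", 443)  # hypothetical internal host
```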
Must Have:
- Bachelor's degree in Computer Science, IT, or a related field.
- Ability to troubleshoot and resolve issues efficiently.
- Strong foundation in IT networking, system administration, and cybersecurity principles.
- Proactive and detail-oriented mindset with the ability to multitask.

Personal Attributes:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Eagerness to learn and adapt to new technologies.
- Customer-focused mindset to ensure smooth IT operations.
- Team player with a willingness to assist others and take initiative.

At Allata, we value differences. Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability, or any other legally protected category. This policy applies to all terms and conditions of employment, including but not limited to recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices. Allata also empowers clients to unlock data value through analytics and visualization and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

If you are a smart & passionate team player, then this Senior/Data Engineer [Snowflake] opportunity is for you! We at IMRIEL (An Allata Company) are looking for a Senior/Data Engineer to implement methods that improve data reliability and quality. You will combine raw information from different sources to create consistent, machine-readable formats. You will also develop and test architectures that enable data extraction and transformation for predictive or prescriptive modeling. Resourcefulness is a necessary skill in this role. If you truly love gaining new technical knowledge and can add more awesomeness to the team, you are eligible!

What you'll be doing:
- Architecting, developing, and maintaining scalable, efficient, fault-tolerant data pipelines to process, clean, and integrate data from diverse sources using Python and PySpark.
- Designing and implementing modern Data Warehouse and Data Lake solutions on cloud platforms like Azure or AWS to support complex analytical and operational workloads.
- Building and automating ETL/ELT workflows using tools such as Snowflake, Azure Data Factory, or similar platforms, ensuring optimal performance and scalability.
- Leveraging DBT (Data Build Tool) to define, document, and execute data transformation and modeling workflows.
- Writing optimized SQL queries for data retrieval, aggregation, and transformation to support downstream analytics applications.

What you need:
Basic Skills:
- Advanced skills in Python and PySpark for high-performance distributed data processing.
- Proficiency in creating data pipelines with orchestration frameworks like Apache Airflow or Azure Data Factory.
- Strong experience with Snowflake, SQL Data Warehouse, and Data Lake architectures.
- Ability to write, optimize, and troubleshoot complex SQL queries and stored procedures.
- Deep understanding of building and managing ETL/ELT workflows using tools such as DBT, Snowflake, or Azure Data Factory.
- Hands-on experience with cloud platforms such as Azure or AWS, including services like S3, Lambda, Glue, or Azure Blob Storage.
- Proficiency in designing and implementing data models, including star and snowflake schemas.
- Familiarity with distributed processing systems and concepts such as Spark, Hadoop, or Databricks.

Responsibilities:
- Develop robust, efficient, reusable pipelines to process and transform large-scale datasets using Python and PySpark (a minimal sketch follows this posting).
- Design pipeline workflows for batch and real-time data processing using orchestration tools like Apache Airflow or Azure Data Factory.
- Implement automated data ingestion frameworks to extract data from structured, semi-structured, and unstructured sources such as APIs, FTP, and data streams.
- Architect and optimize scalable Data Warehouse and Data Lake solutions using Snowflake, Azure Data Lake, or AWS S3.
- Implement partitioning, bucketing, and indexing strategies for efficient querying and data storage management.
- Develop ETL/ELT pipelines using tools like Azure Data Factory or Snowflake to handle complex data transformations and business logic.
- Integrate DBT (Data Build Tool) to automate data transformations, ensuring modularity and testability.
- Ensure pipelines are optimized for cost-efficiency and high performance, leveraging features such as pushdown optimization and parallel processing.
- Write, optimize, and troubleshoot complex SQL queries for data manipulation, aggregation, and reporting.
- Design and implement dimensional and normalized data models (e.g., star and snowflake schemas) for analytics use cases.
- Deploy and manage data workflows on cloud platforms (Azure or AWS) using services like AWS Glue, Azure Synapse Analytics, or Databricks.
- Monitor resource usage and costs, implementing cost-saving measures such as data lifecycle management and auto-scaling.
- Implement data quality frameworks to validate, clean, and enrich datasets.
- Build self-healing mechanisms to minimize downtime and ensure the reliability of critical pipelines.
- Optimize distributed data processing workflows for Spark by tuning configurations such as executor memory and partitioning.
- Conduct profiling and debugging of data workflows to identify and resolve bottlenecks.
- Collaborate with data analysts, scientists, and stakeholders to define requirements and deliver usable datasets.
- Maintain clear and comprehensive documentation for pipelines, workflows, and architectural decisions.
- Conduct code reviews to ensure best practices in coding and performance optimization.

Good To Have:
- Experience with real-time data processing frameworks such as Kafka or Kinesis.
- Certifications in Snowflake.
- Cloud certifications (Azure, AWS, GCP).
- Knowledge of data visualization platforms such as Power BI, Tableau, or Looker for integration purposes.

Personal Attributes:
- Ability to identify, troubleshoot, and resolve complex data issues effectively.
- Strong teamwork, communication skills, and intellectual curiosity to work collaboratively and effectively with cross-functional teams.
- Commitment to delivering high-quality, accurate, and reliable data products and solutions.
- Willingness to embrace new tools, technologies, and methodologies.
- Innovative thinker with a proactive approach to overcoming challenges.

At Allata, we value differences. Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability, or any other legally protected category. This policy applies to all terms and conditions of employment, including but not limited to recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
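To make the PySpark responsibilities above concrete, here is a minimal, hedged sketch of a batch pipeline: it sets a couple of the tuning knobs the posting mentions, cleans a dataset, and writes it partitioned for efficient downstream querying. All paths, column names, and configuration values are hypothetical assumptions, not details from the posting.

```python
# Minimal PySpark batch pipeline sketch (paths, columns, and settings are illustrative).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("orders-batch")
    # Example tuning knobs of the kind named in the posting; real values
    # depend on cluster size and data volume.
    .config("spark.sql.shuffle.partitions", "200")
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)

# Ingest semi-structured source data (hypothetical landing path).
raw = spark.read.json("s3://example-bucket/landing/orders/")

# Clean and standardize: drop duplicate keys, parse timestamps, derive a date column.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Write partitioned by date so downstream queries can prune partitions.
(
    clean.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")
)
```

Partitioning by a low-cardinality date column is one common strategy for the "efficient querying and data storage management" the responsibilities describe; the right partition key depends on actual query patterns.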
Allata is a fast-growing technology strategy and data development consulting firm delivering scalable solutions to enterprise clients. The mission at Allata is to inspire clients to achieve their most strategic goals through uncompromised delivery, active listening, and personal accountability. Thriving in fast-paced environments, the team at Allata works on complex problems, continually learns, and collaborates with colleagues to achieve collective success.

Allata is currently seeking an experienced Senior Accountant to oversee financial operations, ensure compliance with accounting principles, and support business decision-making. The ideal candidate must possess strong technical accounting expertise, familiarity with Indian financial regulations, and the ability to work closely with cross-functional teams, including management, HR, compliance, and external accounting firms.

Key Responsibilities:

Financial Accounting & Reporting:
- Involvement in filings of TDS, GST, and income tax returns, and statutory compliance
- Preparation of error-free accounting reports and presentation of results to management
- Analysis of financial information and summarization of financial status
- Identification of errors and suggestions for efficiency improvements
- Preparation of financial statements and budgets according to schedules
- Assistance with tax audits and returns
- Support for month-end and year-end close processes
- Directing internal and external audits to ensure compliance
- Development and documentation of business processes and accounting policies to maintain strong internal controls
- Ensuring compliance with GAAP principles

General Ledger & Compliance:
- Management of general ledger functions, ensuring timely reconciliation and accuracy
- Close collaboration with third-party accounting firms, Chartered Accountants (CAs), and auditors
- Provision of technical support and advice to the Management Accountant
- Review and recommendation of modifications to accounting systems and procedures
- Participation in financial standards setting and forecasting
- Input into the department's goal-setting process
- Planning, assigning, and reviewing staff work

Collaboration & Business Support:
- Close collaboration with Management, HR, and Compliance teams
- Collaboration with foreign clients and entities to ensure seamless financial operations
- Provision of insights into Indian company laws and tax regulations
- Collaboration with third-party accounting firms for compliance and audits

Requirements:

Basic Skills:
- Proven experience in Financial Control, Accounting Supervision, or Senior Accounting roles
- Thorough knowledge of basic accounting procedures and GAAP
- Awareness of business trends and financial best practices
- Strong understanding of financial accounting statements
- Experience with general ledger functions and month-end/year-end close processes
- Hands-on experience with accounting software such as Tally, FreshBooks, Zoho Books, and QuickBooks
- Advanced MS Excel skills, including VLOOKUPs and pivot tables
- Familiarity with ERP systems, particularly NetSuite
- Excellent accuracy, attention to detail, and problem-solving skills
- Aptitude for numbers and quantitative analysis
- Ability to identify discrepancies, analyze financial data, and suggest improvements

Good To Have:
- Understanding of Indian company laws and financial regulations
- Experience working with foreign clients and entities
- Basic understanding of US financial laws and taxation

Personal Attributes:
- Strong Analytical Thinking: Ability to analyze financial data, spot discrepancies, and suggest improvements
- Attention to Detail: Meticulous approach to financial transactions, ensuring accuracy and compliance
- Proactive & Solution-Oriented: Ability to identify challenges and implement effective solutions
- Strong Communication Skills: Ability to explain complex financial concepts to non-financial stakeholders

At Allata, diversity is valued. Allata is an equal opportunity employer that celebrates diversity and is committed to creating an inclusive environment for all employees. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.