Numeric Technologies, based in India, specializes in providing data analytics, software development, and IT solution services.
Not specified
INR 6.0 - 8.0 Lacs P.A.
Hybrid
Full Time
Job Title: Data Engineer (AWS & Snowflake Admin/Support)
Location: Bangalore
Job Type: Full-time / Rotational Shift
Experience Required: 2+ years

Job Description:
We are seeking a motivated and skilled Data Engineer with hands-on experience in AWS cloud services, Snowflake administration/support, and Airflow orchestration. The ideal candidate will have a solid understanding of data engineering principles and cloud technologies, along with experience building and maintaining scalable data pipelines.
As part of the Data Engineering team, you will collaborate with cross-functional teams to optimize data storage, processing, and orchestration in a cloud-based environment. You will be responsible for Snowflake administration, troubleshooting and supporting data solutions, and ensuring the smooth operation of data workflows using Airflow.

Key Responsibilities:
AWS Cloud Engineering: Develop and maintain data pipelines and solutions on AWS services such as S3, EC2, Lambda, and Redshift.
Snowflake Administration: Administer, configure, and manage Snowflake environments; ensure optimal performance and availability of Snowflake databases.
Data Pipeline Development: Design and implement scalable, efficient data pipelines using AWS, Snowflake, and Airflow.
Airflow Orchestration: Implement and manage Apache Airflow for task orchestration, workflow automation, and scheduling data jobs (see the sketch after this listing).
Data Modeling: Develop and optimize data models and structures for analytics and reporting needs.
Troubleshooting & Support: Provide ongoing support for data systems and ensure smooth functioning of the entire data architecture.
Collaboration: Work closely with data scientists, analysts, and other engineering teams to deliver high-quality data products.
Security & Compliance: Ensure data security, privacy, and compliance with internal and external standards.

Key Skills & Qualifications:
Experience: Minimum 2 years of professional experience in data engineering, AWS, and Snowflake administration/support.
AWS: Expertise in AWS cloud services (S3, EC2, Lambda, Redshift, etc.) for data storage, processing, and automation.
Snowflake: Hands-on experience in Snowflake administration, monitoring, query optimization, and troubleshooting.
Airflow: Strong experience designing, scheduling, and managing data workflows using Apache Airflow.
SQL: Strong proficiency in SQL for querying, optimizing, and analyzing data.
Scripting/Programming: Experience with languages such as Python or Bash for automating tasks.
Data Modeling & ETL: Expertise in building and optimizing ETL pipelines and working with large-scale datasets.
Communication: Ability to communicate effectively with cross-functional teams and stakeholders.

Shift Requirement: Rotational shift flexibility to work various shifts as required by the business.
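As an illustration of the Airflow orchestration duties listed above, here is a minimal sketch of a daily DAG that triggers a hypothetical Snowflake load. The DAG id, task name, and load_to_snowflake callable are assumptions made for the example, not this team's actual pipeline; the import path and the schedule parameter assume Airflow 2.4 or later.

```python
# Minimal Airflow DAG sketch: schedule a hypothetical daily Snowflake load.
# DAG id, task name, and the loader function are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def load_to_snowflake(**context):
    # Placeholder for the real load logic (e.g. COPY INTO from an S3 stage).
    print(f"Loading partition for {context['ds']} into Snowflake")


with DAG(
    dag_id="daily_snowflake_load",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
) as dag:
    load_task = PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )
```

In a fuller pipeline this single task would typically sit between an extract step and downstream model refreshes, with retries and alerting configured on the DAG.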
Not specified
INR 6.0 - 8.0 Lacs P.A.
Hybrid
Full Time
Role & Responsibilities - Jira & Confluence Administrator:

Responsibilities and Deliverables:
Operate, maintain, and troubleshoot Jira and Confluence as part of the Atlassian product suite.
Support customers' ad-hoc questions through Slack and other platforms.
Assist customers with board configurations and tune JQL queries (see the sketch after this listing).
Investigate and remedy customer access and permissions issues.
Help customers with plug-in usage questions.
Install, test, and implement new versions of tools and apps in Jira and Confluence.
Create simple Jira workflows, including project workflows, screen schemes, permission schemes, and notification schemes.

Qualifications, Skills and Experience (Required):
1-3 years of experience in Help Desk/Technical Support using Atlassian products.

Skills and Experience (Preferred):
Experience with Jira Data Center, Confluence, and Jira Service Desk.
Experience working 24x7 shifts.
Knowledge of Atlassian Marketplace plugins, tools, and integrations.
Hands-on experience with AWS (EC2, Lambda, S3, RDS).

This is a rotational shift role.
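As an illustration of the JQL support work mentioned above, the sketch below runs a JQL query through Jira's REST search endpoint from Python. The instance URL, credentials, project key, and field list are placeholder assumptions; a real Data Center instance would use its own authentication (for example a personal access token) and whatever JQL the customer's board actually needs.

```python
# Rough sketch: run a JQL query via Jira's REST API. Instance URL, credentials,
# and the example JQL are placeholders for illustration only.
import requests

JIRA_BASE = "https://jira.example.com"            # placeholder instance URL
AUTH = ("support.user", "api-token-or-password")  # placeholder credentials

# Example JQL: open bugs in a project, most recently updated first.
jql = "project = DEMO AND issuetype = Bug AND status != Done ORDER BY updated DESC"

resp = requests.get(
    f"{JIRA_BASE}/rest/api/2/search",
    params={"jql": jql, "maxResults": 20, "fields": "key,summary,status"},
    auth=AUTH,
    timeout=30,
)
resp.raise_for_status()

for issue in resp.json().get("issues", []):
    print(issue["key"], issue["fields"]["summary"])
```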
Not specified
INR 0.5 - 3.0 Lacs P.A.
Hybrid
Full Time
Job Title: Databricks Administrator
Experience: 6+ Years
Location: Hyderabad/Hybrid
Job Type: Full-time

Job Description:
We are seeking an experienced Databricks Administrator with 6+ years of expertise in managing and optimizing Databricks environments. The ideal candidate should have hands-on experience with Azure/AWS Databricks, cluster management, security configurations, and performance optimization. This role requires close collaboration with data engineering and analytics teams to ensure smooth operations and scalability.

Key Responsibilities:
Deploy, configure, and manage Databricks workspaces, clusters, and jobs (see the sketch after this listing).
Monitor and optimize Databricks performance, auto-scaling, and cost management.
Implement security best practices, including role-based access control (RBAC) and encryption.
Manage Databricks integration with cloud storage (Azure Data Lake, S3, etc.) and other data services.
Automate infrastructure provisioning and management using Terraform, ARM templates, or CloudFormation.
Troubleshoot Databricks runtime issues, job failures, and performance bottlenecks.
Support CI/CD pipelines for Databricks workloads and notebooks.
Collaborate with data engineering teams to enhance ETL pipelines and data processing workflows.
Ensure compliance with data governance policies and regulatory requirements.
Maintain and upgrade Databricks versions and libraries as needed.

Required Skills & Qualifications:
6+ years of experience as a Databricks Administrator or in a similar role.
Strong knowledge of Azure/AWS Databricks and cloud computing platforms.
Hands-on experience with Databricks clusters, notebooks, libraries, and job scheduling.
Expertise in Spark optimization, data caching, and performance tuning.
Proficiency in Python, Scala, or SQL for data processing.
Experience with Terraform, ARM templates, or CloudFormation for infrastructure automation.
Familiarity with Git, DevOps, and CI/CD pipelines.
Strong problem-solving skills and ability to troubleshoot Databricks-related issues.
Excellent communication and stakeholder management skills.

Preferred Qualifications:
Databricks certifications (e.g., Databricks Certified Associate/Professional).
Experience with Delta Lake, Unity Catalog, and MLflow.
Knowledge of Kubernetes, Docker, and containerized workloads.
Experience with big data ecosystems (Hadoop, Apache Airflow, Kafka, etc.).

Why Join Us?
Opportunity to work on cutting-edge Databricks solutions.
Competitive salary and benefits package.
Career growth and upskilling opportunities.

Apply Now!
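To make the cluster administration items above concrete, here is a hedged sketch that lists clusters and their states through the Databricks REST API. The workspace host and token are placeholders, and a real setup might equally use the official Databricks SDK or Terraform; this is only one minimal way to poll workspace state.

```python
# Hedged sketch: list Databricks clusters and their states via the REST API.
# Workspace URL and token are placeholders; adapt to your cloud and auth setup.
import os

import requests

DATABRICKS_HOST = os.environ.get(
    "DATABRICKS_HOST", "https://adb-0000000000000000.0.azuredatabricks.net"  # placeholder
)
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token (assumed)

resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    # Surface the basics an admin checks first: name, state, node type, autoscale range.
    autoscale = cluster.get("autoscale", {})
    print(
        cluster["cluster_name"],
        cluster["state"],
        cluster.get("node_type_id"),
        f"{autoscale.get('min_workers', '-')}-{autoscale.get('max_workers', '-')} workers",
    )
```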
Not specified
INR 30.0 - 35.0 Lacs P.A.
Remote
Full Time
Shift: 10 PM - 6 AM (Remote)
Immediate joiners only. Please do not apply if you are not comfortable with these shifts.

SAP CPI-DS Developer

Key Responsibilities:
Monitor data integration tasks between HANA DBX and SAP IBP.
Investigate task failures and identify issues using data integration logs.
Inform the relevant teams in case of system downtimes, particularly if HANA Studio is inaccessible.
Perform basic troubleshooting and coordinate with technical teams to restore failed tasks.
Analyze integration templates and validate data using SAP HANA Studio and IBP interfaces.
Execute and interpret basic SQL queries to assist in root cause analysis (see the sketch after this listing).
Maintain accurate documentation of support activities and resolutions.
Collaborate with other IT and functional teams to ensure seamless data flow and issue resolution.

Required Skills & Qualifications:
Basic hands-on experience with SAP IBP and SAP HANA Studio.
Working knowledge of data integration logs and data templates in SAP IBP.
Ability to write and understand basic SQL queries.
Expertise with SAP CPI-DS.
Good problem-solving skills and attention to detail.
Strong communication skills to coordinate effectively with technical teams.
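For the SQL-assisted root cause analysis above, a support engineer would typically filter recent integration log records for failures. The sketch below uses the SAP HANA Python client (hdbcli); the host, credentials, and especially the INTEGRATION_TASK_LOG table and its columns are hypothetical stand-ins, since the actual CPI-DS/IBP logging structures depend on the landscape.

```python
# Hedged sketch: pull recent failed integration task records from SAP HANA.
# Connection details and the INTEGRATION_TASK_LOG table/columns are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",  # placeholder host
    port=30015,                       # placeholder SQL port
    user="SUPPORT_USER",
    password="********",
)

try:
    cursor = conn.cursor()
    # Hypothetical log table: adjust schema, table, and columns to the real landscape.
    cursor.execute(
        """
        SELECT TASK_NAME, STATUS, ERROR_TEXT, END_TIME
        FROM INTEGRATION_TASK_LOG
        WHERE STATUS = 'FAILED'
          AND END_TIME > ADD_DAYS(CURRENT_TIMESTAMP, -1)
        ORDER BY END_TIME DESC
        """
    )
    for task_name, status, error_text, end_time in cursor.fetchall():
        print(end_time, task_name, status, error_text)
finally:
    conn.close()
```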
Not specified
INR 30.0 - 45.0 Lacs P.A.
Work from Office
Full Time
Position: SAP FICO Senior Consultant
Location: Hyderabad/Bangalore (Hybrid / Remote)
Experience: 8-12 Years
Employment Type: Full-Time
Availability: Immediate joiners to 45 days' notice

SAP FI / FICO Senior Consultant
Experience: 8-12 years in SAP FICO, with at least 4 years in SAP S/4HANA (2020+ preferred)

Skills:
SAP FI: GL, AP, AR, Asset Accounting (AA)
SAP CO: Cost Center Accounting, Profit Center Accounting, Internal Orders
S/4HANA Finance: Universal Journal (ACDOCA), New Asset Accounting, Margin Analysis
Integration: MM, SD, PP modules
Localization: GST, TDS, Withholding Tax (preferred)
Role: Individual Contributor - Support, Enhancements, and New Projects
Preferred: SAP S/4HANA Certification, SAP Activate, Agile/Scrum exposure

For immediate response and further opportunities, connect with me on LinkedIn: https://www.linkedin.com/in/hrushikesh-a-74a32126a/
Not specified
INR 40.0 - 50.0 Lacs P.A.
Remote
Full Time
Role: SAP BRIM Consultant
Location: Remote
Experience: 5-15+ years total, with a minimum of 3+ years of hands-on experience in SAP BRIM

Primary Skills:
SAP BRIM (Billing and Revenue Innovation Management)
SAP FI-CA (Financial Contract Accounting)
SAP Convergent Invoicing (SAP CI): billing and invoice processing

Key Responsibilities:
Analyze and address billing and revenue management needs with stakeholders.
Support and optimize SAP BRIM solutions:
SAP CI: billing and invoicing.
SAP FI-CA: revenue and receivables accounting.
Integrate SAP BRIM with SD and FI/CO modules.
Create functional specs for custom developments; collaborate with technical teams.
Resolve issues to ensure continuity and client satisfaction.
Stay updated on SAP BRIM advancements for strategic guidance.

Let's Connect!
Email: Hrushikesh.akkala@numerictech.com
Phone: 9700111702
Not specified
INR 0.5 - 3.0 Lacs P.A.
Remote
Full Time
Location: Bangalore - Remote
Overall Experience: 7+ Years
Notice Period: Immediate
Shift: 10:00 PM - 6:00 AM IST (Night Shift)

Senior Data Analyst (7+ Years Experience)
We are looking for a Senior Data Analyst with 7+ years of experience in data analytics, business intelligence, and SQL to extract insights and support business strategies.

Key Responsibilities:
Analyze large datasets and develop interactive dashboards using Power BI, Tableau, or Looker.
Perform statistical analysis, predictive modeling, and A/B testing (see the sketch after this listing).
Optimize complex SQL queries and work with big data technologies (Snowflake, Hadoop, Spark).
Automate data processes using Python or R.
Ensure data integrity, governance, and quality checks.

Required Skills:
7+ years in data analytics & BI
Expertise in SQL, Python, R, and ETL processes
Strong experience with BI tools (Power BI, Tableau, Looker, etc.)
Knowledge of statistical modeling and forecasting
Familiarity with cloud data warehouses (Snowflake, Redshift, BigQuery)
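As a small illustration of the A/B testing responsibility above, the sketch below compares a metric between two variants with Welch's t-test. The data is synthetic and the 5% significance threshold is a conventional assumption, not a standard prescribed by this role; a real analysis would pull per-variant metrics from the warehouse.

```python
# Minimal A/B test sketch: compare a metric between two variants with Welch's t-test.
# The data is synthetic; real analysis would query variant metrics from the warehouse.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
variant_a = rng.normal(loc=10.0, scale=2.0, size=500)  # e.g. session duration, control
variant_b = rng.normal(loc=10.4, scale=2.0, size=500)  # e.g. session duration, treatment

t_stat, p_value = stats.ttest_ind(variant_a, variant_b, equal_var=False)

print(f"mean A = {variant_a.mean():.2f}, mean B = {variant_b.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # conventional significance threshold (assumption)
    print("Difference is statistically significant at the 5% level.")
else:
    print("No statistically significant difference detected.")
```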
Not specified
INR 18.0 - 33.0 Lacs P.A.
Hybrid
Full Time
Not specified
INR 20.0 - 35.0 Lacs P.A.
Remote
Full Time
Not specified
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
1. Are background checks strict?
A. Yes, employment and education are verified thoroughly.
2. Do they conduct hackathons?
A. Yes, both internal and external hackathons are conducted.
3. Do they offer joining bonuses?
A. Sometimes, especially for experienced or niche skills.
4. Do they offer upskilling programs?
A. Yes, they provide training via internal platforms.
5. Is prior experience necessary?
A. Not for fresher roles, but beneficial for lateral entries.
6. Is relocation required?
A. Yes, depending on project allocation and office location.
7. What are the common coding questions?
A. Array, string manipulation, and database joins.
8. What are the growth opportunities?
A. Clear promotion cycles and cross-functional roles exist.
9. What is the notice period?
A. Typically ranges from 30 to 90 days depending on level.
10. What is their work timing?
A. Mostly 9 to 6 with flexibility depending on the team.