
airisDATA

6 Job openings at airisDATA
MicroStrategy Developer · Pune, Maharashtra, India · 5 years · Not disclosed · On-site · Contractual

Position: MicroStrategy Engineer
Experience: 5+ years
Location: Pune, India
Time type: Contract
Payroll: airisDATA

Job Description: We are looking for a seasoned MicroStrategy Engineer.
· 5+ years' experience in data and reporting technologies
· Knowledge of and hands-on experience with platforms and tools such as MicroStrategy and other reporting tools
· MicroStrategy skills: development of reports and cubes, performance improvements
· The role includes requirement gathering and completing end-to-end projects
· Leveraging cloud-native technologies to build and deploy data pipelines
· Implementation of MicroStrategy on cloud/Azure
· Working knowledge of Linux, Oracle, Hive, and Impala required (see the Hive sketch below)
· Postgres is a plus, but not mandatory
· Banking domain expertise is a plus, but not mandatory
· Administration skills are a plus
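
As a rough illustration of the "working knowledge of Hive" expectation above, here is a minimal sketch that runs a Hive query from Python with the open-source PyHive client. The host, credentials, database, and table names are placeholders, not airisDATA systems.

    # Minimal Hive query from Python using PyHive (pip install 'pyhive[hive]').
    # Host, user, database, and table below are hypothetical placeholders.
    from pyhive import hive

    conn = hive.Connection(
        host="hive.example.internal",  # placeholder HiveServer2 host
        port=10000,
        username="report_user",        # placeholder account
        database="sales",
    )
    cur = conn.cursor()
    # A typical aggregate that might feed a MicroStrategy report or cube.
    cur.execute(
        "SELECT region, SUM(amount) AS total "
        "FROM orders GROUP BY region ORDER BY total DESC LIMIT 10"
    )
    for region, total in cur.fetchall():
        print(region, total)
    cur.close()
    conn.close()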

IDMC Tech Lead · Pune, Maharashtra, India · 8 years · Not disclosed · On-site · Full Time

Position: IDMC Tech Lead / Architect
Experience: 8+ years
Location: Pune
Notice: Immediate to 30 days

Job Description: We are looking for a highly skilled Informatica IDMC (CDI and CDI-PC) specialist or architect to lead the migration from Informatica PowerCenter to Informatica Intelligent Data Management Cloud (IDMC). This role involves designing and implementing scalable data pipelines, ensuring data quality, and driving end-to-end cloud migration.

Key Responsibilities:
· Lead the migration from Informatica PowerCenter to IDMC (CDI/CDI-PC)
· Architect end-to-end data pipelines for data ingestion, transformation, and delivery
· Work with stakeholders to understand requirements and design effective data solutions
· Provide technical leadership to developers and ensure best practices
· Perform performance tuning and troubleshooting of IDMC workflows
· Keep up with Informatica updates and industry best practices
· Act as an IDMC subject matter expert during client discussions and design sessions (a short REST API sketch follows below)

Required Skills:
· Strong hands-on experience with IDMC (CDI, CDI-PC)
· Proven experience migrating from Informatica PowerCenter to IDMC
· Strong background in ETL, data integration, data quality, and governance
· Experience leading or architecting large-scale data projects
· Strong communication and stakeholder management skills
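
By way of orientation for the IDMC work described above, the sketch below authenticates against the IDMC v3 REST API and prints the session token that subsequent calls pass in the INFA-SESSION-ID header. The pod URL, credentials, and exact response fields follow Informatica's public documentation but should be verified against the org's own pod; everything below is a placeholder.

    # Hedged sketch: log in to the IDMC v3 REST API. The pod URL and
    # credentials are placeholders; field names follow Informatica's
    # documented v3 login response and may vary by release.
    import requests

    LOGIN_URL = "https://dm-us.informaticacloud.com/saas/public/core/v3/login"

    resp = requests.post(
        LOGIN_URL,
        json={"username": "user@example.com", "password": "********"},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()

    session_id = body["userInfo"]["sessionId"]
    base_url = body["products"][0]["baseApiUrl"]  # org-specific API base

    # Subsequent v3 calls send the session id in this header.
    headers = {"INFA-SESSION-ID": session_id}
    print("Authenticated; API base:", base_url)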

Denodo Developer / Administrator · Pune, Maharashtra, India · 8 years · Not disclosed · On-site · Full Time

Job Title: Junior Denodo Developer / Administrator
Location: Pune (Onsite)
Experience: 3–8 years (flexible as per need)

About the Role: We are seeking enthusiastic and skilled junior Denodo developers/administrators to join our Enterprise Data Virtualization Practice. This role is ideal for candidates who have completed Denodo training/certification and want hands-on experience in Denodo platform development, administration, and testing.

Key Responsibilities:
· Develop and configure data sources (JDBC, JSON/XML, Excel/CSV).
· Build and manage virtual databases, base views, derived views, and flatten views.
· Expose data services as REST/SOAP APIs and manage access controls (see the REST sketch below).
· Work on metadata, catalog management, caching strategies, and optimization techniques.
· Support Denodo platform administration, including installation, configuration, user roles, and performance monitoring.
· Manage environments with Solution Manager, code promotion, and backups (Bitbucket/Azure DevOps).
· Perform testing using the Denodo Testing Tool (DTT) and automate test validations.
· Collaborate with project teams to enable data virtualization, integration, and security.

Required Skills:
· Denodo developer training/course completion or certification.
· Must have experience with Denodo virtualization principles: connecting to data sources, views in Virtual DataPort, combining data sets, and catalog and metadata management.
· Good knowledge of SQL and data modelling concepts.
· Familiarity with REST/SOAP APIs.
· Strong analytical and troubleshooting skills.
· Willingness to work across development, administration, or support.

Nice-to-Have:
· Exposure to cloud platforms (Azure, AWS, GCP), preferably Azure.
· Awareness of ETL tools and data integration patterns.
· Experience with JMeter, Git, Bitbucket, Jenkins, or similar tools.
· Knowledge of Agile delivery models.
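
As context for the "expose data services as REST APIs" responsibility, here is a minimal sketch that reads a published Denodo view through the platform's RESTful web service (default path /denodo-restfulws). The host, port, virtual database, view, and credentials are placeholders, and the JSON response shape can vary by Denodo version.

    # Hedged sketch: query a Denodo view via the RESTful web service.
    # All names below are placeholders; verify the endpoint and response
    # shape against your Denodo version's documentation.
    import requests

    HOST = "https://denodo.example.internal:9443"
    DB, VIEW = "customer360", "bv_customers"   # virtual database / view

    resp = requests.get(
        f"{HOST}/denodo-restfulws/{DB}/views/{VIEW}",
        params={"$format": "json"},
        auth=("denodo_user", "********"),      # HTTP Basic auth placeholder
        timeout=30,
    )
    resp.raise_for_status()
    # Denodo typically wraps rows in an "elements" array.
    for row in resp.json().get("elements", []):
        print(row)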

BI Engineer (MicroStrategy Developer) · Pune, Maharashtra, India · 5 years · Not disclosed · On-site · Full Time

Role: BI Engineer (MicroStrategy Developer / Engineer)
Experience: 5+ years
Location: Pune (Work from Office, 5 days)
Notice Period: Immediate / Serving
Budget: As per company norms

About the Role: We are looking for an experienced MicroStrategy Developer with strong expertise in data, SQL, and reporting technologies. The role involves designing, developing, and optimizing MicroStrategy solutions, building data pipelines, and working with cloud technologies for deployment.

Key Responsibilities:
· Design, develop, and optimize MicroStrategy reports, dashboards, and cubes (a short mstrio-py sketch follows below).
· Write and optimize SQL queries for reporting and data analysis.
· Gather requirements and deliver end-to-end reporting and analytics solutions.
· Implement and manage MicroStrategy on Cloud / Azure.
· Build and deploy data pipelines using cloud-native technologies.
· Improve performance, troubleshoot issues, and ensure scalability of reporting solutions.
· Collaborate with cross-functional teams (data, testing, business) for successful project delivery.

Key Skills & Experience:
· Strong hands-on expertise with MicroStrategy development (reports, dashboards, cubes).
· Proficiency in SQL and relational databases.
· Working knowledge of Linux, Oracle, Hive, and Impala.
· Familiarity with Postgres (preferred, not mandatory).
· Banking domain knowledge (preferred, not mandatory).
· MicroStrategy administration skills are an added advantage.

Interested candidates: share your resumes at prathyusha@airisdata.com
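
For a feel of the development side, the sketch below pulls a MicroStrategy report into a pandas DataFrame using the open-source mstrio-py client. The Library URL, credentials, project name, and report ID are placeholders; the import paths follow recent mstrio-py releases and may differ in older versions.

    # Hedged sketch using mstrio-py (pip install mstrio-py).
    # URL, credentials, project, and report ID are placeholders.
    from mstrio.connection import Connection
    from mstrio.project_objects import Report

    conn = Connection(
        base_url="https://mstr.example.internal/MicroStrategyLibrary/api",
        username="report_user",
        password="********",
        project_name="Finance",   # placeholder project
    )

    report = Report(connection=conn, id="1234567890ABCDEF1234567890ABCDEF")
    df = report.to_dataframe()    # fetch report data as a pandas DataFrame
    print(df.head())

    conn.close()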

Kafka Developer/Engineer · Pune, Maharashtra, India · 5 years · Not disclosed · On-site · Full Time

Position: Kafka Developer
Experience: 5+ years
Location: Pune, India (Onsite)

Job Description: Seeking a highly skilled Kafka Developer with deep expertise in Kafka transformations and hands-on experience with MS SQL / PostgreSQL databases, especially in environments with non-standard schemas (e.g., tables without primary keys, with or without constraints and foreign keys). The ideal candidate will be responsible for designing and implementing scalable, event-based data pipelines (batch and real-time) and ensuring robust CDC (Change Data Capture) mechanisms for incremental data replication (a CDC connector sketch follows below).

Requirements:
· 5+ years of experience
· Deep understanding of Apache Kafka and the surrounding ecosystem (schema registries, Kafka Connect, Kafka Streams, Kafka client libraries, Spark Structured Streaming)
· Independently resolves issues when deploying and setting up infrastructure (such as cloud services) or applications (e.g., a Spring Boot application on App Service, or a Python Azure Function)
· Deep understanding of Kubernetes and Docker
· Deep knowledge of public clouds, especially Azure, with a focus on services such as Azure Functions, Azure Logic Apps, Azure App Service, Azure Kubernetes Service, OpenShift (Kubernetes in general), Azure Databricks, Azure Stream Analytics, Azure Event Hubs, Azure Service Bus, Azure Event Grid, and Azure Data Lake Gen2 / Azure Blob Storage
· Deep understanding of software design patterns and knowledge of popular programming languages (Java, C#, JavaScript, Python), with in-depth knowledge of at least one, preferably Java
· Managing the data pipeline
· Troubleshooting issues in Kafka and providing resolutions
· Deep knowledge of Kafka internals: cluster architecture, partitions, replication, and security
· Exposure to handling Kafka version migrations and patch updates
· Good understanding of disaster recovery setup
· Implementing best practices
· Designing Kafka clusters, ensuring data reliability, and performance tuning
· Exposure to capacity planning
· Knowledge of ZooKeeper, Schema Registry, Control Center, MirrorMaker 2, Cruise Control, Kafka Exporter, and Kafka Connect
· Good knowledge of setting up monitoring dashboards with tools like Grafana and Prometheus
· Implementing automations in Kafka

Interested candidates: send your resumes to prathyusha@airisdata.com
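
To make the CDC requirement concrete, here is a minimal sketch that registers a Debezium PostgreSQL source connector through the Kafka Connect REST API. The Connect URL, database coordinates, and table list are placeholders, and, as the posting's "tables without primary keys" point implies, such tables additionally need REPLICA IDENTITY FULL on the Postgres side so Debezium can emit complete change events.

    # Hedged sketch: register a Debezium Postgres CDC connector via the
    # Kafka Connect REST API. All hosts, credentials, and table names are
    # placeholders. For tables WITHOUT primary keys, also run:
    #   ALTER TABLE public.orders REPLICA IDENTITY FULL;
    import requests

    CONNECT_URL = "http://connect.example.internal:8083/connectors"

    connector = {
        "name": "orders-cdc",
        "config": {
            "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
            "plugin.name": "pgoutput",
            "database.hostname": "pg.example.internal",
            "database.port": "5432",
            "database.user": "cdc_user",
            "database.password": "********",
            "database.dbname": "appdb",
            "topic.prefix": "appdb",  # Debezium 2.x (1.x used database.server.name)
            "table.include.list": "public.orders",
        },
    }

    resp = requests.post(CONNECT_URL, json=connector, timeout=30)
    resp.raise_for_status()
    print("Connector created:", resp.json()["name"])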

Kafka Developer · Pune, Maharashtra, India · 8 years · Not disclosed · On-site · Full Time

Job Title: Kafka Developer
Location: Pune (Onsite)
Experience: 5–8 years (flexible as per need)

Responsibilities:
· Design and implement scalable, event-based data pipelines (batch & real-time).
· Work on CDC (Change Data Capture) mechanisms for incremental data replication.
· Manage and troubleshoot Kafka clusters, including partitions, replication, security, and performance tuning.
· Handle Kafka version migrations, patch updates, and disaster recovery setups.
· Implement automation and best practices for Kafka operations.
· Work on Kafka ecosystem components such as Kafka Connect, Kafka Streams, Schema Registry, MirrorMaker 2, ZooKeeper, and Cruise Control.
· Set up and monitor dashboards using Grafana, Prometheus, and other tools.
· Collaborate with teams to integrate Kafka with MS SQL / PostgreSQL databases, including non-standard schemas.
· Deploy and manage applications/infrastructure on Azure Cloud, Kubernetes, and Docker.

Required Skills:
· Strong expertise in Apache Kafka and the related ecosystem (a consumer sketch follows below).
· Hands-on experience with MS SQL / PostgreSQL.
· Good understanding of Kubernetes, Docker, and public clouds (Azure preferred).
· Familiarity with Azure services: Functions, Logic Apps, App Service, AKS, Databricks, Stream Analytics, Event Hubs, Service Bus, Event Grid, Data Lake Gen2, Blob Storage.
· Proficiency in at least one programming language (preferably Java; C#, Python, or JavaScript also valued).
· Knowledge of software design patterns and strong problem-solving skills.
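
As a small illustration of day-to-day Kafka client work referenced above, the sketch below consumes a topic with manual offset commits (at-least-once processing) using confluent-kafka-python. Broker, group, and topic names are placeholders.

    # Hedged sketch: at-least-once consumption with manual commits using
    # confluent-kafka-python (pip install confluent-kafka). Broker, group,
    # and topic names are placeholders.
    from confluent_kafka import Consumer, KafkaError

    consumer = Consumer({
        "bootstrap.servers": "broker.example.internal:9092",
        "group.id": "cdc-replicator",
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,   # commit only after processing succeeds
    })
    consumer.subscribe(["appdb.public.orders"])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                if msg.error().code() == KafkaError._PARTITION_EOF:
                    continue               # end of partition, not an error
                raise RuntimeError(msg.error())
            # Process the record, then commit its offset synchronously.
            print(msg.topic(), msg.partition(), msg.offset(), msg.value())
            consumer.commit(message=msg, asynchronous=False)
    finally:
        consumer.close()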