Looking for a skilled data modeler with at least 8 years of experience in data modeling and a minimum of 4 years of proficiency with ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models. They should demonstrate expertise in ER Studio, including its advanced features, best practices, and macros. The candidate will be responsible for developing and maintaining data models to support business requirements, ensuring data quality, and collaborating with cross-functional teams to align data architecture with organizational goals. Strong analytical, problem-solving, and communication skills are essential for this role.

Location - Offshore (anywhere in India, remote); must work EST hours
Contract - 6-month contract with extensions
Interview process - Two video calls: hiring manager, then a panel interview
Why open - Demand for more data modelers on an ongoing project

Intake notes:
- Will be part of a squad focused on a specific business domain
- Understanding data and profiling source data to develop data models
- A data modeling working group has been established; the modelers meet a few times a week for peer review and to ensure standards are being followed

Top must-have: ER Studio experience
Job Types: Full-time, Contractual / Temporary
Contract length: 10 months
Pay: ₹3,000,000.00 - ₹3,500,000.00 per year
Schedule: US shift
Need 10+ years of experience (please do not apply if you have fewer than 9 years).
Top must-have: **ER Studio experience**

Looking for a skilled data modeler with at least 8 years of experience in data modeling and a minimum of 4 years of proficiency with ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models. They should demonstrate expertise in ER Studio, including its advanced features, best practices, and macros. The candidate will be responsible for developing and maintaining data models to support business requirements, ensuring data quality, and collaborating with cross-functional teams to align data architecture with organizational goals. Strong analytical, problem-solving, and communication skills are essential for this role.

Location - Offshore (anywhere in India, remote); must work EST hours
Contract - 6-month contract with extensions
Interview process - Two video calls: hiring manager, then a panel interview
Why open - Demand for more data modelers on an ongoing project

Intake notes:
- Will be part of a squad focused on a specific business domain
- Understanding data and profiling source data to develop data models
- A data modeling working group has been established; the modelers meet a few times a week for peer review and to ensure standards are being followed

Job Type: Full-time
Pay: ₹3,000,000.00 - ₹3,500,000.00 per year
Schedule: US shift
**Must be a Databricks SME**
Location - Offshore (anywhere in India, remote); must work EST hours (US shift)
Need 12+ years of experience.

5 must-haves:
1. Data expertise - 2+ years working with Azure Databricks pipelines, including shutting down clusters.
2. Unity Catalog migration - well versed; has done Terraform scripting in DevOps; can write and understand the code, understands the underlying logic, and can automate functionality.
3. Terraform expertise - 3+ years of building code.
4. Data mesh architecture - 2+ years; clear understanding of decoupling applications and running workloads in parallel; Microsoft Azure cloud platform.
5. Great problem solver.

Key responsibilities:
- Architect, configure, and optimize Databricks pipelines for large-scale data processing within an Azure data lakehouse environment.
- Set up and manage Azure infrastructure components including Databricks workspaces, Azure containers (AKS/ACI), storage accounts, and networking.
- Design and implement a monitoring and observability framework using tools such as Azure Monitor, Log Analytics, and Prometheus/Grafana.
- Collaborate with platform and data engineering teams to enable a microservices-based architecture for scalable and modular data solutions.
- Drive automation and CI/CD practices using Terraform, ARM templates, and GitHub Actions/Azure DevOps.

Required skills and experience:
- Strong hands-on experience with Azure Databricks, Delta Lake, and Apache Spark.
- Deep understanding of Azure services: Resource Manager, AKS, ACR, Key Vault, and networking.
- Proven experience in microservices architecture and container orchestration.
- Expertise in infrastructure-as-code, scripting (Python, Bash), and DevOps tooling.
- Familiarity with data governance, security, and cost optimization in cloud environments.

Bonus:
- Experience with event-driven architectures (Kafka/Event Grid).
- Knowledge of data mesh principles and distributed data ownership.
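The cluster-shutdown requirement in must-have 1 usually maps to Databricks auto-termination, which is set per cluster when the cluster is created. A minimal sketch of building such a Clusters API payload in Python, assuming Clusters API 2.0 field names; the runtime version and VM size below are placeholder values, not requirements from this posting:

```python
import json

def cluster_spec(name: str, idle_minutes: int = 30) -> dict:
    """Build a Databricks Clusters API payload that auto-terminates
    after `idle_minutes` of inactivity (the "shut down clusters"
    requirement). Databricks documents a 10-minute minimum when
    auto-termination is enabled."""
    if idle_minutes < 10:
        raise ValueError("auto-termination minimum is 10 minutes")
    return {
        "cluster_name": name,
        "spark_version": "13.3.x-scala2.12",  # placeholder runtime
        "node_type_id": "Standard_DS3_v2",    # placeholder Azure VM size
        "num_workers": 2,
        "autotermination_minutes": idle_minutes,
    }

# Serialize for a POST to the clusters/create endpoint.
payload = json.dumps(cluster_spec("etl-dev", idle_minutes=20))
```

The same payload shape can be expressed in Terraform's Databricks provider instead, which fits the infrastructure-as-code responsibilities listed above.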
Interview: Two rounds (first with the manager, second with the team)
Job Type: Full-time
Pay: ₹3,400,000.00 - ₹4,500,000.00 per year
Schedule: US shift
**Need a Databricks SME**
Location - Offshore (anywhere in India, remote); must work EST hours (US shift)
Need 10+ years of experience.

5 must-haves:
1. Data expertise - 2+ years working with Azure Databricks pipelines, including shutting down clusters.
2. Unity Catalog migration - well versed; has done Terraform scripting in DevOps; can write and understand the code, understands the underlying logic, and can automate functionality.
3. Terraform expertise - 3+ years of building code.
4. Data mesh architecture - 2+ years; clear understanding of decoupling applications and running workloads in parallel; Microsoft Azure cloud platform.
5. Great problem solver.

Key responsibilities:
- Architect, configure, and optimize Databricks pipelines for large-scale data processing within an Azure data lakehouse environment.
- Set up and manage Azure infrastructure components including Databricks workspaces, Azure containers (AKS/ACI), storage accounts, and networking.
- Design and implement a monitoring and observability framework using tools such as Azure Monitor, Log Analytics, and Prometheus/Grafana.
- Collaborate with platform and data engineering teams to enable a microservices-based architecture for scalable and modular data solutions.
- Drive automation and CI/CD practices using Terraform, ARM templates, and GitHub Actions/Azure DevOps.

Required skills and experience:
- Strong hands-on experience with Azure Databricks, Delta Lake, and Apache Spark.
- Deep understanding of Azure services: Resource Manager, AKS, ACR, Key Vault, and networking.
- Proven experience in microservices architecture and container orchestration.
- Expertise in infrastructure-as-code, scripting (Python, Bash), and DevOps tooling.
- Familiarity with data governance, security, and cost optimization in cloud environments.

Bonus:
- Experience with event-driven architectures (Kafka/Event Grid).
- Knowledge of data mesh principles and distributed data ownership.
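The Unity Catalog migration in must-have 2 typically involves remapping two-level `hive_metastore` table names (`schema.table`) to three-level Unity Catalog names (`catalog.schema.table`). A hypothetical helper sketching that remapping; the default catalog name `main` is an assumption for illustration, not something specified by this posting:

```python
def to_uc_name(hms_table: str, catalog: str = "main") -> str:
    """Map a two-level hive_metastore name (schema.table) to a
    three-level Unity Catalog name (catalog.schema.table).
    Raises ValueError if the input is not two-level."""
    parts = hms_table.split(".")
    if len(parts) != 2:
        raise ValueError(f"expected schema.table, got {hms_table!r}")
    schema, table = parts
    return f"{catalog}.{schema}.{table}"

# e.g. to_uc_name("sales.orders") yields "main.sales.orders"
```

In practice this kind of mapping is generated in bulk and fed into Terraform or migration scripts rather than applied table by table.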
Interview: Two rounds (first with the manager, second with the team)
Job Type: Full-time
Pay: ₹3,000,000.00 - ₹3,400,000.00 per year
Schedule: US shift