Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Position: Data Architect
Skills: GCP, DA, Development, SQL, Python, BigQuery, Dataproc, Dataflow, Data Pipelines
Experience: 10+ years

Roles and Responsibilities:
• 10+ years of relevant work experience, including previous experience leading data-related projects in the field of reporting and analytics.
• Design, build, and maintain scalable data lakes and data warehouses in the cloud (GCP).
• Expertise in gathering business requirements, analysing business needs, and defining the BI/DW architecture to support and deliver technical solutions to complex business and technical requirements.
• Create solution prototypes and participate in technology selection; perform POCs and technical presentations.
• Architect, develop, and test scalable data warehouse and data pipeline architectures in cloud technologies (GCP).
• Experience with SQL and NoSQL DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB.
• Design and develop scalable ETL processes, including error handling.
• Expert in query and programming languages: MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, R.
• Prepare data structures for advanced analytics and self-service reporting using MS SQL, SSIS, SSRS.
• Write scripts for stored procedures, database snapshots, backups, and data archiving.
• Experience with any of these cloud-based technologies:
  o Power BI/Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake
  o AWS Redshift, Glue, Athena, AWS QuickSight
  o Google Cloud Platform

Good to have:
• Agile development environment pairing DevOps with CI/CD pipelines
• AI/ML background

Interested candidates, share your CV at dikshith.nalapatla@motivitylabs.com
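The posting above asks for scalable ETL processes with error handling. As a purely illustrative sketch (not part of the posting; all names, including `run_pipeline`, `transform`, and the sample CSV columns, are invented), a common pattern is to quarantine bad records rather than abort the whole load:

```python
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform(row):
    """Parse one raw CSV row into a typed record; raises ValueError on bad data."""
    return {"user_id": int(row["user_id"]), "amount": float(row["amount"])}

def run_pipeline(raw_csv):
    """Extract rows from CSV text and transform them, collecting failures
    per record instead of failing the entire batch."""
    loaded, failed = [], []
    # enumerate starts at 2 because line 1 of the CSV is the header
    for lineno, row in enumerate(csv.DictReader(io.StringIO(raw_csv)), start=2):
        try:
            loaded.append(transform(row))
        except (ValueError, KeyError) as exc:
            log.warning("skipping line %d: %s", lineno, exc)
            failed.append({"line": lineno, "row": row})
    return loaded, failed

if __name__ == "__main__":
    sample = "user_id,amount\n1,10.5\noops,3.0\n2,7.25\n"
    ok, bad = run_pipeline(sample)
    print(len(ok), len(bad))  # 2 good records, 1 quarantined
```

In a production pipeline the quarantined records would typically be written to a dead-letter table for inspection rather than kept in memory.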
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Skills and Qualifications:
• Overall 3-5 years of hands-on experience as a Data Engineer, with at least 2-3 years of direct Azure/AWS/GCP data engineering experience.
• Strong SQL and Python development skills are mandatory.
• Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies.
• Demonstrated knowledge of and experience with Google Cloud BigQuery is a must.
• Experience with Dataproc and Dataflow is highly preferred.
• Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks.
• Extensive experience in SQL across various database platforms.
• Experience in data mapping and data modeling.
• Familiarity with data analytics tools and best practices.
• Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX shell.
• Practical experience with Google Cloud services including, but not limited to:
  o BigQuery, Bigtable
  o Cloud Dataflow, Cloud Dataproc
  o Cloud Storage, Pub/Sub
  o Cloud Functions, Cloud Composer
  o Cloud Spanner, Cloud SQL
• Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark).
• Familiarity with GCP tools such as Looker, Airflow DAGs, Data Studio, App Maker, etc.
• Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
• GCP Data Engineer certification is highly preferred.
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Roles & Responsibilities:
• Be responsible for the development of conceptual, logical, and physical data models.
• Work with application/solution teams to implement data strategies, build data flows, and develop/execute logical and physical data models.
• Implement and maintain data analysis scripts using SQL and Python.
• Develop and support reports and dashboards using Google Plx/Data Studio/Looker.
• Monitor performance and implement necessary infrastructure optimizations.
• Demonstrate the ability and willingness to learn quickly and complete large volumes of work with high quality.
• Demonstrate excellent collaboration, interpersonal communication, and written skills, with the ability to work in a team environment.

Minimum Qualifications:
• Hands-on experience with the design, development, and support of data pipelines.
• Strong SQL programming skills (joins, subqueries, queries with analytical functions, stored procedures, functions, etc.).
• Hands-on experience using statistical methods for data analysis.
• Experience with data platform and visualization technologies such as Google PLX dashboards, Data Studio, Tableau, Pandas, Qlik Sense, Splunk, Humio, and Grafana.
• Experience in web development technologies such as HTML, CSS, jQuery, and Bootstrap.
• Experience with machine learning packages such as scikit-learn, NumPy, SciPy, Pandas, NLTK, BeautifulSoup, Matplotlib, and statsmodels.
• Strong design and development skills with meticulous attention to detail.
• Familiarity with Agile software development practices and working in an agile environment.
• Strong analytical, troubleshooting, and organizational skills.
• Ability to analyse and troubleshoot complex issues, with proficiency in multitasking.
• Ability to navigate ambiguity.
• BS degree in Computer Science, Math, Statistics, or equivalent academic credentials.
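The qualifications above call for SQL queries with analytical (window) functions. As a hypothetical illustration (table and column names are invented, not from the posting), Python's built-in sqlite3 module can run such a query, since the bundled SQLite supports window functions from version 3.25 onward:

```python
import sqlite3

# In-memory database standing in for a real warehouse table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 100.0), ("south", 250.0), ("north", 80.0), ("north", 120.0)],
)

# Analytical (window) function: rank rows within each region by amount.
rows = con.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()
for region, amount, rnk in rows:
    print(region, amount, rnk)
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to BigQuery and the other SQL platforms named in the posting.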
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Job Title: GCP Data Engineer

Overview:
We are looking for a skilled GCP Data Engineer with 5+ years of hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment.

Key Responsibilities:
• Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
• Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services.
• Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements.
• Design data models suitable for both transactional and big data environments, supporting machine learning workflows.
• Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
• Develop and implement data and semantic interoperability specifications.
• Work closely with business teams to define and scope requirements.
• Analyze existing systems to identify appropriate data sources and drive continuous improvement.
• Implement and continuously enhance automation processes for data ingestion and data transformation.
• Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
• Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.

Skills and Qualifications:
• Overall 5 years of hands-on experience as a Data Engineer, with at least 3 years of direct GCP data engineering experience.
• Strong SQL and Python development skills are mandatory.
• Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies.
• Demonstrated knowledge of and experience with Google Cloud BigQuery is a must.
• Experience with Dataproc and Dataflow is highly preferred.
• Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks.
• Extensive experience in SQL across various database platforms.
• Experience in data mapping and data modeling.
• Familiarity with data analytics tools and best practices.
• Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX shell.
• Practical experience with Google Cloud services including, but not limited to:
  o BigQuery, Bigtable
  o Cloud Dataflow, Cloud Dataproc
  o Cloud Storage, Pub/Sub
  o Cloud Functions, Cloud Composer
  o Cloud Spanner, Cloud SQL
• Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark).
• Familiarity with GCP tools such as Looker, Airflow DAGs, Data Studio, App Maker, etc.
• Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
• GCP Data Engineer certification is highly preferred.
Hyderabad, Telangana
Not disclosed
On-site
Full Time
As a Looker Developer/Data Analyst with over 5 years of experience, you will play a crucial role in enhancing the user experience of our products and dashboards. Your primary responsibilities will revolve around collecting, analyzing, and visualizing data to identify trends and patterns that contribute to improving the overall user interface. You will collaborate with the 3PDC Leadership team to create clear and concise reports that facilitate informed decision-making.

Your expertise in data analysis, visualization, and user experience design will be instrumental in driving dashboard development needs and enhancing UI improvements. Working closely with the 3PDC analytics manager, you will leverage your strong SQL skills and experience with Looker to implement data-driven changes effectively. Staying updated on the latest data analysis tools and techniques will be essential to excel in this role.

In addition to your technical skills, your ability to prepare project documentation and collaborate with cross-functional teams will be key to your success. A Bachelor's degree in data science, computer science, or a related field, along with at least 3 years of experience in data analysis and visualization, are necessary qualifications for this position. Experience with ETL pipelines, Google BigQuery, and the GCP platform will be advantageous.

If you have a passion for learning, a proactive attitude towards implementation, and excellent communication and presentation skills, we encourage you to apply for this exciting opportunity by sharing your resume at swapna.malluri@motivitylabs.com.
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Required Skills:
• 5+ years of experience as a Full Stack Software Engineer, with a focus on scalable web applications.
• Deep experience with backend development using Java, Node.js, Python, or Go.
• Strong proficiency in JavaScript/TypeScript and front-end frameworks (React, Angular, or Vue.js).
• Expertise in building RESTful APIs and event-driven architectures.
• Hands-on experience with cloud platforms (AWS, GCP, Azure) and serverless computing.
• Strong knowledge of containerization (Docker, Kubernetes) and infrastructure as code (Terraform, Ansible).
• Familiarity with secure coding practices and software supply chain security principles.
Hyderabad, Telangana
Not disclosed
On-site
Full Time
As a Looker Developer/Data Analyst with over 5 years of experience, you will play a crucial role in enhancing the user experience of our products and dashboards. Your main responsibilities will involve collecting, analyzing, and visualizing data to identify trends and patterns that can be leveraged to improve the overall user interface.

Your primary tasks will include:
- Analyzing data to discover insights that enhance the user experience of our products and dashboards.
- Creating clear and concise data visualizations for easy comprehension by the 3PDC Leadership team and other stakeholders.
- Collaborating with the 3PDC analytics manager to implement changes based on data analysis findings.
- Staying updated on the latest data analysis and visualization tools and techniques.
- Generating project documentation such as User Guides, Developer Guides, and Unit Testing validation documents.
- Demonstrating a strong willingness to learn and the aptitude to effectively implement new concepts.

To qualify for this role, you should possess:
- A Bachelor's degree in data science, computer science, or a related field.
- At least 3 years of experience in data analysis and visualization.
- Proficiency in user experience design, dashboard development, data analysis, testing, and code implementation in production environments.
- Strong SQL skills and hands-on experience with the data visualization tool Looker.
- Experience in building Extract, Transform, Load (ETL) pipelines.
- Familiarity with Google BigQuery and the GCP platform would be advantageous.
- Excellent communication and presentation skills.

If you are passionate about data analysis, visualization, and enhancing user experiences, we encourage you to share your resume with us at swapna.malluri@motivitylabs.com.
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Requirements:
§ Bachelor's degree in computer science, engineering, or a related field; a master's degree is a plus.
§ Minimum of 6 years of experience in test automation, with a focus on web-based applications and APIs.
§ Strong proficiency in test automation frameworks such as Selenium, Kotlin, or similar tools.
§ Excellent programming skills in languages such as Java, C#, or Python.
§ Experience with software development and project management tools such as TFS and Jira.
§ Experience with version control systems (e.g., Git), build tools (e.g., Maven, Gradle), and continuous integration/continuous deployment (CI/CD) pipelines.
§ Solid understanding and demonstration of software testing principles, methodologies, and best practices.
§ Proven experience in designing and implementing test strategies, test plans (manual and automation), and test cases.
§ Familiarity with Agile/Scrum development methodologies and practices.
§ Strong analytical and problem-solving skills, with the ability to identify and troubleshoot issues.
§ Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
§ Working experience in pilot/POC projects.
§ Experience in feasibility assessment and automation setup.
§ Knowledge of cloud platforms (e.g., AWS, Azure) and related testing methodologies.
§ Understanding of security testing principles and practices.
§ Certifications in software testing (e.g., ISTQB, CSTE) or related fields.
Noida, Uttar Pradesh
Not disclosed
On-site
Full Time
You will be responsible for the following roles and responsibilities:
- Possessing a minimum of 8 years of experience in OpenText Digital Asset Management/OpenText Media Management (OTMM).
- Demonstrating functional knowledge of Digital Asset Management, asset ingestion, and system integration.
- Ideally having experience with OTMM version 16.5.x and higher.
- Showcasing troubleshooting and analysis skills on OTMM servers and associated UIs.
- Exhibiting sound admin skills specific to OTMM, EPS, and MFT on the Admin console.
- Conducting maintenance and housekeeping activities in the OTMM environment.
- Configuring settings on OTMM, including metadata, security, policy, user, and advanced search.
- Performing customizations on OTMM and associated application servers.
- Ensuring sync and consolidation of Media Manager and directory services.
- Handling OTMM patch installation and upgrades.
- Integrating with external/legacy applications using OTMM REST APIs and web services.
- Defining new architectures and driving independent projects.
- Demonstrating excellent communication, articulation, and analytical skills.

If you meet the above requirements and possess the necessary skills, please share your resume at swapna.malluri@motivitylabs.com.