Technologies: Business Intelligence Jobs and Data Recruitment
Location: Tokyo, Japan
Type: Contract
Snowflake Architect: Key Responsibilities
Architecture & Design
Design and implement scalable Snowflake data warehouse architectures aligned with business requirements.
Define data models (Star Schema, Snowflake Schema, Data Vault) and best practices for structured and semi-structured data, preferably for SAP S/4 data sets.
Architect multi-cloud or hybrid-cloud data solutions leveraging Snowflake.
Experience with at least one full-lifecycle SAP S/4HANA ERP transformation program, delivering Reporting & Analytics solutions built on Snowflake using SAP S/4HANA or SAP ECC data.
Development & Implementation
Develop and optimize complex SQL queries, stored procedures, and Snowflake tasks/streams.
Implement ELT/ETL pipelines using tools such as dbt, Informatica, Matillion, or Apache Spark.
Configure Snowflake features including Virtual Warehouses, Snowpipe, Time Travel, Data Sharing, and Fail-safe.
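As a rough illustration of the tasks/streams work above, the following is a minimal Snowflake SQL sketch of an incremental-load pattern: a stream captures changes on a landing table and a scheduled task merges them downstream. All object names (raw_orders, orders, orders_stream, merge_orders_task, transform_wh) are hypothetical.

```sql
-- Illustrative only: all table/task/warehouse names are placeholders.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO orders tgt
  USING orders_stream src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.status = src.status
  WHEN NOT MATCHED THEN INSERT (order_id, status)
    VALUES (src.order_id, src.status);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```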
Performance & Optimization
Monitor and tune query performance, clustering keys, and materialized views.
Optimize cost management through warehouse sizing, auto-suspend/resume policies, and resource monitors.
Conduct capacity planning and performance benchmarking.
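The cost-management levers listed above can be sketched in Snowflake SQL roughly as follows; the warehouse name, monitor name, credit quota, and clustering column are hypothetical values for illustration.

```sql
-- Illustrative only: names and quota values are placeholders.
CREATE OR REPLACE RESOURCE MONITOR monthly_quota
  WITH CREDIT_QUOTA = 500
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

CREATE OR REPLACE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60          -- seconds idle before auto-suspend
  AUTO_RESUME = TRUE
  RESOURCE_MONITOR = monthly_quota;

-- Clustering key on a large fact table's common filter column.
ALTER TABLE sales_fact CLUSTER BY (sale_date);
```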
Security & Governance
Implement role-based access control (RBAC), row-level and column-level security.
Ensure data governance, lineage, and compliance with regulatory standards (GDPR, HIPAA, SOC 2).
Configure network policies, data masking, and encryption strategies.
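A minimal sketch of the RBAC and masking responsibilities above, in Snowflake SQL; the role, database, schema, table, and column names are assumptions for illustration, not a prescribed setup.

```sql
-- Illustrative only: role and object names are placeholders.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.sales TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.sales TO ROLE analyst_role;

-- Column-level security: mask email except for a privileged role.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
  RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'SECURITY_ADMIN' THEN val
       ELSE '*** MASKED ***' END;

ALTER TABLE analytics.sales.customers
  MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row-level security: restrict rows by region for non-admin roles.
CREATE OR REPLACE ROW ACCESS POLICY region_policy AS (region STRING)
  RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'GLOBAL_ADMIN' OR region = 'JP';

ALTER TABLE analytics.sales.customers
  ADD ROW ACCESS POLICY region_policy ON (region);
```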
Collaboration & Leadership
Work closely with data engineers, BI developers, and business analysts to understand data requirements.
Provide technical leadership, mentoring, and guidance to junior team members.
Participate in architectural reviews, sprint planning, and stakeholder presentations.
Documentation & Best Practices
Create and maintain architecture diagrams, technical documentation, and runbooks.
Establish coding standards, CI/CD pipelines, and version control practices for data assets.
Required Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
5+ years of experience in data warehousing and cloud data platform design.
3+ years of hands-on experience with Snowflake, including advanced features.
Strong proficiency in SQL and experience with scripting languages (Python, Shell, or Scala).
Experience with at least one major cloud platform: AWS, Azure, or GCP.
Proficiency in data modeling concepts (dimensional modeling, Data Vault 2.0).
Experience with ETL/ELT tools such as dbt, Informatica, Talend, or Fivetran.
Knowledge of DevOps practices including CI/CD, Git, and infrastructure-as-code (Terraform).