prefab instance
@@ -8,7 +8,8 @@
       "Bash(cargo tree:*)",
       "WebFetch(domain:docs.rs)",
       "Bash(findstr:*)",
-      "Bash(cargo check:*)"
+      "Bash(cargo check:*)",
+      "Bash(ls:*)"
     ]
   }
 }
.gitignore (vendored): 3 changes
@@ -16,3 +16,6 @@ target/
 # and can be added to the global gitignore or merged into this file. For a more nuclear
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 #.idea/
+
+# Test data (cloned Unity projects for integration tests)
+test_data/
ROADMAP.md: 402 changes
@@ -1,402 +0,0 @@
# Cursebreaker Unity Parser - Implementation Roadmap

## Overview

This roadmap breaks down the development into 5 phases, each building on the previous. Each phase has clear deliverables and success criteria.

---

## Phase 1: Project Foundation & YAML Parsing ✅ COMPLETED

**Goal**: Set up project structure and implement basic YAML parsing for Unity files

### Tasks

1. **Project Setup**
   - [x] Initialize Cargo project with workspace structure
   - [x] Add core dependencies (yaml parser, serde, thiserror)
   - [x] Set up basic module structure (lib.rs, parser/, model/, error.rs)
   - [x] Configure Cargo.toml with metadata and feature flags

2. **Error Handling**
   - [x] Define error types (ParseError, ReferenceError, etc.)
   - [x] Implement Display and Error traits
   - [x] Set up Result type aliases

3. **YAML Document Parser**
   - [x] Implement Unity YAML document reader
   - [x] Parse YAML 1.1 header and Unity tags
   - [x] Split multi-document YAML files into individual documents
   - [x] Handle `%TAG !u! tag:unity3d.com,2011:` directive

4. **Unity Tag Parser**
   - [x] Parse Unity type tags (`!u!1`, `!u!224`, etc.)
   - [x] Extract type ID from tag
   - [x] Handle anchor IDs (`&12345`)

5. **Basic Testing**
   - [x] Set up test infrastructure
   - [x] Create minimal test YAML files
   - [x] Unit tests for YAML splitting and tag parsing
   - [x] Integration test: parse simple Unity file

### Deliverables
- [x] ✓ Working Cargo project structure
- [x] ✓ YAML documents successfully split from Unity files
- [x] ✓ Unity type IDs and file IDs extracted
- [x] ✓ Basic error handling in place
- [x] ✓ Tests passing

### Success Criteria
- [x] Can read `Scene01MainMenu.unity` and split into individual documents
- [x] Each document has correct type ID and file ID
- [x] No panics on malformed input (returns errors)

**Implementation Notes:**
- Created comprehensive error handling with thiserror
- Implemented regex-based Unity tag parser with caching
- Built YAML document splitter that handles multi-document files
- Created model with UnityFile and UnityDocument structs
- Added 23 passing tests (12 unit, 7 integration, 4 doc tests)
- Successfully parses real Unity files from the PiratePanic sample project

---

## Phase 2: Data Model & Property Parsing

**Goal**: Build the core data model and parse Unity properties into structured data

### Tasks

1. **Core Data Structures**
   - [x] Implement `UnityDocument` struct
   - [x] Implement `UnityFile` struct
   - [x] Create property storage (PropertyMap using IndexMap)
   - [x] Define FileID and LocalID types

2. **Property Value Types**
   - [x] Implement `PropertyValue` enum (Integer, Float, String, Boolean, etc.)
   - [x] Add Vector3, Color, Quaternion value types
   - [x] Add Array and nested Object support
   - [x] Implement Debug and Display for PropertyValue

3. **Property Parser**
   - [x] Parse YAML mappings into PropertyMap
   - [x] Handle nested properties (paths like `m_Component[0].component`)
   - [x] Parse Unity-specific formats:
     - [x] `{fileID: N}` references
     - [x] `{x: 0, y: 0, z: 0}` vectors
     - [x] `{r: 1, g: 1, b: 1, a: 1}` colors
     - [x] `{guid: ..., type: N}` external references

4. **GameObject & Component Models**
   - [x] Create specialized GameObject struct
   - [x] Create base Component trait/struct
   - [x] Add common component types (Transform, RectTransform, etc.)
   - [x] Helper methods for accessing common properties

5. **Testing**
   - [x] Unit tests for property parsing
   - [x] Test all PropertyValue variants
   - [x] Integration test: parse GameObject with components
   - [x] Snapshot tests using sample Unity files

### Deliverables
- [x] ✓ Complete data model implemented
- [x] ✓ Properties parsed into type-safe structures
- [x] ✓ GameObject and Component abstractions working
- [x] ✓ All property types handled correctly

### Success Criteria
- [x] Parse entire `CardGrabber.prefab` correctly
- [x] Extract all GameObject properties (name, components list)
- [x] Extract all Component properties with correct types
- [x] Can access nested properties programmatically

---

## Phase 3: Reference Resolution & Unity Type System

**Goal**: Resolve references between objects and implement Unity's type system

### Tasks

1. **Reference Types**
   - [x] Implement `FileReference` struct (fileID + optional GUID)
   - [x] Implement `LocalReference` (within-file references)
   - [x] Implement `ExternalReference` (cross-file GUID references)
   - [x] Add reference equality and comparison

2. **Type ID Mapping**
   - [x] Create Unity type ID → class name mapping
   - [x] Common types: GameObject(1), Transform(4), MonoBehaviour(114), etc.
   - [x] Load type mappings from data file or hardcode common ones
   - [x] Support unknown type IDs gracefully

3. **Reference Resolution**
   - [x] Implement within-file reference resolution
   - [x] Cache resolved references for performance
   - [x] Handle cyclic references safely
   - [x] Detect and report broken references

4. **UnityProject Multi-File Support**
   - [x] Implement `UnityProject` struct
   - [x] Load multiple Unity files into project
   - [x] Build file ID → document index
   - [x] Cross-file reference resolution (GUID-based)

5. **Query Helpers**
   - [x] Find object by file ID
   - [x] Find objects by type
   - [x] Find objects by name
   - [x] Get component from GameObject
   - [x] Follow reference chains

6. **Testing**
   - [x] Test reference resolution within single file
   - [x] Test cross-file references (scene → prefab)
   - [x] Test broken reference handling
   - [x] Test circular reference detection

### Deliverables
- [x] ✓ All references within files resolved correctly
- [x] ✓ Type ID system working with common Unity types
- [x] ✓ UnityProject can load and query multiple files
- [x] ✓ Query API functional

### Success Criteria
- [x] Load entire PiratePanic/Scenes/ directory
- [x] Resolve all GameObject → Component references
- [x] Resolve prefab references from scenes
- [x] Find objects by name across entire project
- [x] Handle missing references gracefully

---

## Phase 4: Optimization & Robustness

**Goal**: Optimize performance and handle edge cases

### Tasks

1. **Performance Optimization**
   - [ ] Profile parsing performance on large files
   - [ ] Implement string interning for common property names
   - [ ] Optimize property access paths (cache lookups)
   - [ ] Consider zero-copy parsing where possible
   - [ ] Add lazy loading for large projects

2. **Memory Optimization**
   - [ ] Measure memory usage on large projects
   - [ ] Use Cow<str> where appropriate
   - [ ] Pool allocations for common types
   - [ ] Implement Drop for cleanup
   - [ ] Add memory usage benchmarks

3. **Parallel Processing**
   - [ ] Add optional rayon dependency
   - [ ] Parallel file loading
   - [ ] Parallel document parsing within files
   - [ ] Thread-safe caching

4. **Error Recovery**
   - [ ] Graceful degradation on parse errors
   - [ ] Partial file parsing (skip invalid documents)
   - [ ] Better error messages with context
   - [ ] Error recovery suggestions

5. **Edge Cases**
   - [ ] Handle very large files (>100MB scenes)
   - [ ] Handle deeply nested properties
   - [ ] Handle unusual property types
   - [ ] Handle legacy Unity versions (different YAML formats)
   - [ ] Handle corrupted files

6. **Comprehensive Testing**
   - [ ] Parse entire PiratePanic project
   - [ ] Parse various Unity project versions
   - [ ] Stress tests with large files
   - [ ] Fuzz testing setup (optional)
   - [ ] Property-based tests

### Deliverables
- [ ] ✓ Optimized parsing (<100ms for 10MB file)
- [ ] ✓ Low memory footprint (linear scaling)
- [ ] ✓ Parallel parsing support
- [ ] ✓ Robust error handling
- [ ] ✓ Comprehensive test suite

### Success Criteria
- [ ] Parse 10MB scene file in <100ms
- [ ] Parse entire PiratePanic project in <1s
- [ ] Memory usage < 2x file size
- [ ] 100% of PiratePanic files parse successfully
- [ ] No panics on malformed input

---

## Phase 5: API Polish & Documentation

**Goal**: Finalize public API and create excellent documentation

### Tasks

1. **API Review & Refinement**
   - [ ] Review all public APIs for consistency
   - [ ] Add convenience methods based on common use cases
   - [ ] Ensure ergonomic API design
   - [ ] Add builder patterns where appropriate
   - [ ] Minimize unsafe code, document when necessary

2. **Type Safety Improvements**
   - [ ] Add type-safe component access methods
   - [ ] Strongly-typed property getters
   - [ ] Generic query API improvements
   - [ ] Consider proc macros for component definitions (optional)

3. **Documentation**
   - [ ] Write comprehensive rustdoc for all public items
   - [ ] Add code examples to every public function
   - [ ] Create module-level documentation
   - [ ] Write getting started guide
   - [ ] Create cookbook with common tasks

4. **Examples**
   - [ ] Basic parsing example
   - [ ] Query API example
   - [ ] Reference resolution example
   - [ ] Multi-file project example
   - [ ] Performance tips example

5. **README & Guides**
   - [ ] Professional README.md
   - [ ] Architecture documentation
   - [ ] Contributing guide
   - [ ] Changelog template
   - [ ] License file (Apache 2.0 or MIT)

6. **CI/CD Setup**
   - [ ] GitHub Actions workflow
   - [ ] Run tests on PR
   - [ ] Clippy lints
   - [ ] Format checking
   - [ ] Code coverage reporting
   - [ ] Benchmark tracking

7. **Benchmarks**
   - [ ] Benchmark suite for common operations
   - [ ] Track performance over time
   - [ ] Document performance characteristics
   - [ ] Comparison with other parsers (if any exist)

### Deliverables
- [ ] ✓ Clean, documented public API
- [ ] ✓ Comprehensive rustdoc with examples
- [ ] ✓ README and getting started guide
- [ ] ✓ Working examples
- [ ] ✓ CI/CD pipeline

### Success Criteria
- [ ] Every public item has rustdoc
- [ ] At least 3 working examples
- [ ] CI passes on all commits
- [ ] README clearly explains usage
- [ ] Someone new can use the library from the docs alone

---

## Phase 6: Future Enhancements (Post-v1.0)

These are potential features for future versions:

### Advanced Querying
- [ ] XPath-like query language for Unity objects
- [ ] Filter DSL for complex searches
- [ ] Object graph traversal API
- [ ] Dependency analysis tools

### Write Support
- [ ] Modify Unity files programmatically
- [ ] Create new Unity objects
- [ ] Safe YAML serialization
- [ ] Preserve formatting and comments

### Additional Formats
- [ ] .meta file parsing
- [ ] TextMesh Pro asset files
- [ ] Unity package manifest parsing
- [ ] C# script analysis integration

### Tooling
- [ ] CLI tool built on the library
- [ ] Web service for Unity file analysis
- [ ] VS Code extension for Unity file viewing
- [ ] Unity Editor plugin for exporting metadata

### Performance
- [ ] Binary format support (legacy Unity)
- [ ] Streaming API for huge files
- [ ] Incremental parsing (watch mode)
- [ ] Serialization/deserialization optimizations

---

## Development Guidelines

### Code Quality
- [ ] Follow Rust API guidelines
- [ ] Use clippy with strict lints
- [ ] Maintain >80% test coverage
- [ ] No unsafe unless absolutely necessary
- [ ] All public APIs must be documented

### Testing Philosophy
- [ ] Unit test every parser component
- [ ] Integration tests for full workflows
- [ ] Use real Unity files from PiratePanic
- [ ] Add regression tests for bugs
- [ ] Benchmark critical paths

### Version Strategy
- [ ] Semantic versioning (SemVer)
- [ ] 0.x.x during development
- [ ] 1.0.0 when API is stable
- [ ] Changelog for all versions
- [ ] No breaking changes in minor versions after 1.0

### Dependencies
- [ ] Minimize dependency count
- [ ] Use well-maintained crates only
- [ ] Avoid nightly features
- [ ] Keep MSRV (Minimum Supported Rust Version) reasonable
- [ ] Document all feature flags

---

## Estimated Milestones

These are rough estimates for a single developer working part-time:

- [ ] **Phase 1**: 1-2 weeks
- [ ] **Phase 2**: 2-3 weeks
- [ ] **Phase 3**: 2-3 weeks
- [ ] **Phase 4**: 1-2 weeks
- [ ] **Phase 5**: 1-2 weeks

**Total: 7-12 weeks to v1.0**

Phases can overlap and tasks can be parallelized. Testing happens continuously throughout all phases.

---

## Getting Started

To begin implementation:

1. Start with Phase 1, Task 1 (Project Setup)
2. Work through tasks sequentially within each phase
3. Complete all deliverables before moving to the next phase
4. Use the PiratePanic sample project for testing throughout
5. Iterate based on what you learn from the Unity files

Remember: Start simple, make it work, then make it fast. Focus on correctness and API design in the early phases; optimization comes later.
SUMMARY.md: 15 changes
@@ -152,6 +152,7 @@ Typed accessors for Unity YAML patterns:
 - ✅ Separate code paths for scenes vs prefabs
 - ✅ Sparsey World creation with component registration
 - ✅ Entity spawning for GameObjects
+- ✅ Component linking (Transform parent and children) with callbacks in case the component hasn't been initialized yet

 ## ❌ What's Not Implemented

@@ -303,20 +304,6 @@ None currently! Code compiles cleanly in release mode.
### Phase 1: Complete Sparsey Integration (CRITICAL)
**Time Estimate:** 1-2 hours of research + 2-3 hours implementation

1. **Research Sparsey 0.13 API:**
   - Read docs at https://docs.rs/sparsey/0.13.3/
   - Look for examples in GitHub repo
   - Find component insertion and mutation APIs

2. **Fix Component Insertion:**
   - Implement `insert_component()` properly
   - Test with GameObject + Transform entities

3. **Fix Transform Hierarchy:**
   - Get mutable component access
   - Apply parent/children Entity references
   - Test with nested GameObjects

**Success Criteria:**
- Parse a .unity scene with nested GameObjects
- Verify Transform hierarchy is correctly resolved
TESTING.md (new file): 245 changes
@@ -0,0 +1,245 @@
# Testing Guide

This document describes how to test the Cursebreaker Unity Parser against real Unity projects.

## Integration Tests

The integration test suite can automatically clone Unity projects from GitHub and parse all their files, providing detailed statistics and error reporting.

### Requirements

- **Git**: Required for cloning test projects
- **Internet connection**: For cloning repositories (only needed on first run)
- **Disk space**: ~100-500 MB per project

### Running Tests

#### Basic Test (VR Horror Project)

This test clones and parses the VR Horror Unity project:

```bash
cargo test test_vr_horror_project -- --nocapture
```

Expected output:
```
============================================================
Testing: VR_Horror_YouCantRun
============================================================
Cloning VR_Horror_YouCantRun from https://github.com/Unity3D-Projects/VR_Horror_YouCantRun.git...
Finding Unity files in test_data/VR_Horror_YouCantRun...
Found 150 Unity files
Parsing files...
[1/150] Parsing: SampleScene.unity
[10/150] Parsing: Player.prefab
...

============================================================
Parsing Statistics
============================================================
Total files found: 150
Scenes parsed:     15
Prefabs parsed:    120
Assets parsed:     15
Total entities:    450
Total documents:   1200
Parse time:        250 ms

Success rate: 95.00%
============================================================
```

#### Detailed Parsing Test

This test shows detailed information about parsed files:

```bash
cargo test test_vr_horror_detailed -- --nocapture
```

This will:
- Parse a sample scene file and show entity information
- Parse a sample prefab file and test the instantiation system
- Test the override system
- Display component type distributions

#### All Projects (Including Ignored Tests)

```bash
cargo test --test integration_tests -- --nocapture --ignored
```

This runs tests for additional projects like PiratePanic (ignored by default because they're large).

#### Performance Benchmark

```bash
cargo test benchmark_parsing -- --nocapture --ignored
```

This measures parsing performance and provides metrics like:
- Files per second
- KB per second
- Average time per file

### Available Test Projects

| Project | Description | Files | Size |
|---------|-------------|-------|------|
| **VR_Horror_YouCantRun** | VR horror game with complex scenes | ~150 | ~50MB |
| **PiratePanic** | Unity Technologies sample project | ~300 | ~200MB |

### Test Data Location

Cloned projects are stored in `test_data/` (gitignored):
```
test_data/
├── VR_Horror_YouCantRun/
│   └── Assets/
│       ├── Scenes/
│       ├── Prefabs/
│       └── ...
└── PiratePanic/
    └── Assets/
        └── ...
```

Projects are cloned only once and reused for subsequent test runs. Delete `test_data/` to force a fresh clone.
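The clone-once behavior can be sketched as a simple existence check. This is an illustrative sketch only; the function name `ensure_cloned` and its signature are assumptions, not the actual helpers in `tests/integration_tests.rs`:

```rust
use std::path::Path;
use std::process::Command;

/// Clone `repo_url` into `dest` unless `dest` already exists.
/// Returns Ok(true) if a clone was performed, Ok(false) if reused.
fn ensure_cloned(repo_url: &str, dest: &str) -> std::io::Result<bool> {
    if Path::new(dest).exists() {
        // Reuse the existing checkout; no network access needed.
        return Ok(false);
    }
    let status = Command::new("git").args(["clone", repo_url, dest]).status()?;
    Ok(status.success())
}

fn main() -> std::io::Result<()> {
    // "." always exists, so this takes the reuse path and never hits the network.
    let cloned = ensure_cloned("https://example.invalid/repo.git", ".")?;
    println!("cloned: {}", cloned);
    Ok(())
}
```

Running `git` through `std::process::Command` keeps the harness dependency-free; a `git2` crate binding would avoid shelling out but adds a build dependency.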
### Understanding Test Output

#### Success Rate
- **>95%**: Excellent - parser handles almost all files
- **80-95%**: Good - some edge cases not handled
- **<80%**: Needs investigation - may indicate parser issues
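The banding can be computed directly from the parse counts; this small helper is illustrative (not part of the test harness):

```rust
/// Map a parse success rate onto the bands described above.
fn rate_band(parsed_ok: usize, total: usize) -> &'static str {
    assert!(total > 0, "total must be non-zero");
    let rate = parsed_ok as f64 / total as f64 * 100.0;
    if rate > 95.0 {
        "excellent"
    } else if rate >= 80.0 {
        "good"
    } else {
        "needs investigation"
    }
}

fn main() {
    // 143 of 150 files is about 95.33%, just above the 95% threshold.
    println!("{}", rate_band(143, 150));
}
```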
#### Common Error Types
- **Missing Header**: File doesn't have the Unity YAML header
- **Invalid Type Tag**: Unknown Unity type ID
- **YAML Parsing Error**: Malformed YAML structure

#### Statistics
- **Total entities**: Number of GameObjects in scenes
- **Total documents**: Number of YAML documents in prefabs/assets
- **Parse time**: Total time to parse all files (lower is better)

### Adding New Test Projects

To add a new Unity project to test:

1. Edit `tests/integration_tests.rs`
2. Add a new project configuration:
```rust
const MY_PROJECT: TestProject = TestProject {
    name: "MyProject",
    repo_url: "https://github.com/user/MyProject.git",
    branch: None, // or Some("main")
};
```

3. Add a test function:
```rust
#[test]
#[ignore] // Optional: ignore by default for large projects
fn test_my_project() {
    test_project(&MY_PROJECT);
}
```

4. Run the test:
```bash
cargo test test_my_project -- --nocapture --ignored
```

### Continuous Integration

For CI/CD pipelines:

```bash
# Quick smoke test (doesn't require git)
cargo test --lib

# Full integration tests (requires git)
cargo test --test integration_tests -- --nocapture
```

To skip integration tests in CI environments without git:
```bash
cargo test --lib --bins
```

### Troubleshooting

#### "Git clone failed"
- Ensure git is installed: `git --version`
- Check internet connection
- Verify the repository URL is accessible

#### "Skipping test: file not found"
- The test project hasn't been cloned yet
- Run the test again with `--nocapture` to see clone progress
- Check that the `test_data/` directory was created

#### High error rate
- Check error details in the test output
- Some Unity files may use unsupported features
- An error rate below 20% is generally acceptable for parsing stress tests

#### Out of disk space
- Delete `test_data/` to free up space
- Run tests for individual projects instead of all at once

### Development Workflow

When adding new parser features:

1. Run integration tests to establish a baseline:
```bash
cargo test test_vr_horror_project -- --nocapture > baseline.txt
```

2. Make your changes

3. Re-run tests and compare:
```bash
cargo test test_vr_horror_project -- --nocapture > after_changes.txt
diff baseline.txt after_changes.txt
```

4. Verify:
   - Success rate didn't decrease
   - No new error types were introduced
   - Parse time didn't significantly increase

### Performance Targets

- **Parse time**: <2ms per file on average
- **Memory usage**: <100MB for 1000 files
- **Success rate**: >90% for well-formed Unity projects

### Example: Testing Prefab Instancing

The detailed test demonstrates the prefab instancing system:

```bash
cargo test test_vr_horror_detailed -- --nocapture
```

Look for output like:
```
Testing prefab instantiation:
  ✓ Created instance with 45 remapped FileIDs
  ✓ Override system working
  - Component types:
    - GameObject: 1
    - Transform: 1
    - RectTransform: 3
    - Canvas: 1
```

This confirms that:
1. Prefabs can be instantiated
2. FileIDs are properly remapped
3. The override system works
4. All component types are recognized
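The FileID remapping that this output confirms boils down to an old-to-new id map: every document in the cloned prefab gets a fresh unique FileID, and internal `{fileID: N}` references are rewritten through the same map. The sketch below is illustrative; the `FileID` alias and `remap_file_ids` name are assumptions, not the crate's actual API:

```rust
use std::collections::HashMap;

type FileID = i64;

/// Assign each old FileID a fresh one and record the old -> new mapping,
/// so internal references in the cloned documents can be rewritten.
fn remap_file_ids(old_ids: &[FileID], next_id: &mut FileID) -> HashMap<FileID, FileID> {
    let mut map = HashMap::new();
    for &old in old_ids {
        *next_id += 1;
        map.insert(old, *next_id);
    }
    map
}

fn main() {
    let mut next_id: FileID = 1000;
    let map = remap_file_ids(&[11, 22, 33], &mut next_id);
    println!("Created instance with {} remapped FileIDs", map.len());
}
```

Threading `next_id` through as mutable state is what keeps FileIDs unique when several prefab instances are spawned into the same scene.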
@@ -15,47 +15,60 @@ fn main() {
// Parse the file
match UnityFile::from_path(prefab_path) {
Ok(file) => {
println!("Successfully parsed: {:?}", file.path.file_name().unwrap());
println!("Found {} documents\n", file.documents.len());
println!("Successfully parsed: {:?}", file.path().file_name().unwrap());

// List all documents
for (i, doc) in file.documents.iter().enumerate() {
println!("Document {}: {} (Type ID: {}, File ID: {})",
i + 1,
doc.class_name,
doc.type_id,
doc.file_id
);
}
// Handle the different file types
match file {
UnityFile::Prefab(prefab) => {
println!("Found {} documents\n", prefab.documents.len());

println!();
// List all documents
for (i, doc) in prefab.documents.iter().enumerate() {
println!("Document {}: {} (Type ID: {}, File ID: {})",
i + 1,
doc.class_name,
doc.type_id,
doc.file_id
);
}

// Find all GameObjects
let game_objects = file.get_documents_by_class("GameObject");
println!("Found {} GameObjects:", game_objects.len());
for go in game_objects {
if let Some(go_props) = go.get("GameObject") {
if let Some(props) = go_props.as_object() {
if let Some(name) = props.get("m_Name").and_then(|v| v.as_str()) {
println!("  - {}", name);
println!();

// Find all GameObjects
let game_objects = prefab.get_documents_by_class("GameObject");
println!("Found {} GameObjects:", game_objects.len());
for go in game_objects {
if let Some(mapping) = go.as_mapping() {
if let Some(go_obj) = mapping.get("GameObject") {
if let Some(props) = go_obj.as_mapping() {
if let Some(name) = props.get("m_Name").and_then(|v| v.as_str()) {
println!("  - {}", name);
}
}
}
}
}

println!();

// Find all Transforms
let transforms = prefab.get_documents_by_type(224); // RectTransform type ID
println!("Found {} RectTransforms", transforms.len());

// Look up a specific document by file ID
if let Some(first_doc) = prefab.documents.first() {
let file_id = first_doc.file_id;
if let Some(found) = prefab.get_document(file_id) {
println!("\nLooking up document by file ID {}:", file_id);
println!("  Class: {}", found.class_name);
}
}
}
}

println!();

// Find all Transforms
let transforms = file.get_documents_by_type(224); // RectTransform type ID
println!("Found {} RectTransforms", transforms.len());

// Look up a specific document by file ID
if let Some(first_doc) = file.documents.first() {
let file_id = first_doc.file_id;
if let Some(found) = file.get_document(file_id) {
println!("\nLooking up document by file ID {}:", file_id);
println!("  Class: {}", found.class_name);
println!("  Properties: {} keys", found.properties.len());
UnityFile::Scene(scene) => {
println!("This is a scene file with {} entities", scene.entity_map.len());
}
UnityFile::Asset(asset) => {
println!("This is an asset file with {} documents", asset.documents.len());
}
}
}
@@ -2,8 +2,8 @@

use crate::model::RawDocument;
use crate::types::{
    yaml_helpers, ComponentContext, FileID, GameObject, LinkingContext, RectTransform, Transform,
    UnityComponent,
    yaml_helpers, ComponentContext, FileID, GameObject, LinkingContext, PrefabInstanceComponent,
    RectTransform, Transform, UnityComponent,
};
use crate::{Error, Result};
use sparsey::{Entity, World};
@@ -30,6 +30,7 @@ pub fn build_world_from_documents(
        .register::<GameObject>()
        .register::<Transform>()
        .register::<RectTransform>()
        .register::<PrefabInstanceComponent>()
        .build();

    let linking_ctx = RefCell::new(LinkingContext::new());
@@ -51,6 +52,57 @@
    Ok((world, entity_map))
}

/// Build entities from raw Unity documents into an existing world
///
/// This is similar to `build_world_from_documents` but spawns into an existing
/// world instead of creating a new one. This is used for prefab instantiation.
///
/// Uses the same 3-pass approach:
/// 1. Create entities for all GameObjects
/// 2. Attach components (Transform, RectTransform, etc.) to entities
/// 3. Resolve Transform hierarchy (parent/children Entity references)
///
/// # Arguments
/// - `documents`: Parsed Unity documents to build entities from
/// - `world`: Existing Sparsey ECS world to spawn entities into
/// - `entity_map`: Existing entity map to merge new mappings into
///
/// # Returns
/// Vec of newly spawned entities
pub fn build_world_from_documents_into(
    documents: Vec<RawDocument>,
    world: &mut World,
    entity_map: &mut HashMap<FileID, Entity>,
) -> Result<Vec<Entity>> {
    let linking_ctx = RefCell::new(LinkingContext::new());

    // Initialize linking context with existing entity_map
    // This allows cross-references between prefab instances and scene entities
    *linking_ctx.borrow_mut().entity_map_mut() = entity_map.clone();

    let mut spawned_entities = Vec::new();

    // PASS 1: Create entities for all GameObjects
    for doc in documents.iter().filter(|d| d.type_id == 1 || d.class_name == "GameObject") {
        let entity = spawn_game_object(world, doc)?;
        linking_ctx.borrow_mut().entity_map_mut().insert(doc.file_id, entity);
        spawned_entities.push(entity);
    }

    // PASS 2: Attach components to entities
    for doc in documents.iter().filter(|d| d.type_id != 1 && d.class_name != "GameObject") {
        attach_component(world, doc, &linking_ctx)?;
    }

    // PASS 3: Execute all deferred linking callbacks
    let final_entity_map = linking_ctx.into_inner().execute_callbacks(world);

    // Update caller's entity_map with new mappings
    entity_map.extend(final_entity_map);

    Ok(spawned_entities)
}

/// Spawn a GameObject entity
fn spawn_game_object(world: &mut World, doc: &RawDocument) -> Result<Entity> {
    let yaml = doc
@@ -130,6 +182,13 @@ fn attach_component(
|
||||
linking_ctx.borrow_mut().entity_map_mut().insert(doc.file_id, entity);
|
||||
}
|
||||
}
|
||||
"PrefabInstance" => {
|
||||
// Parse and store nested prefab reference
|
||||
if let Some(prefab_comp) = PrefabInstanceComponent::parse(yaml, &ctx) {
|
||||
world.insert(entity, (prefab_comp,));
|
||||
linking_ctx.borrow_mut().entity_map_mut().insert(doc.file_id, entity);
|
||||
}
|
||||
}
|
||||
_ => {
|
||||
// Unknown component type - skip with warning
|
||||
eprintln!(
|
||||
|
||||
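The hunk above adds a `"PrefabInstance"` arm to `attach_component`'s match over class names. The skip-with-warning shape of that dispatch, reduced to a standalone sketch (the class list here is illustrative, not the crate's full registry):

```rust
/// Dispatch on a Unity class name: each known class gets a match arm,
/// and unknown types are skipped with a warning instead of aborting
/// the whole load.
fn component_kind(class_name: &str) -> Option<&'static str> {
    match class_name {
        "Transform" => Some("transform"),
        "RectTransform" => Some("rect transform"),
        "PrefabInstance" => Some("nested prefab reference"),
        other => {
            eprintln!("Unknown component type, skipping: {}", other);
            None
        }
    }
}
```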
@@ -6,7 +6,7 @@

 mod builder;

-pub use builder::build_world_from_documents;
+pub use builder::{build_world_from_documents, build_world_from_documents_into};

 // TODO: Add project-level world building once UnityProject is updated to work with new architecture
 // pub use builder::build_world_from_project;
@@ -43,6 +43,7 @@ pub use parser::{meta::MetaFile, parse_unity_file};
 pub use property::PropertyValue;
 pub use types::{
     get_class_name, get_type_id, Color, ComponentContext, ExternalRef, FileID, FileRef,
-    GameObject, LocalID, Quaternion, RectTransform, Transform, UnityComponent, UnityReference,
-    Vector2, Vector3, yaml_helpers,
+    GameObject, LocalID, PrefabInstance, PrefabInstanceComponent, PrefabModification,
+    PrefabResolver, Quaternion, RectTransform, Transform, UnityComponent, UnityReference, Vector2,
+    Vector3, yaml_helpers,
 };
@@ -123,6 +123,24 @@ impl UnityPrefab {
             .filter(|doc| doc.class_name == class_name)
             .collect()
     }
+
+    /// Create a new instance of this prefab for spawning into a scene
+    ///
+    /// This clones the prefab's documents and prepares them for instantiation
+    /// with unique FileIDs to avoid collisions.
+    ///
+    /// # Returns
+    /// A `PrefabInstance` that can be customized with overrides and spawned
+    ///
+    /// # Example
+    /// ```ignore
+    /// let mut instance = prefab.instantiate();
+    /// instance.override_value(file_id, "m_Name", "Player1".into())?;
+    /// let entities = instance.spawn_into(&mut world, &mut entity_map)?;
+    /// ```
+    pub fn instantiate(&self) -> crate::types::PrefabInstance {
+        crate::types::PrefabInstance::new(self)
+    }
 }

 /// A Unity asset file with raw YAML
@@ -7,6 +7,7 @@
 mod component;
 mod game_object;
 mod ids;
+mod prefab_instance;
 mod reference;
 mod transform;
 mod type_registry;
@@ -15,6 +16,9 @@ mod values;
 pub use component::{yaml_helpers, ComponentContext, LinkCallback, LinkingContext, UnityComponent};
 pub use game_object::GameObject;
 pub use ids::{FileID, LocalID};
+pub use prefab_instance::{
+    PrefabInstance, PrefabInstanceComponent, PrefabModification, PrefabResolver,
+};
 pub use reference::UnityReference;
 pub use transform::{RectTransform, Transform};
 pub use type_registry::{get_class_name, get_type_id};
src/types/prefab_instance.rs (new file, 684 lines)
@@ -0,0 +1,684 @@
//! Prefab instancing system for cloning and spawning Unity prefabs

use crate::model::{RawDocument, UnityPrefab};
use crate::types::{yaml_helpers, ComponentContext, ExternalRef, FileID, UnityComponent};
use crate::{Error, Result};
use serde_yaml::{Mapping, Value};
use sparsey::{Entity, World};
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;

/// An instance of a Unity prefab ready for spawning into a scene
///
/// PrefabInstance represents a cloned prefab with unique FileIDs to avoid
/// collisions when spawning multiple instances. It supports:
/// - FileID remapping to ensure uniqueness
/// - YAML value overrides before spawning
/// - Spawning into existing ECS worlds
///
/// # Example
/// ```ignore
/// let prefab = /* load UnityPrefab */;
/// let mut instance = prefab.instantiate();
/// instance.override_value(file_id, "m_Name", "Player1".into())?;
/// instance.override_value(file_id, "m_LocalPosition.x", 100.0.into())?;
/// let entities = instance.spawn_into(&mut world, &mut entity_map)?;
/// ```
#[derive(Debug, Clone)]
pub struct PrefabInstance {
    /// Cloned YAML documents from the source prefab
    documents: Vec<RawDocument>,

    /// FileID remapping table: original FileID → new FileID
    /// This ensures no collisions when spawning into existing worlds
    file_id_map: HashMap<FileID, FileID>,

    /// Overrides to apply before spawning
    /// Maps (original FileID, YAML path) → new value
    overrides: HashMap<(FileID, String), Value>,

    /// Sequential counter for generating new FileIDs
    /// Starts at i64::MAX and decrements to avoid collisions with scene FileIDs
    next_file_id: i64,

    /// Source prefab path for debugging
    source_path: PathBuf,
}

impl PrefabInstance {
    /// Create a new instance from a Unity prefab
    ///
    /// This clones all documents from the prefab and initializes FileID remapping.
    pub fn new(prefab: &UnityPrefab) -> Self {
        // Clone all documents from the prefab
        let documents = prefab.documents.clone();

        let mut instance = Self {
            documents,
            file_id_map: HashMap::new(),
            overrides: HashMap::new(),
            next_file_id: i64::MAX,
            source_path: prefab.path.clone(),
        };

        // Initialize FileID mapping and remap all references
        instance.initialize_file_id_mapping();
        instance.remap_yaml_file_refs();

        instance
    }
    /// Generate a new unique FileID
    ///
    /// Uses a sequential counter starting from i64::MAX and decrementing.
    /// This avoids collisions with typical scene FileIDs, which are positive.
    fn generate_file_id(&mut self) -> FileID {
        let id = self.next_file_id;
        self.next_file_id -= 1;
        FileID::from_i64(id)
    }

    /// Initialize FileID remapping for all documents
    ///
    /// Creates a mapping from original FileID → new unique FileID for each document.
    fn initialize_file_id_mapping(&mut self) {
        // Collect original IDs first to avoid borrowing conflicts
        let original_ids: Vec<FileID> = self.documents.iter().map(|doc| doc.file_id).collect();

        for original_id in original_ids {
            let new_id = self.generate_file_id();
            self.file_id_map.insert(original_id, new_id);
        }
    }

    /// Remap all FileID references in YAML documents
    ///
    /// This recursively traverses all YAML values and replaces FileID references
    /// with their remapped values from `file_id_map`.
    fn remap_yaml_file_refs(&mut self) {
        // First, update each document's own file_id
        for doc in &mut self.documents {
            if let Some(&new_id) = self.file_id_map.get(&doc.file_id) {
                doc.file_id = new_id;
            }
        }

        // Clone the map to avoid borrow conflicts
        let file_id_map = self.file_id_map.clone();

        // Then, remap all FileRef references in the YAML
        for doc in &mut self.documents {
            Self::remap_value(&mut doc.yaml, &file_id_map);
        }
    }

    /// Recursively traverse YAML and remap FileID references
    ///
    /// Looks for patterns like `{fileID: 12345}` and replaces the FileID
    /// with the remapped value from `file_id_map`.
    fn remap_value(value: &mut Value, file_id_map: &HashMap<FileID, FileID>) {
        match value {
            Value::Mapping(map) => {
                // Check if this is a FileRef: {fileID: N}
                if let Some(file_id_value) = map.get(&Value::String("fileID".to_string())) {
                    if let Some(num) = file_id_value.as_i64() {
                        let original = FileID::from_i64(num);

                        // Remap if it's a local reference (not 0, not external)
                        if num != 0 {
                            if let Some(&new_id) = file_id_map.get(&original) {
                                map.insert(
                                    Value::String("fileID".to_string()),
                                    Value::Number(new_id.as_i64().into()),
                                );
                            }
                        }
                    }
                }

                // Recursively process all values in the mapping
                for (_, v) in map.iter_mut() {
                    Self::remap_value(v, file_id_map);
                }
            }
            Value::Sequence(seq) => {
                // Recursively process array elements
                for item in seq.iter_mut() {
                    Self::remap_value(item, file_id_map);
                }
            }
            _ => {
                // Scalars (strings, numbers, bools, null) don't need remapping
            }
        }
    }
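The recursive remap walks `serde_yaml::Value`; the same traversal shape can be shown with a minimal stand-in enum so it runs without the crate. `Val` below is an assumption for illustration, not the real YAML type:

```rust
use std::collections::HashMap;

/// Minimal stand-in for a YAML value tree.
#[derive(Debug, PartialEq)]
enum Val {
    Int(i64),
    Map(Vec<(String, Val)>),
    Seq(Vec<Val>),
}

/// Recursively rewrite `fileID` entries, leaving 0 (the null reference)
/// and IDs absent from the map untouched -- the same shape as `remap_value`.
fn remap(value: &mut Val, map: &HashMap<i64, i64>) {
    match value {
        Val::Map(entries) => {
            for (key, v) in entries.iter_mut() {
                if key == "fileID" {
                    if let Val::Int(n) = v {
                        if *n != 0 {
                            if let Some(&new_id) = map.get(n) {
                                *v = Val::Int(new_id);
                            }
                        }
                    }
                } else {
                    remap(v, map);
                }
            }
        }
        Val::Seq(items) => {
            for item in items.iter_mut() {
                remap(item, map);
            }
        }
        Val::Int(_) => {}
    }
}
```

A local `{fileID: 42}` reference gets rewritten while `{fileID: 0}` is preserved, mirroring how external and null references survive instancing.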
    /// Override a YAML value before spawning
    ///
    /// This allows modifying prefab data before instantiation. The override
    /// is applied to the document with the given FileID at the specified YAML path.
    ///
    /// # Arguments
    /// * `file_id` - The original FileID (before remapping) of the document to modify
    /// * `path` - Dot-notation path to the value (e.g., "m_LocalPosition.x")
    /// * `value` - The new value to set
    ///
    /// # Example
    /// ```ignore
    /// instance.override_value(file_id, "m_Name", "Player1".into())?;
    /// instance.override_value(file_id, "m_LocalPosition.x", 100.0.into())?;
    /// ```
    pub fn override_value(
        &mut self,
        file_id: FileID,
        path: &str,
        value: Value,
    ) -> Result<()> {
        // Store override to be applied during spawn
        // Note: We store using the original FileID for easier API
        self.overrides.insert((file_id, path.to_string()), value);
        Ok(())
    }

    /// Apply all stored overrides to the documents
    ///
    /// This is called internally before spawning. It navigates to each
    /// override path and sets the new value.
    fn apply_overrides(&mut self) -> Result<()> {
        for ((file_id, path), value) in &self.overrides {
            // Find the document with this FileID (after remapping)
            let remapped_id = self
                .file_id_map
                .get(file_id)
                .ok_or_else(|| Error::reference_error(format!("FileID not found: {}", file_id)))?;

            let doc = self
                .documents
                .iter_mut()
                .find(|d| d.file_id == *remapped_id)
                .ok_or_else(|| {
                    Error::reference_error(format!("Document not found: {}", remapped_id))
                })?;

            // Navigate to the path and set the value
            Self::set_yaml_value(&mut doc.yaml, path, value.clone())?;
        }

        Ok(())
    }

    /// Navigate a YAML path and set a value
    ///
    /// This parses the dot-notation path and navigates through the YAML
    /// structure to set the value at the target location.
    fn set_yaml_value(yaml: &mut Value, path: &str, new_value: Value) -> Result<()> {
        let segments = parse_yaml_path(path);
        let mut current = yaml;

        // Navigate to the parent of the target
        for segment in &segments[..segments.len() - 1] {
            current = match segment {
                PathSegment::Field(field) => current
                    .as_mapping_mut()
                    .ok_or_else(|| Error::InvalidPropertyPath(path.to_string()))?
                    .get_mut(&Value::String(field.clone()))
                    .ok_or_else(|| Error::PropertyNotFound(field.clone()))?,
                PathSegment::ArrayIndex { field, index } => {
                    let mapping = current
                        .as_mapping_mut()
                        .ok_or_else(|| Error::InvalidPropertyPath(path.to_string()))?;
                    let array = mapping
                        .get_mut(&Value::String(field.clone()))
                        .and_then(|v| v.as_sequence_mut())
                        .ok_or_else(|| Error::PropertyNotFound(field.clone()))?;
                    array.get_mut(*index).ok_or_else(|| {
                        Error::InvalidPropertyPath(format!("{}[{}]", field, index))
                    })?
                }
            };
        }

        // Set the final value
        match segments.last().unwrap() {
            PathSegment::Field(field) => {
                let mapping = current
                    .as_mapping_mut()
                    .ok_or_else(|| Error::InvalidPropertyPath(path.to_string()))?;
                mapping.insert(Value::String(field.clone()), new_value);
            }
            PathSegment::ArrayIndex { field, index } => {
                let mapping = current
                    .as_mapping_mut()
                    .ok_or_else(|| Error::InvalidPropertyPath(path.to_string()))?;
                let array = mapping
                    .get_mut(&Value::String(field.clone()))
                    .and_then(|v| v.as_sequence_mut())
                    .ok_or_else(|| Error::PropertyNotFound(field.clone()))?;
                if *index < array.len() {
                    array[*index] = new_value;
                } else {
                    return Err(Error::InvalidPropertyPath(format!(
                        "{}[{}] out of bounds",
                        field, index
                    )));
                }
            }
        }

        Ok(())
    }
    /// Spawn this prefab instance into an existing world
    ///
    /// This applies any overrides and then uses the ECS builder to create
    /// entities and components in the target world.
    ///
    /// # Arguments
    /// * `world` - The Sparsey ECS world to spawn entities into
    /// * `entity_map` - HashMap to track FileID → Entity mappings
    ///
    /// # Returns
    /// Vec of newly created entities
    ///
    /// # Example
    /// ```ignore
    /// let entities = instance.spawn_into(&mut scene.world, &mut scene.entity_map)?;
    /// println!("Spawned {} entities", entities.len());
    /// ```
    pub fn spawn_into(
        mut self,
        world: &mut World,
        entity_map: &mut HashMap<FileID, Entity>,
    ) -> Result<Vec<Entity>> {
        // Apply overrides before spawning
        self.apply_overrides()?;

        // Spawn into existing world using the builder
        crate::ecs::build_world_from_documents_into(self.documents, world, entity_map)
    }

    /// Get the source prefab path (for debugging)
    pub fn source_path(&self) -> &PathBuf {
        &self.source_path
    }

    /// Get the FileID mapping table (for debugging)
    pub fn file_id_map(&self) -> &HashMap<FileID, FileID> {
        &self.file_id_map
    }
}
/// Unity component representing a reference to another prefab (nested prefab)
///
/// This component appears in prefabs that contain instances of other prefabs.
/// It stores the GUID of the referenced prefab and any modifications applied.
#[derive(Debug, Clone)]
pub struct PrefabInstanceComponent {
    /// External reference to the source prefab (by GUID)
    pub prefab_ref: ExternalRef,

    /// Modifications applied to the nested prefab
    pub modifications: Vec<PrefabModification>,
}

impl UnityComponent for PrefabInstanceComponent {
    fn parse(yaml: &Mapping, _ctx: &ComponentContext) -> Option<Self> {
        // Extract m_SourcePrefab (external GUID reference)
        let prefab_ref = yaml_helpers::get_external_ref(yaml, "m_SourcePrefab")?;

        // Extract m_Modification array (if any)
        let modifications = parse_modifications(yaml).unwrap_or_default();

        Some(Self {
            prefab_ref,
            modifications,
        })
    }
}

/// A modification applied to a nested prefab
///
/// Unity stores modifications as changes to specific properties of objects
/// within the nested prefab.
#[derive(Debug, Clone)]
pub struct PrefabModification {
    /// The FileID of the target object within the nested prefab
    pub target_file_id: FileID,

    /// The property path to modify (dot notation)
    pub property_path: String,

    /// The new value to apply
    pub value: Value,
}

/// Parse the modifications array from Unity YAML
fn parse_modifications(yaml: &Mapping) -> Option<Vec<PrefabModification>> {
    let mods_array = yaml
        .get(&Value::String("m_Modification".to_string()))
        .and_then(|v| v.as_sequence())?;

    let mut mods = Vec::new();
    for mod_yaml in mods_array {
        if let Some(mod_map) = mod_yaml.as_mapping() {
            // Parse target FileID, property path, and value.
            // Unity format: {target: {fileID: N}, propertyPath: "m_Name", value: "NewName"}
            if let Some(modification) = parse_single_modification(mod_map) {
                mods.push(modification);
            }
        }
    }

    Some(mods)
}

/// Parse a single modification entry
fn parse_single_modification(yaml: &Mapping) -> Option<PrefabModification> {
    // Get target FileID
    let target = yaml
        .get(&Value::String("target".to_string()))
        .and_then(|v| v.as_mapping())?;
    let target_file_id = yaml_helpers::get_file_ref_from_mapping(target)?.file_id;

    // Get property path
    let property_path = yaml
        .get(&Value::String("propertyPath".to_string()))
        .and_then(|v| v.as_str())?
        .to_string();

    // Get value
    let value = yaml.get(&Value::String("value".to_string()))?.clone();

    Some(PrefabModification {
        target_file_id,
        property_path,
        value,
    })
}
/// Resolver for loading and recursively instantiating prefabs
///
/// PrefabResolver handles:
/// - Loading prefabs by GUID
/// - Caching loaded prefabs
/// - Detecting circular prefab references
/// - Recursively instantiating nested prefabs
pub struct PrefabResolver {
    /// Cache of loaded prefabs (GUID → Prefab)
    prefab_cache: HashMap<String, Arc<UnityPrefab>>,

    /// Mapping from GUID to file path
    guid_to_path: HashMap<String, PathBuf>,

    /// Stack of GUIDs currently being instantiated (for cycle detection)
    instantiation_stack: Vec<String>,
}

impl PrefabResolver {
    /// Create a new PrefabResolver
    ///
    /// # Arguments
    /// * `guid_to_path` - Mapping from asset GUID to file path
    pub fn new(guid_to_path: HashMap<String, PathBuf>) -> Self {
        Self {
            prefab_cache: HashMap::new(),
            guid_to_path,
            instantiation_stack: Vec::new(),
        }
    }

    /// Recursively instantiate a prefab and its nested prefabs
    ///
    /// This handles:
    /// 1. Checking for circular references
    /// 2. Creating a prefab instance
    /// 3. Finding any nested prefab references
    /// 4. Recursively instantiating nested prefabs
    /// 5. Spawning the prefab's entities into the world
    ///
    /// # Arguments
    /// * `prefab` - The prefab to instantiate
    /// * `world` - The ECS world to spawn entities into
    /// * `entity_map` - Entity mapping to update
    ///
    /// # Returns
    /// Vec of spawned entities
    pub fn instantiate_recursive(
        &mut self,
        prefab: &UnityPrefab,
        world: &mut World,
        entity_map: &mut HashMap<FileID, Entity>,
    ) -> Result<Vec<Entity>> {
        // For this implementation, we use the path as the identifier.
        // A full implementation would extract the GUID from .meta files.
        let prefab_id = prefab.path.to_string_lossy().to_string();

        // Check for circular references
        if self.instantiation_stack.contains(&prefab_id) {
            return Err(Error::circular_reference());
        }

        // Push onto the stack
        self.instantiation_stack.push(prefab_id.clone());

        // Create instance
        let instance = prefab.instantiate();

        // Find nested prefab references
        let nested_prefabs = self.find_nested_prefabs(&instance)?;

        // For each nested prefab, recursively instantiate it.
        // (This is a simplified version - a full implementation would need to
        // properly link nested entities to parent GameObjects.)
        for (_parent_file_id, nested_component) in nested_prefabs {
            // Load the referenced prefab
            if let Ok(nested_prefab) = self.load_prefab(&nested_component.prefab_ref.guid) {
                // Apply modifications
                let mut nested_instance = nested_prefab.instantiate();
                for modification in &nested_component.modifications {
                    nested_instance.override_value(
                        modification.target_file_id,
                        &modification.property_path,
                        modification.value.clone(),
                    )?;
                }

                // Recursively spawn nested prefab
                self.instantiate_recursive(&nested_prefab, world, entity_map)?;
            }
        }

        // Spawn this prefab's entities
        let spawned = instance.spawn_into(world, entity_map)?;

        // Pop from stack
        self.instantiation_stack.pop();

        Ok(spawned)
    }

    /// Find all nested prefab references in an instance
    fn find_nested_prefabs(
        &self,
        instance: &PrefabInstance,
    ) -> Result<Vec<(FileID, PrefabInstanceComponent)>> {
        let mut nested = Vec::new();

        for doc in &instance.documents {
            if doc.class_name == "PrefabInstance" {
                if let Some(mapping) = doc.as_mapping() {
                    // Create a minimal context for parsing
                    let ctx = ComponentContext {
                        type_id: doc.type_id,
                        file_id: doc.file_id,
                        class_name: &doc.class_name,
                        entity: None,
                        linking_ctx: None,
                        yaml: mapping,
                    };

                    if let Some(component) = PrefabInstanceComponent::parse(mapping, &ctx) {
                        nested.push((doc.file_id, component));
                    }
                }
            }
        }

        Ok(nested)
    }

    /// Load a prefab by GUID
    fn load_prefab(&mut self, guid: &str) -> Result<Arc<UnityPrefab>> {
        // Check cache first
        if let Some(prefab) = self.prefab_cache.get(guid) {
            return Ok(prefab.clone());
        }

        // Resolve GUID to path
        let path = self
            .guid_to_path
            .get(guid)
            .ok_or_else(|| Error::guid_resolution_error(format!("GUID not found: {}", guid)))?
            .clone();

        // Load prefab
        let unity_file = crate::model::UnityFile::from_path(&path)?;
        let prefab = match unity_file {
            crate::model::UnityFile::Prefab(p) => Arc::new(p),
            _ => {
                return Err(Error::invalid_format(
                    "Expected prefab file for GUID resolution",
                ))
            }
        };

        // Cache for future use
        self.prefab_cache.insert(guid.to_string(), prefab.clone());

        Ok(prefab)
    }
}
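The stack-based cycle check in `instantiate_recursive` can be isolated with plain stdlib types. The `children` map below is a hypothetical stand-in for "prefab references nested prefab":

```rust
use std::collections::HashMap;

/// Walks a dependency graph depth-first, keeping the current path on a
/// stack; revisiting an id already on the stack means a circular
/// reference -- the same invariant `instantiation_stack` enforces.
fn has_cycle(
    id: &str,
    children: &HashMap<&str, Vec<&str>>,
    stack: &mut Vec<String>,
) -> bool {
    if stack.iter().any(|s| s == id) {
        return true;
    }
    stack.push(id.to_string());
    let cyclic = children
        .get(id)
        .map(|kids| kids.iter().any(|kid| has_cycle(kid, children, stack)))
        .unwrap_or(false);
    stack.pop();
    cyclic
}
```

Note that a diamond (two prefabs both nesting a third) is not a cycle: only an id already on the current path triggers the error, which is why a stack is used rather than a visited set.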
/// A segment of a YAML property path
#[derive(Debug, Clone, PartialEq)]
enum PathSegment {
    /// A simple field access (e.g., "m_Name")
    Field(String),
    /// An array element access (e.g., "m_Component[0]")
    ArrayIndex { field: String, index: usize },
}

/// Parse a YAML path into segments
///
/// Supports dot notation and array indices:
/// - "m_Name" → [Field("m_Name")]
/// - "m_LocalPosition.x" → [Field("m_LocalPosition"), Field("x")]
/// - "m_Component[0].fileID" → [ArrayIndex{field: "m_Component", index: 0}, Field("fileID")]
fn parse_yaml_path(path: &str) -> Vec<PathSegment> {
    path.split('.')
        .map(|segment| {
            // Check if it's an array index like "m_Component[0]"
            if let Some(idx_start) = segment.find('[') {
                let field = &segment[..idx_start];
                if let Some(idx_end) = segment.find(']') {
                    if let Ok(index) = segment[idx_start + 1..idx_end].parse::<usize>() {
                        return PathSegment::ArrayIndex {
                            field: field.to_string(),
                            index,
                        };
                    }
                }
            }
            PathSegment::Field(segment.to_string())
        })
        .collect()
}
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_file_id_generation() {
        let mut instance = PrefabInstance {
            documents: Vec::new(),
            file_id_map: HashMap::new(),
            overrides: HashMap::new(),
            next_file_id: i64::MAX,
            source_path: PathBuf::from("test.prefab"),
        };

        let id1 = instance.generate_file_id();
        let id2 = instance.generate_file_id();

        // Should decrement
        assert!(id1.as_i64() > id2.as_i64());
        assert_eq!(id1.as_i64() - 1, id2.as_i64());
    }

    #[test]
    fn test_file_id_generation_starts_at_max() {
        let mut instance = PrefabInstance {
            documents: Vec::new(),
            file_id_map: HashMap::new(),
            overrides: HashMap::new(),
            next_file_id: i64::MAX,
            source_path: PathBuf::from("test.prefab"),
        };

        let id1 = instance.generate_file_id();
        assert_eq!(id1.as_i64(), i64::MAX);
    }

    #[test]
    fn test_yaml_path_parsing_simple() {
        let path = "m_Name";
        let segments = parse_yaml_path(path);
        assert_eq!(segments.len(), 1);
        assert_eq!(segments[0], PathSegment::Field("m_Name".to_string()));
    }

    #[test]
    fn test_yaml_path_parsing_nested() {
        let path = "m_LocalPosition.x";
        let segments = parse_yaml_path(path);
        assert_eq!(segments.len(), 2);
        assert_eq!(
            segments[0],
            PathSegment::Field("m_LocalPosition".to_string())
        );
        assert_eq!(segments[1], PathSegment::Field("x".to_string()));
    }

    #[test]
    fn test_yaml_path_parsing_array() {
        let path = "m_Component[0]";
        let segments = parse_yaml_path(path);
        assert_eq!(segments.len(), 1);
        assert_eq!(
            segments[0],
            PathSegment::ArrayIndex {
                field: "m_Component".to_string(),
                index: 0
            }
        );
    }

    #[test]
    fn test_yaml_path_parsing_array_with_field() {
        let path = "m_Component[0].fileID";
        let segments = parse_yaml_path(path);
        assert_eq!(segments.len(), 2);
        assert_eq!(
            segments[0],
            PathSegment::ArrayIndex {
                field: "m_Component".to_string(),
                index: 0
            }
        );
        assert_eq!(segments[1], PathSegment::Field("fileID".to_string()));
    }
}
@@ -1,162 +1,402 @@
 //! Integration tests for parsing real Unity projects

 use cursebreaker_parser::UnityFile;
-use std::path::Path;
+use std::path::{Path, PathBuf};
+use std::process::Command;
+use std::time::Instant;

-#[test]
-fn test_parse_cardgrabber_prefab() {
-    let path = Path::new("data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Hand/CardGrabber.prefab");
-
-    // Skip if the file doesn't exist (CI/CD might not have submodules)
-    if !path.exists() {
-        eprintln!("Skipping test: file not found at {:?}", path);
-        return;
-    }
-
-    let file = UnityFile::from_path(path).expect("Failed to parse CardGrabber.prefab");
-
-    // Verify we parsed multiple documents
-    assert!(file.documents.len() > 0, "Should have at least one document");
-
-    // Find the GameObject
-    let game_objects = file.get_documents_by_class("GameObject");
-    assert!(!game_objects.is_empty(), "Should have at least one GameObject");
-
-    let game_object = game_objects[0];
-    assert_eq!(game_object.type_id, 1, "GameObject should have type ID 1");
-
-    // Verify the name property exists
-    if let Some(go_props) = game_object.get("GameObject") {
-        if let Some(props) = go_props.as_object() {
-            let has_name = props.contains_key("m_Name");
-            assert!(has_name, "GameObject should have m_Name property");
-        }
-    }
-
-    // Find RectTransform
-    let transforms = file.get_documents_by_class("RectTransform");
-    assert!(!transforms.is_empty(), "Should have at least one RectTransform");
-
-    let transform = transforms[0];
-    assert_eq!(transform.type_id, 224, "RectTransform should have type ID 224");
-}
+/// Test project configuration
+struct TestProject {
+    name: &'static str,
+    repo_url: &'static str,
+    branch: Option<&'static str>,
+}

-#[test]
-fn test_parse_scene_file() {
-    let path = Path::new("data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Scenes/Scene01MainMenu.unity");
-
-    // Skip if the file doesn't exist
-    if !path.exists() {
-        eprintln!("Skipping test: file not found at {:?}", path);
-        return;
-    }
-
-    let file = UnityFile::from_path(path).expect("Failed to parse Scene01MainMenu.unity");
-
-    // Scenes typically have many documents
-    assert!(file.documents.len() > 10, "Scene should have many documents");
-
-    // Should have GameObjects
-    let game_objects = file.get_documents_by_class("GameObject");
-    assert!(!game_objects.is_empty(), "Scene should have GameObjects");
-
-    println!("Parsed {} documents from scene", file.documents.len());
-    println!("Found {} GameObjects", game_objects.len());
-}
+impl TestProject {
+    const VR_HORROR: TestProject = TestProject {
+        name: "VR_Horror_YouCantRun",
+        repo_url: "https://github.com/Unity3D-Projects/VR_Horror_YouCantRun.git",
+        branch: None,
+    };
+
+    const PIRATE_PANIC: TestProject = TestProject {
+        name: "PiratePanic",
+        repo_url: "https://github.com/Unity-Technologies/PiratePanic.git",
+        branch: None,
+    };
+}

-#[test]
-fn test_parse_multiple_prefabs() {
-    let prefab_paths = [
-        "data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Hand/CostPanel.prefab",
-        "data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Hand/GoldPanel.prefab",
-        "data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Map/Node.prefab",
-    ];
-
-    let mut total_documents = 0;
-
-    for path_str in &prefab_paths {
-        let path = Path::new(path_str);
-
-        if !path.exists() {
-            eprintln!("Skipping test: file not found at {:?}", path);
-            continue;
-        }
-
-        match UnityFile::from_path(path) {
-            Ok(file) => {
-                assert!(file.documents.len() > 0, "File {:?} should have documents", path);
-                total_documents += file.documents.len();
-                println!("Parsed {:?}: {} documents", path.file_name().unwrap(), file.documents.len());
+/// Statistics gathered during parsing
+#[derive(Debug, Default)]
+struct ParsingStats {
+    total_files: usize,
+    scenes: usize,
+    prefabs: usize,
+    assets: usize,
+    errors: Vec<(PathBuf, String)>,
+    total_entities: usize,
+    total_documents: usize,
+    parse_time_ms: u128,
+}
+
+impl ParsingStats {
+    fn print_summary(&self) {
+        println!("\n{}", "=".repeat(60));
+        println!("Parsing Statistics");
+        println!("{}", "=".repeat(60));
+        println!("  Total files found: {}", self.total_files);
+        println!("  Scenes parsed: {}", self.scenes);
+        println!("  Prefabs parsed: {}", self.prefabs);
+        println!("  Assets parsed: {}", self.assets);
+        println!("  Total entities: {}", self.total_entities);
+        println!("  Total documents: {}", self.total_documents);
+        println!("  Parse time: {} ms", self.parse_time_ms);
+
+        if !self.errors.is_empty() {
+            println!("\n  Errors encountered: {}", self.errors.len());
+            println!("\n  Error details:");
+            for (path, error) in self.errors.iter().take(10) {
+                println!("    - {}", path.display());
+                println!("      Error: {}", error);
+            }
+            if self.errors.len() > 10 {
+                println!("    ... and {} more errors", self.errors.len() - 10);
+            }
+        }
+
+        let success_rate = if self.total_files > 0 {
+            ((self.total_files - self.errors.len()) as f64 / self.total_files as f64) * 100.0
+        } else {
+            0.0
+        };
+        println!("\n  Success rate: {:.2}%", success_rate);
+        println!("{}", "=".repeat(60));
+    }
+}
|
||||
|
||||
/// Clone a git repository for testing
fn clone_test_project(project: &TestProject) -> std::io::Result<PathBuf> {
    let test_data_dir = PathBuf::from("test_data");
    std::fs::create_dir_all(&test_data_dir)?;

    let project_path = test_data_dir.join(project.name);

    // Skip if already cloned
    if project_path.exists() {
        println!("Project already cloned at: {}", project_path.display());
        return Ok(project_path);
    }

    println!("Cloning {} from {}...", project.name, project.repo_url);

    let mut cmd = Command::new("git");
    cmd.arg("clone");

    if let Some(branch) = project.branch {
        cmd.arg("--branch").arg(branch);
    }

    cmd.arg("--depth").arg("1"); // Shallow clone for speed
    cmd.arg(project.repo_url);
    cmd.arg(&project_path);

    let output = cmd.output()?;

    if !output.status.success() {
        eprintln!("Git clone failed: {}", String::from_utf8_lossy(&output.stderr));
        return Err(std::io::Error::new(
            std::io::ErrorKind::Other,
            "Git clone failed",
        ));
    }

    println!("Successfully cloned to: {}", project_path.display());
    Ok(project_path)
}

/// Recursively find all Unity files in a directory
fn find_unity_files(dir: &Path) -> Vec<PathBuf> {
    let mut files = Vec::new();

    if !dir.exists() || !dir.is_dir() {
        return files;
    }

    fn visit_dir(dir: &Path, files: &mut Vec<PathBuf>) {
        if let Ok(entries) = std::fs::read_dir(dir) {
            for entry in entries.flatten() {
                let path = entry.path();

                // Skip common Unity directories that don't contain source assets
                if let Some(name) = path.file_name().and_then(|n| n.to_str()) {
                    if name == "Library" || name == "Temp" || name == "Builds" || name == ".git" {
                        continue;
                    }
                }

                if path.is_dir() {
                    visit_dir(&path, files);
                } else if let Some(ext) = path.extension().and_then(|e| e.to_str()) {
                    if ext == "unity" || ext == "prefab" || ext == "asset" {
                        files.push(path);
                    }
                }
            }
        }
    }

    visit_dir(dir, &mut files);
    files
}

/// Parse all Unity files in a project and collect statistics
fn parse_project(project_path: &Path) -> ParsingStats {
    let mut stats = ParsingStats::default();

    println!("\nFinding Unity files in {}...", project_path.display());
    let files = find_unity_files(project_path);
    stats.total_files = files.len();

    println!("Found {} Unity files", files.len());
    println!("\nParsing files...");

    let start_time = Instant::now();

    for (i, file_path) in files.iter().enumerate() {
        // Print progress
        if (i + 1) % 10 == 0 || i == 0 {
            println!(
                " [{}/{}] Parsing: {}",
                i + 1,
                files.len(),
                file_path.file_name().unwrap().to_string_lossy()
            );
        }

        match UnityFile::from_path(file_path) {
            Ok(unity_file) => match unity_file {
                UnityFile::Scene(scene) => {
                    stats.scenes += 1;
                    stats.total_entities += scene.entity_map.len();
                }
                UnityFile::Prefab(prefab) => {
                    stats.prefabs += 1;
                    stats.total_documents += prefab.documents.len();
                }
                UnityFile::Asset(asset) => {
                    stats.assets += 1;
                    stats.total_documents += asset.documents.len();
                }
            },
            Err(e) => {
                // Record the failure; callers decide how many errors are acceptable.
                stats.errors.push((file_path.clone(), e.to_string()));
            }
        }
    }

    stats.parse_time_ms = start_time.elapsed().as_millis();
    stats
}

/// Test parsing a specific project
fn test_project(project: &TestProject) {
    println!("\n{}", "=".repeat(60));
    println!("Testing: {}", project.name);
    println!("{}", "=".repeat(60));

    // Clone the project
    let project_path = match clone_test_project(project) {
        Ok(path) => path,
        Err(e) => {
            eprintln!("Failed to clone project: {}", e);
            eprintln!("Skipping project test (git may not be available)");
            return;
        }
    };

    // Parse all files
    let stats = parse_project(&project_path);

    // Print summary
    stats.print_summary();

    // Assert basic expectations
    assert!(
        stats.total_files > 0,
        "Should find at least some Unity files"
    );

    // Allow some errors but not too many
    let error_rate = if stats.total_files > 0 {
        (stats.errors.len() as f64 / stats.total_files as f64) * 100.0
    } else {
        0.0
    };

    if error_rate > 50.0 {
        panic!(
            "Error rate too high: {:.2}% ({}/{})",
            error_rate, stats.errors.len(), stats.total_files
        );
    }
}

/// Test detailed parsing of specific file types
fn test_detailed_parsing(project_path: &Path) {
    println!("\n{}", "=".repeat(60));
    println!("Detailed Parsing Tests");
    println!("{}", "=".repeat(60));

    let files = find_unity_files(project_path);

    // Test scene parsing
    if let Some(scene_file) = files.iter().find(|f| {
        f.extension()
            .and_then(|e| e.to_str())
            .map_or(false, |e| e == "unity")
    }) {
        println!(
            "\nTesting scene parsing: {}",
            scene_file.file_name().unwrap().to_string_lossy()
        );
        match UnityFile::from_path(scene_file) {
            Ok(UnityFile::Scene(scene)) => {
                println!("  ✓ Successfully parsed scene");
                println!("  - Entities: {}", scene.entity_map.len());
                println!("  - Path: {}", scene.path.display());

                // Try to access entities
                for (file_id, entity) in scene.entity_map.iter().take(3) {
                    println!("  - FileID {} -> Entity {:?}", file_id, entity);
                }
            }
            Ok(_) => println!("  ✗ File was not parsed as scene"),
            Err(e) => println!("  ✗ Parse error: {}", e),
        }
    }

    // Test prefab parsing and instancing
    if let Some(prefab_file) = files.iter().find(|f| {
        f.extension()
            .and_then(|e| e.to_str())
            .map_or(false, |e| e == "prefab")
    }) {
        println!(
            "\nTesting prefab parsing: {}",
            prefab_file.file_name().unwrap().to_string_lossy()
        );
        match UnityFile::from_path(prefab_file) {
            Ok(UnityFile::Prefab(prefab)) => {
                println!("  ✓ Successfully parsed prefab");
                println!("  - Documents: {}", prefab.documents.len());
                println!("  - Path: {}", prefab.path.display());

                // Test instantiation
                println!("\n  Testing prefab instantiation:");
                let instance = prefab.instantiate();
                println!(
                    "  ✓ Created instance with {} remapped FileIDs",
                    instance.file_id_map().len()
                );

                // Test override system
                if let Some(first_doc) = prefab.documents.first() {
                    let mut instance2 = prefab.instantiate();
                    let result = instance2.override_value(
                        first_doc.file_id,
                        "m_Name",
                        serde_yaml::Value::String("TestName".to_string()),
                    );
                    if result.is_ok() {
                        println!("  ✓ Override system working");
                    } else {
                        println!("  - Override test: {}", result.unwrap_err());
                    }
                }

                // List document types
                let mut type_counts = std::collections::HashMap::new();
                for doc in &prefab.documents {
                    *type_counts.entry(&doc.class_name).or_insert(0) += 1;
                }
                println!("  - Component types:");
                for (class_name, count) in type_counts.iter() {
                    println!("    - {}: {}", class_name, count);
                }
            }
            Ok(_) => println!("  ✗ File was not parsed as prefab"),
            Err(e) => println!("  ✗ Parse error: {}", e),
        }
    }
}

#[test]
fn test_file_id_lookup() {
    let path = Path::new("data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Hand/CardGrabber.prefab");

    if !path.exists() {
        eprintln!("Skipping test: file not found at {:?}", path);
        return;
    }

    let file = UnityFile::from_path(path).expect("Failed to parse file");

    // Get the first document's file ID
    if let Some(first_doc) = file.documents.first() {
        let file_id = first_doc.file_id;

        // Look it up
        let found = file.get_document(file_id);
        assert!(found.is_some(), "Should be able to find document by file ID");
        assert_eq!(found.unwrap().file_id, file_id, "Found document should have correct file ID");
    }
}

#[test]
fn test_vr_horror_project() {
    test_project(&TestProject::VR_HORROR);
}

#[test]
fn test_get_documents_by_type() {
    let path = Path::new("data/tests/unity-sampleproject/PiratePanic/Assets/PiratePanic/Prefabs/Menu/Battle/Hand/CardGrabber.prefab");

    if !path.exists() {
        eprintln!("Skipping test: file not found at {:?}", path);
        return;
    }

    let file = UnityFile::from_path(path).expect("Failed to parse file");

    // Get all GameObjects (type ID 1)
    let game_objects = file.get_documents_by_type(1);
    assert!(!game_objects.is_empty(), "Should find GameObjects by type ID");

    // Verify they're actually GameObjects
    for go in game_objects {
        assert_eq!(go.type_id, 1, "All returned documents should have type ID 1");
        assert!(go.is_game_object(), "Document should be identified as GameObject");
    }
}

#[test]
#[ignore] // Ignored by default; run with --ignored to test
fn test_pirate_panic_project() {
    test_project(&TestProject::PIRATE_PANIC);
}

#[test]
fn test_error_handling_invalid_file() {
    let result = UnityFile::from_path("nonexistent_file.unity");
    assert!(result.is_err(), "Should return error for nonexistent file");
}

#[test]
fn test_vr_horror_detailed() {
    let project_path = match clone_test_project(&TestProject::VR_HORROR) {
        Ok(path) => path,
        Err(e) => {
            eprintln!("Failed to clone project: {}", e);
            eprintln!("Skipping detailed test (git may not be available)");
            return;
        }
    };
    test_detailed_parsing(&project_path);
}

#[test]
fn test_error_handling_invalid_format() {
    // Create a temporary file with invalid content
    let temp_dir = std::env::temp_dir();
    let temp_file = temp_dir.join("invalid_unity_file.unity");
    std::fs::write(&temp_file, "This is not a Unity file").expect("Failed to write temp file");

    let result = UnityFile::from_path(&temp_file);
    assert!(result.is_err(), "Should return error for invalid Unity file format");

    // Clean up
    let _ = std::fs::remove_file(&temp_file);
}

/// Benchmark parsing performance
#[test]
#[ignore]
fn benchmark_parsing() {
    let project_path = match clone_test_project(&TestProject::VR_HORROR) {
        Ok(path) => path,
        Err(_) => {
            eprintln!("Skipping benchmark (git not available)");
            return;
        }
    };

    println!("\n{}", "=".repeat(60));
    println!("Parsing Performance Benchmark");
    println!("{}", "=".repeat(60));

    let files = find_unity_files(&project_path);
    let total_size: u64 = files
        .iter()
        .filter_map(|f| std::fs::metadata(f).ok())
        .map(|m| m.len())
        .sum();

    println!("Total files: {}", files.len());
    println!("Total size: {} KB", total_size / 1024);

    let start = Instant::now();
    let stats = parse_project(&project_path);
    let elapsed = start.elapsed();

    println!("\nParsing completed in {:?}", elapsed);
    println!(
        "Average time per file: {:.2} ms",
        elapsed.as_millis() as f64 / files.len() as f64
    );
    println!(
        "Throughput: {:.2} files/sec",
        files.len() as f64 / elapsed.as_secs_f64()
    );
    println!(
        "Throughput: {:.2} KB/sec",
        (total_size / 1024) as f64 / elapsed.as_secs_f64()
    );

    stats.print_summary();
}