restructure + types
@@ -1,627 +0,0 @@
# Prefab Instantiation Deep Dive

This document explains how prefabs and nested prefabs are instantiated in the Unity Parser.

## Table of Contents

1. [Core Concepts](#core-concepts)
2. [Prefab Representation](#prefab-representation)
3. [Simple Prefab Instantiation](#simple-prefab-instantiation)
4. [Nested Prefab System](#nested-prefab-system)
5. [The 4-Pass ECS Building Process](#the-4-pass-ecs-building-process)
6. [FileID Remapping](#fileid-remapping)
7. [Code Examples](#code-examples)

---

## Core Concepts

### What is a Prefab?

In Unity, a **prefab** is a reusable template that can be instantiated multiple times in scenes. Think of it like a blueprint:

- A prefab file (`.prefab`) contains GameObjects and Components as YAML
- When placed in a scene, Unity creates an **instance** of that prefab
- Each instance can have **modifications** (overrides) applied to it

### What is a Nested Prefab?

A **nested prefab** is a prefab that contains instances of other prefabs within it. For example:

- `Player.prefab` might contain a nested `Weapon.prefab`
- When you instantiate `Player.prefab`, it must also instantiate `Weapon.prefab`
- This nesting can go multiple levels deep

### Key Design Decision: Why Raw YAML?

Prefabs are stored as **raw YAML documents** (`UnityPrefab`) rather than fully parsed ECS worlds (`UnityScene`), for three reasons:

1. **Efficient cloning**: Prefabs need to be instantiated multiple times with different values
2. **YAML overrides**: Unity stores modifications as YAML property-path overrides (e.g., `m_LocalPosition.x = 100`)
3. **FileID remapping**: Each instance needs unique FileIDs to avoid collisions

Scenes, on the other hand, are parsed directly into Sparsey ECS worlds, since they never need cloning.

---

## Prefab Representation

### UnityPrefab Structure

From `unity-parser/src/model/mod.rs:93-146`:

```rust
pub struct UnityPrefab {
    /// Path to the prefab file
    pub path: PathBuf,

    /// Raw YAML documents that make up this prefab
    pub documents: Vec<RawDocument>,
}
```

Each `RawDocument` contains:

- `type_id`: Unity type ID (e.g., 1 = GameObject, 4 = Transform)
- `file_id`: Unique identifier within the file
- `class_name`: The Unity class name (e.g., `"GameObject"`, `"Transform"`, `"MonoBehaviour"`)
- `yaml`: The raw YAML content as a `serde_yaml::Value`

### Loading a Prefab

When you call `UnityFile::from_path("Player.prefab")`, the parser:

1. Reads the file content
2. Validates the Unity YAML header
3. Splits the YAML into separate documents (at `--- !u!N &ID` separators)
4. Creates `RawDocument` objects with the extracted metadata
5. Returns `UnityFile::Prefab(UnityPrefab { path, documents })`

**Important**: At this stage, NO ECS world is created. The prefab stays as raw YAML.
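
Step 3 above can be sketched with plain string handling. This is an illustrative sketch only, not the parser's actual code; `RawDoc` and `split_documents` are hypothetical names:

```rust
/// Sketch of step 3: split Unity YAML text into documents at
/// `--- !u!N &ID` separators. Hypothetical names, not the real API.
#[derive(Debug)]
struct RawDoc {
    type_id: u32,
    file_id: i64,
    body: String,
}

fn split_documents(text: &str) -> Vec<RawDoc> {
    let mut docs: Vec<RawDoc> = Vec::new();
    for line in text.lines() {
        // A document header looks like: --- !u!1 &100
        if let Some(rest) = line.strip_prefix("--- !u!") {
            let mut parts = rest.splitn(2, " &");
            let type_id = parts.next().and_then(|s| s.trim().parse().ok());
            let file_id = parts.next().and_then(|s| s.trim().parse().ok());
            if let (Some(type_id), Some(file_id)) = (type_id, file_id) {
                docs.push(RawDoc { type_id, file_id, body: String::new() });
                continue;
            }
        }
        // Every non-header line belongs to the most recent document.
        if let Some(doc) = docs.last_mut() {
            doc.body.push_str(line);
            doc.body.push('\n');
        }
    }
    docs
}
```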

---

## Simple Prefab Instantiation

### Step-by-Step Process

Let's walk through instantiating a simple prefab (no nesting):

#### 1. Create a PrefabInstance

From `unity-parser/src/types/unity_types/prefab_instance.rs:49-70`:

```rust
let prefab = UnityFile::from_path("Player.prefab")?;
let mut instance = prefab.instantiate();
```

`instantiate()` calls `PrefabInstance::new()`, which:

- Clones all documents from the source prefab
- Initializes FileID remapping (creates new unique IDs)
- Remaps all FileID references in the YAML

#### 2. Apply Overrides (Optional)

You can modify the prefab before spawning:

```rust
instance.override_value(file_id, "m_Name", "Player1".into())?;
instance.override_value(file_id, "m_LocalPosition.x", 100.0.into())?;
```

This stores the overrides in a HashMap; they are applied just before spawning.

#### 3. Spawn into ECS World

```rust
let entities = instance.spawn_into(&mut world, &mut entity_map, guid_resolver, prefab_resolver)?;
```

`spawn_into()` (`prefab_instance.rs:291-309`):

1. Applies all stored overrides to the YAML
2. Calls `build_world_from_documents_into()` to create entities
3. Returns a `Vec` of the spawned entities

### The Spawning Process

`build_world_from_documents_into()` from `unity-parser/src/ecs/builder.rs:160-265`:

**Pass 1**: Create entities for GameObjects

- Iterates through documents with `type_id == 1` (GameObject)
- Spawns ECS entities with a `GameObject` component
- Adds each to the entity map (FileID → Entity)

**Pass 2**: Attach components

- Iterates through the remaining documents (Transform, RectTransform, MonoBehaviour, etc.)
- Looks up the `m_GameObject` reference to find the owner entity
- Parses the component and attaches it to that entity

**Pass 3**: Execute deferred linking

- Resolves Transform parent/child relationships
- Converts FileID references to Entity handles
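
The reason for separate passes is ordering: a component may reference a GameObject that appears later in the file. A toy sketch of the create-then-attach pattern, where the `Doc` type and entity allocator are stand-ins rather than the real builder types:

```rust
use std::collections::HashMap;

// Stand-in types; the real builder works on RawDocument and a Sparsey world.
struct Doc { type_id: u32, file_id: i64, m_game_object: Option<i64> }
type Entity = usize;

fn build(docs: &[Doc]) -> (HashMap<i64, Entity>, Vec<(Entity, i64)>) {
    let mut entity_map: HashMap<i64, Entity> = HashMap::new();
    // Pass 1: every GameObject document (type_id == 1) becomes an entity.
    for doc in docs.iter().filter(|d| d.type_id == 1) {
        let entity = entity_map.len(); // toy entity allocator
        entity_map.insert(doc.file_id, entity);
    }
    // Pass 2: attach the remaining documents to their owner via m_GameObject,
    // which now resolves even if the owner appeared later in the file.
    let mut attached = Vec::new();
    for doc in docs.iter().filter(|d| d.type_id != 1) {
        if let Some(owner) = doc.m_game_object.and_then(|id| entity_map.get(&id)) {
            attached.push((*owner, doc.file_id));
        }
    }
    (entity_map, attached)
}
```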

---

## Nested Prefab System

### How Unity Represents Nested Prefabs

When you place a prefab inside another prefab in Unity, it creates a **PrefabInstance** document:

```yaml
--- !u!1001 &1234567890
PrefabInstance:
  m_SourcePrefab: {fileID: 0, guid: "091c537484687e9419460cdcd7038234", type: 3}
  m_Modification:
    - target: {fileID: 5678}
      propertyPath: m_Name
      value: "ModifiedName"
    - target: {fileID: 5679}
      propertyPath: m_LocalPosition.x
      value: 10.5
```

### PrefabInstanceComponent

From `unity-parser/src/types/unity_types/prefab_instance.rs:322-366`:

```rust
pub struct PrefabInstanceComponent {
    /// External reference to the source prefab (by GUID)
    pub prefab_ref: ExternalRef, // contains the GUID string

    /// Modifications applied to the nested prefab
    pub modifications: Vec<PrefabModification>,
}

pub struct PrefabModification {
    pub target_file_id: FileID, // which object to modify
    pub property_path: String,  // dot notation: "m_Name", "m_LocalPosition.x"
    pub value: Value,           // the new value
}
```

### GUID Resolution

Before we can instantiate a nested prefab, we need to resolve its GUID to a file path.

**PrefabGuidResolver** (`unity-parser/src/parser/prefab_guid_resolver.rs`):

1. **Initialization**: Scans the Unity project directory for `.prefab.meta` files
2. **GUID extraction**: Parses each `.meta` file to get the GUID
3. **Mapping**: Builds a HashMap: `Guid → PathBuf`

Example:

```
Assets/Prefabs/Player.prefab.meta contains:
  guid: 091c537484687e9419460cdcd7038234

→ Maps: 0x091c537484687e9419460cdcd7038234 → "Assets/Prefabs/Player.prefab"
```
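
The GUID-extraction step boils down to finding the `guid:` line in the `.meta` text. A minimal sketch with a hypothetical `extract_guid` helper; the real resolver additionally walks the project tree and builds the full map:

```rust
/// Extract the `guid:` value from the text of a Unity `.meta` file.
/// Illustrative only; the real PrefabGuidResolver also scans the
/// project directory for `.prefab.meta` files and builds Guid -> PathBuf.
fn extract_guid(meta_text: &str) -> Option<String> {
    meta_text
        .lines()
        .find_map(|line| line.trim().strip_prefix("guid:"))
        .map(|g| g.trim().to_string())
}
```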

### PrefabResolver

**PrefabResolver** (`prefab_instance.rs:430-706`) handles loading and recursive instantiation:

```rust
pub struct PrefabResolver<'a> {
    /// Cache of loaded prefabs (GUID → Prefab)
    prefab_cache: HashMap<String, Arc<UnityPrefab>>,

    /// Mapping from GUID to file path
    guid_to_path: HashMap<String, PathBuf>,

    /// Stack for circular reference detection
    instantiation_stack: Vec<String>,

    /// GUID resolver for MonoBehaviour scripts
    guid_resolver: Option<&'a GuidResolver>,

    /// Prefab GUID resolver
    prefab_guid_resolver: Option<&'a PrefabGuidResolver>,
}
```

### Nested Prefab Instantiation Flow

From `prefab_instance.rs:496-572` - `instantiate_from_component()`:

```
PrefabInstanceComponent found in scene
  ↓
1. Extract GUID from component.prefab_ref
  ↓
2. Load prefab via GUID resolver: GUID → Path → UnityPrefab
  ↓
3. Create PrefabInstance (clone + remap FileIDs)
  ↓
4. Apply modifications from component.modifications
  ↓
5. Spawn prefab into world (creates entities)
  ↓
6. Link spawned root to parent entity (if provided)
  ↓
Returns: Vec<Entity> of spawned entities
```

### Recursive Nested Prefabs

For deeply nested prefabs (prefabs containing prefabs containing prefabs...):

**instantiate_recursive()** (`prefab_instance.rs:574-643`):

```
Start with root prefab
  ↓
1. Check for circular references (using instantiation_stack)
  ↓
2. Push prefab ID onto the stack
  ↓
3. Create PrefabInstance
  ↓
4. Scan documents for nested PrefabInstance components
  ↓
5. For each nested prefab:
   - Load the referenced prefab by GUID
   - Apply its modifications
   - Recursively call instantiate_recursive()
  ↓
6. Spawn this prefab's entities
  ↓
7. Pop from the stack
```

This handles arbitrary nesting depth while preventing infinite loops from circular references.
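
The circular-reference guard from step 1 can be sketched as a stack check before recursing. `instantiate` and its `deps` callback below are illustrative stand-ins for the real `instantiate_recursive()`:

```rust
// Sketch of the instantiation_stack guard. `deps` returns the GUIDs of
// prefabs nested inside the given prefab; both names are hypothetical.
fn instantiate(
    guid: &str,
    deps: &dyn Fn(&str) -> Vec<String>,
    stack: &mut Vec<String>,
) -> Result<usize, String> {
    // Step 1: a GUID already on the stack means we looped back to an ancestor.
    if stack.iter().any(|g| g == guid) {
        return Err(format!("circular prefab reference: {}", guid));
    }
    stack.push(guid.to_string()); // step 2
    let mut spawned = 1; // this prefab's own entities (toy count)
    for nested in deps(guid) {
        spawned += instantiate(&nested, deps, stack)?; // step 5
    }
    stack.pop(); // step 7
    Ok(spawned)
}
```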

---

## The 4-Pass ECS Building Process

When parsing a Unity scene, the ECS builder uses a multi-pass approach to handle prefabs.

From `unity-parser/src/ecs/builder.rs:31-138`:

### Pass 1: Create GameObject Entities

```rust
for doc in documents.iter().filter(|d| d.type_id == 1 || d.class_name == "GameObject") {
    let entity = spawn_game_object(&mut world, doc)?;
    entity_map.insert(doc.file_id, entity);
}

// Also create entities for PrefabInstances
for doc in documents.iter().filter(|d| d.type_id == 1001 || d.class_name == "PrefabInstance") {
    let entity = world.create(());
    entity_map.insert(doc.file_id, entity);

    // Parse and attach PrefabInstanceComponent
    if let Some(prefab_comp) = PrefabInstanceComponent::parse(yaml, &ctx) {
        world.insert(entity, (prefab_comp,));
    }
}
```

At this stage:

- All GameObjects → entities
- All PrefabInstances → entities with a `PrefabInstanceComponent` attached
- `entity_map` tracks FileID → Entity

### Pass 2: Attach Components

```rust
for doc in documents.iter().filter(|d| d.type_id != 1 && d.class_name != "GameObject") {
    attach_component(&mut world, doc, &linking_ctx, &type_filter, guid_resolver)?;
}
```

- Parses Transform, RectTransform, MonoBehaviour, etc.
- Looks up the `m_GameObject` reference to find the owner entity
- Attaches the parsed component to that entity

### Pass 2.5: Resolve Prefab Instances (NEW!)

This is where the magic happens for nested prefabs (`builder.rs:92-132`):

```rust
if let Some(prefab_resolver_ref) = prefab_guid_resolver {
    let mut prefab_resolver = PrefabResolver::from_resolvers(guid_resolver, prefab_resolver_ref);

    // Query for entities with PrefabInstanceComponent
    let prefab_entities: Vec<_> = world.query::<&PrefabInstanceComponent>().collect();

    for (entity, component) in prefab_entities {
        // Instantiate the referenced prefab
        match prefab_resolver.instantiate_from_component(
            &component,
            Some(entity), // parent entity
            &mut world,
            &mut entity_map,
        ) {
            Ok(spawned) => {
                info!("Spawned {} entities from prefab", spawned.len());
            }
            Err(e) => {
                warn!("Failed to instantiate prefab: {}", e);
            }
        }

        // Remove PrefabInstanceComponent after resolution
        world.remove::<(PrefabInstanceComponent,)>(entity);
    }
}
```

**Key Points**:

1. Only runs if a `PrefabGuidResolver` is provided
2. Finds all entities with a `PrefabInstanceComponent`
3. For each one:
   - Resolves GUID → loads prefab
   - Creates instance with modifications
   - Spawns into current world
   - Links to parent entity
4. Removes the `PrefabInstanceComponent` (no longer needed)

### Pass 3: Execute Deferred Linking

```rust
let entity_map = linking_ctx.execute_callbacks(&mut world);
```

- Resolves Transform parent/child relationships
- Converts FileID references to actual Entity handles
- This happens AFTER prefab instantiation so that prefab entities are in the map

---

## FileID Remapping

### Why Remap FileIDs?

Unity FileIDs are unique within a single file, but when instantiating multiple prefab instances we need to ensure there are no collisions:

```
Scene.unity:
  GameObject &100   ← FileID = 100
  Transform  &101   ← FileID = 101

Player.prefab (first instance):
  GameObject &100   ← COLLISION!
  Transform  &200

Player.prefab (second instance):
  GameObject &100   ← COLLISION!
  Transform  &200
```

### The Solution

From `prefab_instance.rs:72-114`:

**Step 1**: Generate unique IDs for each document

```rust
fn generate_file_id(&mut self) -> FileID {
    let id = self.next_file_id; // starts at i64::MAX
    self.next_file_id -= 1;     // decrement
    FileID::from_i64(id)
}
```

- Starts at `i64::MAX` and decrements: `9223372036854775807`, `9223372036854775806`, ...
- Scene FileIDs are typically small positive numbers (1, 100, 1000)
- Keeping the two ranges far apart avoids collisions

**Step 2**: Build the mapping table

```rust
fn initialize_file_id_mapping(&mut self) {
    for original_id in original_ids {
        let new_id = self.generate_file_id();
        self.file_id_map.insert(original_id, new_id);
    }
}
```

Example mapping:

```
Original → New
100      → 9223372036854775807
200      → 9223372036854775806
```

**Step 3**: Remap all references

```rust
fn remap_yaml_file_refs(&mut self) {
    // Update each document's own FileID
    for doc in &mut self.documents {
        doc.file_id = self.file_id_map[&doc.file_id];
    }

    // Update all FileRef references in the YAML: {fileID: N}
    for doc in &mut self.documents {
        Self::remap_value(&mut doc.yaml, &file_id_map);
    }
}
```

This recursively walks the YAML tree and replaces every `{fileID: 100}` with `{fileID: 9223372036854775807}`.
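
The recursive walk can be illustrated with a toy value type standing in for `serde_yaml::Value` (an assumption made for the sketch; the real `remap_value` matches on serde_yaml's mapping and sequence variants):

```rust
use std::collections::HashMap;

// Toy stand-in for serde_yaml::Value, enough to show the recursion.
#[derive(Debug, PartialEq)]
enum Val {
    Int(i64),
    Map(Vec<(String, Val)>),
    Seq(Vec<Val>),
}

fn remap_value(value: &mut Val, map: &HashMap<i64, i64>) {
    match value {
        Val::Map(entries) => {
            for (key, v) in entries.iter_mut() {
                match v {
                    // {fileID: N} -> {fileID: remapped(N)}
                    Val::Int(id) if key == "fileID" => {
                        if let Some(new_id) = map.get(id) {
                            *v = Val::Int(*new_id);
                        }
                    }
                    _ => remap_value(v, map),
                }
            }
        }
        Val::Seq(items) => items.iter_mut().for_each(|v| remap_value(v, map)),
        Val::Int(_) => {}
    }
}
```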

### Handling Overrides

When applying overrides before spawning:

```rust
instance.override_value(original_file_id, "m_Name", "Player1".into())?;
```

The API accepts the **original** FileID for convenience; internally (simplified):

```rust
fn apply_overrides(&mut self) -> Result<()> {
    for ((file_id, path), value) in &self.overrides {
        // Map original FileID → remapped FileID
        let remapped_id = self.file_id_map.get(file_id)?;

        // Find the document with the remapped ID
        let doc = self.documents.iter_mut()
            .find(|d| d.file_id == *remapped_id)?;

        // Apply the value change
        set_yaml_value(&mut doc.yaml, path, value)?;
    }
    Ok(())
}
```
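
The dot-notation traversal behind `set_yaml_value` can be sketched over a toy mapping type; `Node` and `set_value` here are hypothetical names, not the real API:

```rust
use std::collections::BTreeMap;

// Toy stand-in for the YAML mapping type; enough to show the
// dot-path walk behind set_yaml_value(yaml, "m_LocalPosition.x", v).
#[derive(Debug, PartialEq)]
enum Node {
    Num(f64),
    Map(BTreeMap<String, Node>),
}

fn set_value(node: &mut Node, path: &str, value: f64) -> Result<(), String> {
    match path.split_once('.') {
        // Last segment: overwrite (or create) the leaf.
        None => match node {
            Node::Map(map) => {
                map.insert(path.to_string(), Node::Num(value));
                Ok(())
            }
            Node::Num(_) => Err(format!("cannot set '{}' on a scalar", path)),
        },
        // Intermediate segment: descend into the child mapping.
        Some((head, rest)) => match node {
            Node::Map(map) => {
                let child = map
                    .get_mut(head)
                    .ok_or_else(|| format!("missing key: {}", head))?;
                set_value(child, rest, value)
            }
            Node::Num(_) => Err(format!("'{}' is not a mapping", head)),
        },
    }
}
```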

---

## Code Examples

### Example 1: Manual Prefab Instantiation

```rust
use unity_parser::{UnityFile, UnityPrefab};
use sparsey::World;
use std::collections::HashMap;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load prefab
    let file = UnityFile::from_path("Assets/Prefabs/Player.prefab")?;
    let prefab = match file {
        UnityFile::Prefab(p) => p,
        _ => panic!("Expected prefab"),
    };

    // Create instance with modifications.
    // `file_id` is the original FileID of the document you want to
    // override, taken from the prefab's documents.
    let mut instance = prefab.instantiate();
    instance.override_value(file_id, "m_Name", "Player1".into())?;
    instance.override_value(file_id, "m_LocalPosition.x", 100.0.into())?;

    // Spawn into world
    let mut world = World::new();
    let mut entity_map = HashMap::new();
    let entities = instance.spawn_into(&mut world, &mut entity_map, None, None)?;

    println!("Spawned {} entities", entities.len());
    Ok(())
}
```

### Example 2: Automatic Scene Parsing with Nested Prefabs

```rust
use unity_parser::UnityProject;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize project (builds GUID resolvers)
    let project = UnityProject::from_path("/home/user/UnityProject")?;

    // Parse scene - automatically resolves and instantiates prefabs
    let scene = project.parse_scene("Assets/Scenes/Level1.unity")?;

    println!("Scene has {} entities", scene.entity_map.len());

    // Query entities
    let game_objects = scene.world.borrow::<unity_parser::GameObject>();
    let transforms = scene.world.borrow::<unity_parser::Transform>();

    for (file_id, entity) in &scene.entity_map {
        if let Some(go) = game_objects.get(*entity) {
            if let Some(transform) = transforms.get(*entity) {
                println!("GameObject '{}' at {:?}",
                    go.name(), transform.local_position());
            }
        }
    }

    Ok(())
}
```

### Example 3: Recursive Prefab Loading

```rust
use unity_parser::{UnityFile, PrefabResolver, PrefabGuidResolver};
use sparsey::World;
use std::collections::HashMap;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build prefab GUID resolver
    let prefab_guid_resolver = PrefabGuidResolver::from_project("UnityProject")?;

    // Create prefab resolver
    let mut prefab_resolver = PrefabResolver::from_resolvers(None, &prefab_guid_resolver);

    // Load a prefab with nested prefabs
    let file = UnityFile::from_path("Assets/Prefabs/ComplexPrefab.prefab")?;
    let prefab = match file {
        UnityFile::Prefab(p) => p,
        _ => panic!("Expected prefab"),
    };

    // Recursively instantiate (handles nested prefabs automatically)
    let mut world = World::new();
    let mut entity_map = HashMap::new();
    let entities = prefab_resolver.instantiate_recursive(
        &prefab,
        &mut world,
        &mut entity_map,
    )?;

    println!("Recursively spawned {} entities", entities.len());
    Ok(())
}
```

---

## Summary

### Prefab Instantiation Flow

```
UnityPrefab (raw YAML)
  ↓
instantiate()
  ↓
PrefabInstance (cloned YAML with remapped FileIDs)
  ↓
override_value() (optional)
  ↓
spawn_into()
  ↓
ECS World (Sparsey entities with components)
```

### Nested Prefab Resolution Flow

```
Scene contains PrefabInstance document
  ↓
Pass 1: Create entity with PrefabInstanceComponent
  ↓
Pass 2.5: Find all PrefabInstanceComponent entities
  ↓
For each: GUID → Path → Load Prefab
  ↓
Create instance + apply modifications
  ↓
Recursively check for nested PrefabInstances
  ↓
Spawn all entities into world
  ↓
Link to parent entity
```

### Key Takeaways

1. **Prefabs stay as YAML** until instantiation, for efficient cloning and overrides
2. **FileID remapping** prevents collisions when instantiating multiple times
3. **PrefabGuidResolver** maps GUIDs to file paths for automatic loading
4. **Pass 2.5** in the ECS builder handles automatic prefab instantiation
5. **Recursive instantiation** handles arbitrary nesting depth with circular reference detection
6. **Modifications** are stored as property-path + value pairs and applied before spawning

### Files to Explore

- `unity-parser/src/types/unity_types/prefab_instance.rs` - PrefabInstance, PrefabResolver
- `unity-parser/src/parser/prefab_guid_resolver.rs` - GUID → path mapping
- `unity-parser/src/ecs/builder.rs` - 4-pass ECS building with prefab resolution
- `unity-parser/src/model/mod.rs` - UnityPrefab and UnityScene data structures
cursebreaker-parser/examples/fast_travel_example.rs (new file, 107 lines)
@@ -0,0 +1,107 @@
use cursebreaker_parser::{FastTravelDatabase, FastTravelType};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load all fast travel types from the directory
    let ft_db = FastTravelDatabase::load_from_directory("/home/connor/repos/CBAssets/Data/XMLs")?;

    println!("=== Fast Travel Database Statistics ===");
    println!("Total locations: {}", ft_db.len());
    println!("Regular locations: {}", ft_db.count_by_type(FastTravelType::Location));
    println!("Canoe locations: {}", ft_db.count_by_type(FastTravelType::Canoe));
    println!("Portal locations: {}", ft_db.count_by_type(FastTravelType::Portal));
    println!();

    // Show regular locations
    println!("=== Regular Fast Travel Locations ===");
    let locations = ft_db.get_locations();
    for loc in locations.iter().take(5) {
        println!("  [{}] {} (unlocked: {})", loc.id, loc.name, loc.unlocked);
        if let Some(ref connections) = loc.connections {
            println!("    Connections: {}", connections);
        }
    }
    println!("... and {} more", locations.len().saturating_sub(5));
    println!();

    // Show canoe locations
    println!("=== Canoe Fast Travel Locations ===");
    let canoe_locs = ft_db.get_canoe_locations();
    for loc in &canoe_locs {
        println!("  [{}] {}", loc.id, loc.name);
        if let Some(ref checks) = loc.checks {
            println!("    Requirements: {}", checks);
        }
    }
    println!("Total: {}", canoe_locs.len());
    println!();

    // Show portals
    println!("=== Portal Fast Travel Locations ===");
    let portals = ft_db.get_portals();
    for portal in portals.iter().take(5) {
        println!("  [{}] {}", portal.id, portal.name);
        if let Some((x, y, z)) = portal.get_position() {
            println!("    Position: ({:.2}, {:.2}, {:.2})", x, y, z);
        }
    }
    println!("... and {} more", portals.len().saturating_sub(5));
    println!();

    // Show unlocked locations
    println!("=== Unlocked Locations ===");
    let unlocked = ft_db.get_unlocked_locations();
    for loc in unlocked.iter().take(10) {
        println!("  [{}] {}", loc.id, loc.name);
    }
    println!("Total unlocked: {}", unlocked.len());
    println!();

    // Show locations with requirements
    println!("=== Locations with Requirements ===");
    let with_reqs = ft_db.get_locations_with_requirements();
    for loc in &with_reqs {
        println!("  [{}] {} - {}", loc.id, loc.name, loc.checks.as_ref().unwrap());
    }
    println!("Total with requirements: {}", with_reqs.len());
    println!();

    // Show locations requiring specific trait
    println!("=== Locations requiring Trait 273 ===");
    let trait_locs = ft_db.get_locations_requiring_trait(273);
    for loc in &trait_locs {
        println!("  [{}] {}", loc.id, loc.name);
    }
    println!("Total: {}", trait_locs.len());
    println!();

    // Show connected locations
    println!("=== Connected Locations (examples) ===");
    let connected = ft_db.get_connected_locations();
    for loc in connected.iter().take(5) {
        println!(
            "  [{}] {} connects to: {}",
            loc.id,
            loc.name,
            loc.connections.as_ref().unwrap()
        );
    }
    println!("Total connected: {}", connected.len());
    println!();

    // Find a specific location by ID
    if let Some(loc) = ft_db.get_by_id(4) {
        println!("=== Location Details (ID 4) ===");
        println!("Name: {}", loc.name);
        println!("Type: {}", loc.travel_type);
        println!("Position: {}", loc.position);
        if let Some(ref checks) = loc.checks {
            println!("Requirements: {}", checks);
            println!("Parsed checks:");
            for (check_type, value) in loc.parse_checks() {
                println!("  - {} = {}", check_type, value);
            }
        }
    }

    Ok(())
}
cursebreaker-parser/examples/maps_example.rs (new file, 91 lines)
@@ -0,0 +1,91 @@
use cursebreaker_parser::MapDatabase;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the Maps.xml file
    let map_db = MapDatabase::load_from_xml("/home/connor/repos/CBAssets/Data/XMLs/Maps/Maps.xml")?;

    println!("=== Map Database Statistics ===");
    println!("Total maps loaded: {}", map_db.len());
    println!("Total named maps: {}", map_db.get_named_maps().len());
    println!("Total indoor maps: {}", map_db.get_indoor_maps().len());
    println!("Total isolated maps: {}", map_db.get_isolated_maps().len());
    println!();

    // Show map bounds
    if let Some(((min_x, min_y), (max_x, max_y))) = map_db.get_map_bounds() {
        println!("=== Map Grid Bounds ===");
        println!("X range: {} to {}", min_x, max_x);
        println!("Y range: {} to {}", min_y, max_y);
        println!();
    }

    // Show some specific maps
    println!("=== Sample Maps ===");

    if let Some(map) = map_db.get_by_scene_id("3,10") {
        println!("Map at 3,10:");
        println!("  Name: {}", if map.name.is_empty() { "(unnamed)" } else { &map.name });
        println!("  Music: {}", map.music);
        println!("  Ambience: {}", map.ambience);
        println!("  Indoor: {}", map.indoors);
        if let Some(ref fog_color) = map.fog_color {
            println!("  Fog color: {}", fog_color);
        }
        println!();
    }

    // Show Haywind maps
    println!("=== Maps named 'Haywind' ===");
    let haywind_maps = map_db.get_by_name("Haywind");
    for map in &haywind_maps {
        println!("  Scene ID: {} (Music: {})", map.scene_id, map.music);
    }
    println!("Total: {}", haywind_maps.len());
    println!();

    // Show Thornhill City maps
    println!("=== Maps named 'Thornhill City' ===");
    let thornhill_maps = map_db.get_by_name("Thornhill City");
    for map in &thornhill_maps {
        println!("  Scene ID: {} (Music: {})", map.scene_id, map.music);
    }
    println!("Total: {}", thornhill_maps.len());
    println!();

    // Show all unique map names (first 20)
    println!("=== Unique Map Names (first 20) ===");
    let mut names = map_db.get_all_map_names();
    names.sort();
    for name in names.iter().take(20) {
        println!("  {}", name);
    }
    println!("... and {} more", names.len().saturating_sub(20));
    println!();

    // Show maps with respawn locations
    println!("=== Maps with Respawn Locations ===");
    let respawn_maps = map_db.get_maps_with_respawn();
    for map in respawn_maps.iter().take(5) {
        println!(
            "  {} -> respawns at {}",
            map.scene_id,
            map.respawn_map.as_ref().unwrap_or(&"?".to_string())
        );
    }
    println!("Total maps with respawn: {}", respawn_maps.len());
    println!();

    // Show connected maps
    println!("=== Connected Maps (examples) ===");
    let connected = map_db.get_connected_maps();
    for map in connected.iter().take(5) {
        println!(
            "  {} connects to: {}",
            map.scene_id,
            map.connected_maps.as_ref().unwrap_or(&"?".to_string())
        );
    }
    println!("Total connected maps: {}", connected.len());

    Ok(())
}
cursebreaker-parser/examples/player_houses_example.rs (new file, 98 lines)
@@ -0,0 +1,98 @@
use cursebreaker_parser::PlayerHouseDatabase;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load all player houses from XML
    let ph_db = PlayerHouseDatabase::load_from_xml(
        "/home/connor/repos/CBAssets/Data/XMLs/PlayerHouses/PlayerHouses.xml",
    )?;

    println!("=== Player House Database Statistics ===");
    println!("Total houses: {}", ph_db.len());
    println!("Visible houses: {}", ph_db.get_visible_houses().len());
    println!("Hidden houses: {}", ph_db.get_hidden_houses().len());
    println!();

    // Show all houses sorted by price
    println!("=== All Houses (sorted by price) ===");
    let sorted = ph_db.get_sorted_by_price();
    for house in &sorted {
        let visibility = if house.hidden { "(hidden)" } else { "" };
        println!(
            "  [{}] {} - {} gold {}",
            house.id, house.name, house.price, visibility
        );
    }
    println!();

    // Show price tiers
    println!("=== Houses by Price Tier ===");
    println!("Free houses (tier 0):");
    for house in ph_db.get_by_price_tier(0) {
        println!("  - {}", house.name);
    }

    println!("\nCheap houses (tier 1, < 5000 gold):");
    for house in ph_db.get_by_price_tier(1) {
        println!("  - {} ({} gold)", house.name, house.price);
    }

    println!("\nModerate houses (tier 2, 5000-10000 gold):");
    for house in ph_db.get_by_price_tier(2) {
        println!("  - {} ({} gold)", house.name, house.price);
    }

    println!("\nExpensive houses (tier 3, 10000+ gold):");
    for house in ph_db.get_by_price_tier(3) {
        println!("  - {} ({} gold)", house.name, house.price);
    }
    println!();

    // Show cheapest and most expensive
    println!("=== Price Extremes ===");
    if let Some(cheapest) = ph_db.get_cheapest() {
        println!(
            "Cheapest house: {} - {} gold",
            cheapest.name, cheapest.price
        );
    }
    if let Some(most_expensive) = ph_db.get_most_expensive() {
        println!(
            "Most expensive: {} - {} gold",
            most_expensive.name, most_expensive.price
        );
    }
    println!();

    // Show houses in a specific price range
    println!("=== Houses between 3000-5000 gold ===");
    let mid_range = ph_db.get_by_price_range(3000, 5000);
    for house in mid_range {
        println!("  - {} ({} gold)", house.name, house.price);
    }
    println!();

    // Show affordable houses
    println!("=== Affordable Houses (< 5000 gold) ===");
    let affordable = ph_db.get_affordable_houses();
    for house in &affordable {
        println!("  - {} ({} gold)", house.name, house.price);
    }
    println!("Total affordable: {}", affordable.len());
    println!();

    // Show details of a specific house
    if let Some(house) = ph_db.get_by_id(8) {
        println!("=== House Details (ID 8) ===");
        println!("Name: {}", house.name);
        println!("Description: {}", house.description);
        println!("Price: {} gold", house.price);
        println!("Position: {}", house.position);
        if let Some((x, y, z)) = house.get_position() {
            println!("Coordinates: ({:.2}, {:.2}, {:.2})", x, y, z);
        }
        println!("Hidden: {}", house.hidden);
        println!("Price tier: {}", house.get_price_tier());
    }

    Ok(())
}
|
||||
132 cursebreaker-parser/examples/shops_example.rs Normal file
@@ -0,0 +1,132 @@
use cursebreaker_parser::ShopDatabase;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load all shops from XML
    let shop_db = ShopDatabase::load_from_xml(
        "/home/connor/repos/CBAssets/Data/XMLs/Shops/Shops.xml",
    )?;

    println!("=== Shop Database Statistics ===");
    println!("Total shops: {}", shop_db.len());
    println!("General stores: {}", shop_db.get_general_stores().len());
    println!("Specialized shops: {}", shop_db.get_specialized_shops().len());
    println!("Non-empty shops: {}", shop_db.get_non_empty_shops().len());
    println!("Total items across all shops: {}", shop_db.total_item_count());
    println!("Unique items sold: {}", shop_db.get_all_item_ids().len());
    println!();

    // Show all general stores
    println!("=== General Stores ===");
    let general_stores = shop_db.get_general_stores();
    for shop in &general_stores {
        println!(" [ID {}] {} ({} items)", shop.shop_id, shop.name, shop.item_count());
        if let Some(ref comment) = shop.comment {
            println!("   Comment: {}", comment);
        }
    }
    println!();

    // Show some specialized shops
    println!("=== Specialized Shops (first 10) ===");
    let specialized = shop_db.get_specialized_shops();
    for shop in specialized.iter().take(10) {
        println!(" [ID {}] {} ({} items)", shop.shop_id, shop.name, shop.item_count());
        if let Some(ref comment) = shop.comment {
            println!("   Comment: {}", comment);
        }
    }
    println!("... and {} more", specialized.len().saturating_sub(10));
    println!();

    // Show details of a specific shop
    if let Some(shop) = shop_db.get_by_id(3) {
        println!("=== Shop Details (ID 3) ===");
        println!("Name: {}", shop.name);
        println!("Is General Store: {}", shop.is_general_store);
        println!("Total items: {}", shop.item_count());
        println!("\nItems:");
        for (i, item) in shop.items.iter().take(10).enumerate() {
            print!(" {}) Item ID: {}", i + 1, item.item_id);
            if let Some(ref name) = item.name {
                print!(" ({})", name);
            }
            if let Some(price) = item.price {
                print!(" - {} gold", price);
            }
            if let Some(stock) = item.max_stock {
                print!(" - max stock: {}", stock);
            }
            if let Some(restock) = item.restock_time {
                print!(" - restock: {}s", restock);
            }
            println!();
        }
        if shop.item_count() > 10 {
            println!(" ... and {} more items", shop.item_count() - 10);
        }
    }
    println!();

    // Show shops that sell a specific item
    println!("=== Shops Selling Item '167' (Fishing Rod) ===");
    let fishing_rod_shops = shop_db.get_shops_selling_item("167");
    for shop in &fishing_rod_shops {
        println!(" [ID {}] {}", shop.shop_id, shop.name);
        if let Some(item) = shop.get_item_by_id("167") {
            if let Some(ref name) = item.name {
                print!("   - {}", name);
            }
            if let Some(price) = item.price {
                print!(" (custom price: {} gold)", price);
            }
            println!();
        }
    }
    println!();

    // Show shop with most items
    if let Some(largest_shop) = shop_db.all_shops().iter().max_by_key(|s| s.item_count()) {
        println!("=== Largest Shop ===");
        println!("Name: {}", largest_shop.name);
        println!("Item count: {}", largest_shop.item_count());
        println!();
    }

    // Show items with unlimited stock in a shop
    if let Some(shop) = shop_db.get_by_id(3) {
        println!("=== Unlimited Stock Items in {} ===", shop.name);
        let unlimited = shop.get_unlimited_stock_items();
        for item in unlimited.iter().take(5) {
            print!(" Item ID: {}", item.item_id);
            if let Some(ref name) = item.name {
                print!(" ({})", name);
            }
            println!();
        }
        if unlimited.len() > 5 {
            println!(" ... and {} more", unlimited.len() - 5);
        }
    }
    println!();

    // Show items with limited stock
    if let Some(shop) = shop_db.get_by_id(8) {
        println!("=== Limited Stock Items in Shop ID 8 ===");
        let limited = shop.get_limited_stock_items();
        for item in &limited {
            print!(" Item ID: {}", item.item_id);
            if let Some(ref name) = item.name {
                print!(" ({})", name);
            }
            if let Some(stock) = item.max_stock {
                print!(" - max stock: {}", stock);
            }
            if let Some(minutes) = item.get_restock_minutes() {
                print!(" - restocks every {:.1} min", minutes);
            }
            println!();
        }
    }

    Ok(())
}
106 cursebreaker-parser/examples/traits_example.rs Normal file
@@ -0,0 +1,106 @@
use cursebreaker_parser::TraitDatabase;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load all traits from XML
    let trait_db = TraitDatabase::load_from_xml(
        "/home/connor/repos/CBAssets/Data/XMLs/Traits/Traits.xml",
    )?;

    println!("=== Trait Database Statistics ===");
    println!("Total traits: {}", trait_db.len());
    println!("Trainer traits: {}", trait_db.get_trainer_traits().len());
    println!("Ability traits: {}", trait_db.get_ability_traits().len());
    println!("Novice traits: {}", trait_db.get_novice_traits().len());
    println!("Experienced traits: {}", trait_db.get_experienced_traits().len());
    println!("Master traits: {}", trait_db.get_master_traits().len());
    println!();

    // Show all skills
    println!("=== All Skills ===");
    let mut skills = trait_db.get_all_skills();
    skills.sort();
    for skill in &skills {
        let count = trait_db.get_by_skill(skill).len();
        println!(" {} ({} traits)", skill, count);
    }
    println!();

    // Show traits for a specific skill
    println!("=== Woodcutting Traits (sorted by level) ===");
    let woodcutting = trait_db.get_sorted_by_level("woodcutting");
    for trait_obj in woodcutting.iter().take(10) {
        if let Some(level) = trait_obj.get_required_level() {
            let tier = if trait_obj.is_novice() {
                " (Novice)"
            } else if trait_obj.is_experienced() {
                " (Experienced)"
            } else if trait_obj.is_master() {
                " (Master)"
            } else {
                ""
            };
            println!(" [Lvl {}] {}{}", level, trait_obj.name, tier);
        }
    }
    println!("... and {} more", woodcutting.len().saturating_sub(10));
    println!();

    // Show master tier traits
    println!("=== Master Tier Traits ===");
    let masters = trait_db.get_master_traits();
    for trait_obj in &masters {
        if let (Some(skill), Some(level)) = (trait_obj.get_required_skill(), trait_obj.get_required_level()) {
            println!(" {} - {} (Level {})", trait_obj.name, skill, level);
        }
    }
    println!();

    // Show ability traits
    println!("=== Traits that Teach Abilities (first 10) ===");
    let abilities = trait_db.get_ability_traits();
    for trait_obj in abilities.iter().take(10) {
        if let Some(ability_id) = trait_obj.learnability {
            println!(
                " {} - teaches ability {}",
                trait_obj.name, ability_id
            );
            if let (Some(skill), Some(level)) = (trait_obj.get_required_skill(), trait_obj.get_required_level()) {
                println!("   Requires: {} level {}", skill, level);
            }
        }
    }
    println!("... and {} more", abilities.len().saturating_sub(10));
    println!();

    // Show traits by level range
    println!("=== Combat Traits (Levels 15-25) ===");
    let combat_traits = trait_db.get_by_skill_and_level("swordsmanship", 15, 25);
    for trait_obj in &combat_traits {
        if let Some(level) = trait_obj.get_required_level() {
            println!(" [Lvl {}] {}", level, trait_obj.name);
        }
    }
    println!();

    // Show details of a specific trait
    if let Some(trait_obj) = trait_db.get_by_id(272) {
        println!("=== Trait Details (ID 272) ===");
        println!("Name: {}", trait_obj.name);
        println!("Description (plain): {}", trait_obj.get_plain_description());
        if let Some(ref trainer) = trait_obj.trainer {
            println!("Skill: {}", trainer.skill);
            println!("Level: {}", trainer.level);
            if let Some(tier) = trainer.tier_icon {
                println!("Tier: {}", tier);
            }
        }
        if let Some(ability_id) = trait_obj.learnability {
            println!("Teaches ability: {}", ability_id);
        }
        if let Some(ref comment) = trait_obj.comment {
            println!("Comment: {}", comment);
        }
    }

    Ok(())
}
272 cursebreaker-parser/src/databases/fast_travel_database.rs Normal file
@@ -0,0 +1,272 @@
use crate::types::{FastTravelLocation, FastTravelType};
use crate::xml_parser::{
    parse_fast_travel_canoe_xml, parse_fast_travel_locations_xml, parse_fast_travel_portals_xml,
    XmlParseError,
};
use std::collections::HashMap;
use std::path::Path;

/// A database for managing Fast Travel Locations loaded from XML files
#[derive(Debug, Clone)]
pub struct FastTravelDatabase {
    locations: Vec<FastTravelLocation>,
    // Map ID -> location index
    locations_by_id: HashMap<i32, usize>,
    // Map name -> list of location indices
    locations_by_name: HashMap<String, Vec<usize>>,
    // Map type -> list of location indices
    locations_by_type: HashMap<FastTravelType, Vec<usize>>,
}

impl FastTravelDatabase {
    /// Create a new empty FastTravelDatabase
    pub fn new() -> Self {
        Self {
            locations: Vec::new(),
            locations_by_id: HashMap::new(),
            locations_by_name: HashMap::new(),
            locations_by_type: HashMap::new(),
        }
    }

    /// Load all fast travel types from their respective XML files in a directory.
    /// Expects the directory structure:
    /// - dir/FastTravelLocations/FastTravelLocations.xml
    /// - dir/FastTravelCanoe/FastTravelCanoe.xml
    /// - dir/FastTravelPortals/FastTravelPortals.xml
    pub fn load_from_directory<P: AsRef<Path>>(dir: P) -> Result<Self, XmlParseError> {
        let dir = dir.as_ref();
        let mut db = Self::new();

        // Load regular locations
        let locations_path = dir.join("FastTravelLocations/FastTravelLocations.xml");
        if locations_path.exists() {
            let locations = parse_fast_travel_locations_xml(&locations_path)?;
            db.add_locations(locations);
        }

        // Load canoe locations
        let canoe_path = dir.join("FastTravelCanoe/FastTravelCanoe.xml");
        if canoe_path.exists() {
            let canoe_locations = parse_fast_travel_canoe_xml(&canoe_path)?;
            db.add_locations(canoe_locations);
        }

        // Load portal locations
        let portals_path = dir.join("FastTravelPortals/FastTravelPortals.xml");
        if portals_path.exists() {
            let portals = parse_fast_travel_portals_xml(&portals_path)?;
            db.add_locations(portals);
        }

        Ok(db)
    }

    /// Load only regular fast travel locations from XML
    pub fn load_locations_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let locations = parse_fast_travel_locations_xml(path)?;
        let mut db = Self::new();
        db.add_locations(locations);
        Ok(db)
    }

    /// Load only canoe fast travel locations from XML
    pub fn load_canoe_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let locations = parse_fast_travel_canoe_xml(path)?;
        let mut db = Self::new();
        db.add_locations(locations);
        Ok(db)
    }

    /// Load only portal fast travel locations from XML
    pub fn load_portals_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let locations = parse_fast_travel_portals_xml(path)?;
        let mut db = Self::new();
        db.add_locations(locations);
        Ok(db)
    }

    /// Add fast travel locations to the database
    pub fn add_locations(&mut self, locations: Vec<FastTravelLocation>) {
        for location in locations {
            let index = self.locations.len();

            // Index by ID
            self.locations_by_id.insert(location.id, index);

            // Index by name
            self.locations_by_name
                .entry(location.name.clone())
                .or_insert_with(Vec::new)
                .push(index);

            // Index by type
            self.locations_by_type
                .entry(location.travel_type)
                .or_insert_with(Vec::new)
                .push(index);

            self.locations.push(location);
        }
    }

    /// Get a fast travel location by ID
    pub fn get_by_id(&self, id: i32) -> Option<&FastTravelLocation> {
        self.locations_by_id
            .get(&id)
            .and_then(|&index| self.locations.get(index))
    }

    /// Get fast travel locations by name (returns all locations with matching name)
    pub fn get_by_name(&self, name: &str) -> Vec<&FastTravelLocation> {
        self.locations_by_name
            .get(name)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.locations.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all locations
    pub fn all_locations(&self) -> &[FastTravelLocation] {
        &self.locations
    }

    /// Get all locations of a specific type
    pub fn get_by_type(&self, travel_type: FastTravelType) -> Vec<&FastTravelLocation> {
        self.locations_by_type
            .get(&travel_type)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.locations.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all regular fast travel locations
    pub fn get_locations(&self) -> Vec<&FastTravelLocation> {
        self.get_by_type(FastTravelType::Location)
    }

    /// Get all canoe fast travel locations
    pub fn get_canoe_locations(&self) -> Vec<&FastTravelLocation> {
        self.get_by_type(FastTravelType::Canoe)
    }

    /// Get all portal fast travel locations
    pub fn get_portals(&self) -> Vec<&FastTravelLocation> {
        self.get_by_type(FastTravelType::Portal)
    }

    /// Get all unlocked locations (regular locations only)
    pub fn get_unlocked_locations(&self) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.unlocked)
            .collect()
    }

    /// Get all locations with requirements
    pub fn get_locations_with_requirements(&self) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.has_requirements())
            .collect()
    }

    /// Get all locations that have connections to other locations
    pub fn get_connected_locations(&self) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.has_connections())
            .collect()
    }

    /// Get locations that are connected to a specific location ID
    pub fn get_locations_connected_to(&self, id: i32) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.get_connections().contains(&id))
            .collect()
    }

    /// Get locations that require a specific quest
    pub fn get_locations_requiring_quest(&self, quest_id: &str) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.requires_quest(quest_id))
            .collect()
    }

    /// Get locations that require a specific trait
    pub fn get_locations_requiring_trait(&self, trait_id: i32) -> Vec<&FastTravelLocation> {
        self.locations
            .iter()
            .filter(|loc| loc.requires_trait(trait_id))
            .collect()
    }

    /// Get all unique location names
    pub fn get_all_names(&self) -> Vec<String> {
        self.locations_by_name.keys().cloned().collect()
    }

    /// Get count by type
    pub fn count_by_type(&self, travel_type: FastTravelType) -> usize {
        self.locations_by_type
            .get(&travel_type)
            .map(|v| v.len())
            .unwrap_or(0)
    }

    /// Get number of locations in database
    pub fn len(&self) -> usize {
        self.locations.len()
    }

    /// Check if database is empty
    pub fn is_empty(&self) -> bool {
        self.locations.is_empty()
    }

    /// Prepare fast travel locations for SQL insertion.
    /// Returns a vector of tuples (id, name, type, json_data)
    pub fn prepare_for_sql(&self) -> Vec<(i32, String, String, String)> {
        self.locations
            .iter()
            .map(|location| {
                let json =
                    serde_json::to_string(location).unwrap_or_else(|_| "{}".to_string());
                (
                    location.id,
                    location.name.clone(),
                    location.travel_type.to_string(),
                    json,
                )
            })
            .collect()
    }
}

impl Default for FastTravelDatabase {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_fast_travel_database_basic() {
        let db = FastTravelDatabase::new();
        assert!(db.is_empty());
        assert_eq!(db.len(), 0);
    }
}
208 cursebreaker-parser/src/databases/map_database.rs Normal file
@@ -0,0 +1,208 @@
use crate::types::Map;
use crate::xml_parser::{parse_maps_xml, XmlParseError};
use std::collections::HashMap;
use std::path::Path;

/// A database for managing Maps loaded from XML files
#[derive(Debug, Clone)]
pub struct MapDatabase {
    maps: Vec<Map>,
    // Map scene_id -> map index
    maps_by_scene_id: HashMap<String, usize>,
    // Map name -> list of map indices (multiple maps can have same name)
    maps_by_name: HashMap<String, Vec<usize>>,
    // Map coordinates (x,y) -> map index
    maps_by_coords: HashMap<(i32, i32), usize>,
}

impl MapDatabase {
    /// Create a new empty MapDatabase
    pub fn new() -> Self {
        Self {
            maps: Vec::new(),
            maps_by_scene_id: HashMap::new(),
            maps_by_name: HashMap::new(),
            maps_by_coords: HashMap::new(),
        }
    }

    /// Load maps from an XML file
    pub fn load_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let maps = parse_maps_xml(path)?;
        let mut db = Self::new();
        db.add_maps(maps);
        Ok(db)
    }

    /// Add maps to the database
    pub fn add_maps(&mut self, maps: Vec<Map>) {
        for map in maps {
            let index = self.maps.len();

            // Index by scene ID
            self.maps_by_scene_id.insert(map.scene_id.clone(), index);

            // Index by name (if it has a name)
            if !map.name.is_empty() {
                self.maps_by_name
                    .entry(map.name.clone())
                    .or_insert_with(Vec::new)
                    .push(index);
            }

            // Index by coordinates
            if let Some(coords) = map.get_coordinates() {
                self.maps_by_coords.insert(coords, index);
            }

            self.maps.push(map);
        }
    }

    /// Get a map by scene ID (e.g., "3,10")
    pub fn get_by_scene_id(&self, scene_id: &str) -> Option<&Map> {
        self.maps_by_scene_id
            .get(scene_id)
            .and_then(|&index| self.maps.get(index))
    }

    /// Get a map by coordinates
    pub fn get_by_coords(&self, x: i32, y: i32) -> Option<&Map> {
        self.maps_by_coords
            .get(&(x, y))
            .and_then(|&index| self.maps.get(index))
    }

    /// Get maps by name (returns all maps with matching name)
    pub fn get_by_name(&self, name: &str) -> Vec<&Map> {
        self.maps_by_name
            .get(name)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.maps.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all maps
    pub fn all_maps(&self) -> &[Map] {
        &self.maps
    }

    /// Get all named maps (maps with non-empty names)
    pub fn get_named_maps(&self) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.is_named()).collect()
    }

    /// Get all indoor maps
    pub fn get_indoor_maps(&self) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.is_indoor()).collect()
    }

    /// Get all maps that are isolated (don't load nearby scenes)
    pub fn get_isolated_maps(&self) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.is_isolated()).collect()
    }

    /// Get all maps with border restrictions
    pub fn get_bordered_maps(&self) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.has_borders()).collect()
    }

    /// Get maps by music track ID
    pub fn get_by_music(&self, music_id: i32) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.music == music_id).collect()
    }

    /// Get maps by ambience ID
    pub fn get_by_ambience(&self, ambience_id: i32) -> Vec<&Map> {
        self.maps
            .iter()
            .filter(|m| m.ambience == ambience_id)
            .collect()
    }

    /// Get all maps that have a respawn location set
    pub fn get_maps_with_respawn(&self) -> Vec<&Map> {
        self.maps
            .iter()
            .filter(|m| m.respawn_map.is_some())
            .collect()
    }

    /// Get all maps that are connected to other maps
    pub fn get_connected_maps(&self) -> Vec<&Map> {
        self.maps
            .iter()
            .filter(|m| m.connected_maps.is_some())
            .collect()
    }

    /// Get all maps hidden from world map
    pub fn get_hidden_from_worldmap(&self) -> Vec<&Map> {
        self.maps.iter().filter(|m| m.no_world_map).collect()
    }

    /// Get all unique map names
    pub fn get_all_map_names(&self) -> Vec<String> {
        self.maps_by_name.keys().cloned().collect()
    }

    /// Get the bounds of the map grid (min/max x and y coordinates)
    pub fn get_map_bounds(&self) -> Option<((i32, i32), (i32, i32))> {
        let coords: Vec<(i32, i32)> = self.maps_by_coords.keys().copied().collect();

        if coords.is_empty() {
            return None;
        }

        let min_x = coords.iter().map(|(x, _)| *x).min()?;
        let max_x = coords.iter().map(|(x, _)| *x).max()?;
        let min_y = coords.iter().map(|(_, y)| *y).min()?;
        let max_y = coords.iter().map(|(_, y)| *y).max()?;

        Some(((min_x, min_y), (max_x, max_y)))
    }

    /// Get number of maps in database
    pub fn len(&self) -> usize {
        self.maps.len()
    }

    /// Check if database is empty
    pub fn is_empty(&self) -> bool {
        self.maps.is_empty()
    }

    /// Prepare maps for SQL insertion.
    /// Returns a vector of tuples (scene_id, name, json_data)
    pub fn prepare_for_sql(&self) -> Vec<(String, String, String)> {
        self.maps
            .iter()
            .map(|map| {
                let json = serde_json::to_string(map).unwrap_or_else(|_| "{}".to_string());
                (map.scene_id.clone(), map.name.clone(), json)
            })
            .collect()
    }
}

impl Default for MapDatabase {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_map_database_basic() {
        let db = MapDatabase::new();
        assert!(db.is_empty());
        assert_eq!(db.len(), 0);
    }
}
@@ -3,9 +3,19 @@ mod npc_database;
mod quest_database;
mod harvestable_database;
mod loot_database;
mod map_database;
mod fast_travel_database;
mod player_house_database;
mod trait_database;
mod shop_database;

pub use item_database::ItemDatabase;
pub use npc_database::NpcDatabase;
pub use quest_database::QuestDatabase;
pub use harvestable_database::HarvestableDatabase;
pub use loot_database::LootDatabase;
pub use map_database::MapDatabase;
pub use fast_travel_database::FastTravelDatabase;
pub use player_house_database::PlayerHouseDatabase;
pub use trait_database::TraitDatabase;
pub use shop_database::ShopDatabase;
182 cursebreaker-parser/src/databases/player_house_database.rs Normal file
@@ -0,0 +1,182 @@
use crate::types::PlayerHouse;
use crate::xml_parser::{parse_player_houses_xml, XmlParseError};
use std::collections::HashMap;
use std::path::Path;

/// A database for managing Player Houses loaded from XML files
#[derive(Debug, Clone)]
pub struct PlayerHouseDatabase {
    houses: Vec<PlayerHouse>,
    // Map ID -> house index
    houses_by_id: HashMap<i32, usize>,
    // Map name -> list of house indices (multiple houses can have same name)
    houses_by_name: HashMap<String, Vec<usize>>,
}

impl PlayerHouseDatabase {
    /// Create a new empty PlayerHouseDatabase
    pub fn new() -> Self {
        Self {
            houses: Vec::new(),
            houses_by_id: HashMap::new(),
            houses_by_name: HashMap::new(),
        }
    }

    /// Load player houses from an XML file
    pub fn load_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let houses = parse_player_houses_xml(path)?;
        let mut db = Self::new();
        db.add_houses(houses);
        Ok(db)
    }

    /// Add player houses to the database
    pub fn add_houses(&mut self, houses: Vec<PlayerHouse>) {
        for house in houses {
            let index = self.houses.len();

            // Index by ID
            self.houses_by_id.insert(house.id, index);

            // Index by name
            self.houses_by_name
                .entry(house.name.clone())
                .or_insert_with(Vec::new)
                .push(index);

            self.houses.push(house);
        }
    }

    /// Get a player house by ID
    pub fn get_by_id(&self, id: i32) -> Option<&PlayerHouse> {
        self.houses_by_id
            .get(&id)
            .and_then(|&index| self.houses.get(index))
    }

    /// Get player houses by name (returns all houses with matching name)
    pub fn get_by_name(&self, name: &str) -> Vec<&PlayerHouse> {
        self.houses_by_name
            .get(name)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.houses.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all houses
    pub fn all_houses(&self) -> &[PlayerHouse] {
        &self.houses
    }

    /// Get all visible houses (not hidden)
    pub fn get_visible_houses(&self) -> Vec<&PlayerHouse> {
        self.houses.iter().filter(|h| h.is_visible()).collect()
    }

    /// Get all hidden houses
    pub fn get_hidden_houses(&self) -> Vec<&PlayerHouse> {
        self.houses.iter().filter(|h| h.hidden).collect()
    }

    /// Get all free houses (price is 0)
    pub fn get_free_houses(&self) -> Vec<&PlayerHouse> {
        self.houses.iter().filter(|h| h.is_free()).collect()
    }

    /// Get all affordable houses (price < 5000)
    pub fn get_affordable_houses(&self) -> Vec<&PlayerHouse> {
        self.houses.iter().filter(|h| h.is_affordable()).collect()
    }

    /// Get all expensive houses (price >= 10000)
    pub fn get_expensive_houses(&self) -> Vec<&PlayerHouse> {
        self.houses.iter().filter(|h| h.is_expensive()).collect()
    }

    /// Get houses by price tier (0: free, 1: cheap, 2: moderate, 3: expensive)
    pub fn get_by_price_tier(&self, tier: u8) -> Vec<&PlayerHouse> {
        self.houses
            .iter()
            .filter(|h| h.get_price_tier() == tier)
            .collect()
    }

    /// Get houses within a price range (inclusive)
    pub fn get_by_price_range(&self, min_price: i32, max_price: i32) -> Vec<&PlayerHouse> {
        self.houses
            .iter()
            .filter(|h| h.price >= min_price && h.price <= max_price)
            .collect()
    }

    /// Get all houses sorted by price (ascending)
    pub fn get_sorted_by_price(&self) -> Vec<&PlayerHouse> {
        let mut houses: Vec<&PlayerHouse> = self.houses.iter().collect();
        houses.sort_by_key(|h| h.price);
        houses
    }

    /// Get the cheapest house (excluding free houses)
    pub fn get_cheapest(&self) -> Option<&PlayerHouse> {
        self.houses
            .iter()
            .filter(|h| h.price > 0)
            .min_by_key(|h| h.price)
    }

    /// Get the most expensive house
    pub fn get_most_expensive(&self) -> Option<&PlayerHouse> {
        self.houses.iter().max_by_key(|h| h.price)
    }

    /// Get all unique house names
    pub fn get_all_names(&self) -> Vec<String> {
        self.houses_by_name.keys().cloned().collect()
    }

    /// Get number of houses in database
    pub fn len(&self) -> usize {
        self.houses.len()
    }

    /// Check if database is empty
    pub fn is_empty(&self) -> bool {
        self.houses.is_empty()
    }

    /// Prepare player houses for SQL insertion.
    /// Returns a vector of tuples (id, name, price, json_data)
    pub fn prepare_for_sql(&self) -> Vec<(i32, String, i32, String)> {
        self.houses
            .iter()
            .map(|house| {
                let json = serde_json::to_string(house).unwrap_or_else(|_| "{}".to_string());
                (house.id, house.name.clone(), house.price, json)
            })
            .collect()
    }
}

impl Default for PlayerHouseDatabase {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_player_house_database_basic() {
        let db = PlayerHouseDatabase::new();
        assert!(db.is_empty());
        assert_eq!(db.len(), 0);
    }
}
181 cursebreaker-parser/src/databases/shop_database.rs Normal file
@@ -0,0 +1,181 @@
use crate::types::Shop;
use crate::xml_parser::{parse_shops_xml, XmlParseError};
use std::collections::HashMap;
use std::path::Path;

/// A database for managing Shops loaded from XML files
#[derive(Debug, Clone)]
pub struct ShopDatabase {
    shops: Vec<Shop>,
    // Map shop_id -> shop index
    shops_by_id: HashMap<i32, usize>,
    // Map name -> list of shop indices
    shops_by_name: HashMap<String, Vec<usize>>,
}

impl ShopDatabase {
    /// Create a new empty ShopDatabase
    pub fn new() -> Self {
        Self {
            shops: Vec::new(),
            shops_by_id: HashMap::new(),
            shops_by_name: HashMap::new(),
        }
    }

    /// Load shops from an XML file
    pub fn load_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let shops = parse_shops_xml(path)?;
        let mut db = Self::new();
        db.add_shops(shops);
        Ok(db)
    }

    /// Add shops to the database
    pub fn add_shops(&mut self, shops: Vec<Shop>) {
        for shop in shops {
            let index = self.shops.len();

            // Index by ID
            self.shops_by_id.insert(shop.shop_id, index);

            // Index by name
            self.shops_by_name
                .entry(shop.name.clone())
                .or_insert_with(Vec::new)
                .push(index);

            self.shops.push(shop);
        }
    }

    /// Get a shop by ID
    pub fn get_by_id(&self, shop_id: i32) -> Option<&Shop> {
        self.shops_by_id
            .get(&shop_id)
            .and_then(|&index| self.shops.get(index))
    }

    /// Get shops by name (returns all shops with matching name)
    pub fn get_by_name(&self, name: &str) -> Vec<&Shop> {
        self.shops_by_name
            .get(name)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.shops.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all shops
    pub fn all_shops(&self) -> &[Shop] {
        &self.shops
    }

    /// Get all general stores
    pub fn get_general_stores(&self) -> Vec<&Shop> {
        self.shops
            .iter()
            .filter(|s| s.is_general_store)
            .collect()
    }

    /// Get all specialized shops (non-general stores)
    pub fn get_specialized_shops(&self) -> Vec<&Shop> {
        self.shops
            .iter()
            .filter(|s| !s.is_general_store)
            .collect()
    }

    /// Get all non-empty shops
    pub fn get_non_empty_shops(&self) -> Vec<&Shop> {
        self.shops.iter().filter(|s| !s.is_empty()).collect()
    }

    /// Get all shops that sell a specific item ID
    pub fn get_shops_selling_item(&self, item_id: &str) -> Vec<&Shop> {
        self.shops
            .iter()
            .filter(|shop| shop.get_item_by_id(item_id).is_some())
            .collect()
    }

    /// Get all shops with comments
    pub fn get_shops_with_comments(&self) -> Vec<&Shop> {
        self.shops
            .iter()
            .filter(|s| s.comment.is_some())
            .collect()
    }

    /// Get all unique shop names
    pub fn get_all_names(&self) -> Vec<String> {
        self.shops_by_name.keys().cloned().collect()
    }

    /// Get total number of items across all shops
    pub fn total_item_count(&self) -> usize {
        self.shops.iter().map(|s| s.item_count()).sum()
    }

    /// Get all unique item IDs sold across all shops
    pub fn get_all_item_ids(&self) -> Vec<String> {
        let mut item_ids: Vec<String> = self
            .shops
            .iter()
            .flat_map(|shop| shop.get_all_item_ids())
            .collect();
        item_ids.sort();
        item_ids.dedup();
        item_ids
    }

    /// Get number of shops in database
    pub fn len(&self) -> usize {
        self.shops.len()
    }

    /// Check if database is empty
    pub fn is_empty(&self) -> bool {
        self.shops.is_empty()
    }

    /// Prepare shops for SQL insertion
    /// Returns a vector of tuples (shop_id, name, is_general_store, item_count, json_data)
    pub fn prepare_for_sql(&self) -> Vec<(i32, String, bool, usize, String)> {
        self.shops
            .iter()
            .map(|shop| {
                let json = serde_json::to_string(shop).unwrap_or_else(|_| "{}".to_string());
                (
                    shop.shop_id,
                    shop.name.clone(),
                    shop.is_general_store,
                    shop.item_count(),
                    json,
                )
            })
            .collect()
    }
}

impl Default for ShopDatabase {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_shop_database_basic() {
        let db = ShopDatabase::new();
        assert!(db.is_empty());
        assert_eq!(db.len(), 0);
    }
}
207
cursebreaker-parser/src/databases/trait_database.rs
Normal file
@@ -0,0 +1,207 @@
use crate::types::Trait;
use crate::xml_parser::{parse_traits_xml, XmlParseError};
use std::collections::HashMap;
use std::path::Path;

/// A database for managing Traits loaded from XML files
#[derive(Debug, Clone)]
pub struct TraitDatabase {
    traits: Vec<Trait>,
    // Map ID -> trait index
    traits_by_id: HashMap<i32, usize>,
    // Map name -> list of trait indices
    traits_by_name: HashMap<String, Vec<usize>>,
    // Map skill -> list of trait indices
    traits_by_skill: HashMap<String, Vec<usize>>,
}

impl TraitDatabase {
    /// Create a new empty TraitDatabase
    pub fn new() -> Self {
        Self {
            traits: Vec::new(),
            traits_by_id: HashMap::new(),
            traits_by_name: HashMap::new(),
            traits_by_skill: HashMap::new(),
        }
    }

    /// Load traits from an XML file
    pub fn load_from_xml<P: AsRef<Path>>(path: P) -> Result<Self, XmlParseError> {
        let traits = parse_traits_xml(path)?;
        let mut db = Self::new();
        db.add_traits(traits);
        Ok(db)
    }

    /// Add traits to the database
    pub fn add_traits(&mut self, traits: Vec<Trait>) {
        for trait_obj in traits {
            let index = self.traits.len();

            // Index by ID
            self.traits_by_id.insert(trait_obj.id, index);

            // Index by name (if it has a name)
            if !trait_obj.name.is_empty() {
                self.traits_by_name
                    .entry(trait_obj.name.clone())
                    .or_insert_with(Vec::new)
                    .push(index);
            }

            // Index by skill (if it has a trainer requirement)
            if let Some(ref trainer) = trait_obj.trainer {
                self.traits_by_skill
                    .entry(trainer.skill.to_lowercase())
                    .or_insert_with(Vec::new)
                    .push(index);
            }

            self.traits.push(trait_obj);
        }
    }

    /// Get a trait by ID
    pub fn get_by_id(&self, id: i32) -> Option<&Trait> {
        self.traits_by_id
            .get(&id)
            .and_then(|&index| self.traits.get(index))
    }

    /// Get traits by name (returns all traits with matching name)
    pub fn get_by_name(&self, name: &str) -> Vec<&Trait> {
        self.traits_by_name
            .get(name)
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.traits.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all traits
    pub fn all_traits(&self) -> &[Trait] {
        &self.traits
    }

    /// Get all traits for a specific skill
    pub fn get_by_skill(&self, skill: &str) -> Vec<&Trait> {
        self.traits_by_skill
            .get(&skill.to_lowercase())
            .map(|indices| {
                indices
                    .iter()
                    .filter_map(|&index| self.traits.get(index))
                    .collect()
            })
            .unwrap_or_default()
    }

    /// Get all trainer traits (traits that require a trainer)
    pub fn get_trainer_traits(&self) -> Vec<&Trait> {
        self.traits.iter().filter(|t| t.is_trainer_trait()).collect()
    }

    /// Get all traits that teach abilities
    pub fn get_ability_traits(&self) -> Vec<&Trait> {
        self.traits
            .iter()
            .filter(|t| t.teaches_ability())
            .collect()
    }

    /// Get all novice tier traits
    pub fn get_novice_traits(&self) -> Vec<&Trait> {
        self.traits.iter().filter(|t| t.is_novice()).collect()
    }

    /// Get all experienced tier traits
    pub fn get_experienced_traits(&self) -> Vec<&Trait> {
        self.traits.iter().filter(|t| t.is_experienced()).collect()
    }

    /// Get all master tier traits
    pub fn get_master_traits(&self) -> Vec<&Trait> {
        self.traits.iter().filter(|t| t.is_master()).collect()
    }

    /// Get traits by level requirement for a specific skill
    pub fn get_by_skill_and_level(&self, skill: &str, min_level: i32, max_level: i32) -> Vec<&Trait> {
        self.get_by_skill(skill)
            .into_iter()
            .filter(|t| {
                if let Some(level) = t.get_required_level() {
                    level >= min_level && level <= max_level
                } else {
                    false
                }
            })
            .collect()
    }

    /// Get all unique skill names
    pub fn get_all_skills(&self) -> Vec<String> {
        self.traits_by_skill.keys().cloned().collect()
    }

    /// Get traits sorted by level for a specific skill
    pub fn get_sorted_by_level(&self, skill: &str) -> Vec<&Trait> {
        let mut traits = self.get_by_skill(skill);
        traits.sort_by_key(|t| t.get_required_level().unwrap_or(0));
        traits
    }

    /// Get all traits with comments
    pub fn get_with_comments(&self) -> Vec<&Trait> {
        self.traits
            .iter()
            .filter(|t| t.comment.is_some())
            .collect()
    }

    /// Get number of traits in database
    pub fn len(&self) -> usize {
        self.traits.len()
    }

    /// Check if database is empty
    pub fn is_empty(&self) -> bool {
        self.traits.is_empty()
    }

    /// Prepare traits for SQL insertion
    /// Returns a vector of tuples (id, name, skill, level, json_data)
    pub fn prepare_for_sql(&self) -> Vec<(i32, String, Option<String>, Option<i32>, String)> {
        self.traits
            .iter()
            .map(|trait_obj| {
                let json =
                    serde_json::to_string(trait_obj).unwrap_or_else(|_| "{}".to_string());
                let skill = trait_obj.get_required_skill().map(|s| s.to_string());
                let level = trait_obj.get_required_level();
                (trait_obj.id, trait_obj.name.clone(), skill, level, json)
            })
            .collect()
    }
}

impl Default for TraitDatabase {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_trait_database_basic() {
        let db = TraitDatabase::new();
        assert!(db.is_empty());
        assert_eq!(db.len(), 0);
    }
}
@@ -60,6 +60,11 @@ pub use databases::{
    QuestDatabase,
    HarvestableDatabase,
    LootDatabase,
    MapDatabase,
    FastTravelDatabase,
    PlayerHouseDatabase,
    TraitDatabase,
    ShopDatabase,
};
pub use types::{
    // Items
@@ -96,5 +101,13 @@ pub use types::{
    HarvestableDrop,
    LootTable,
    LootDrop,
    Map,
    FastTravelLocation,
    FastTravelType,
    PlayerHouse,
    Trait,
    TraitTrainer,
    Shop,
    ShopItem,
};
pub use xml_parser::XmlParseError;
152
cursebreaker-parser/src/types/cursebreaker/fast_travel.rs
Normal file
@@ -0,0 +1,152 @@
use serde::{Deserialize, Serialize};

/// Type of fast travel location
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, Serialize, Deserialize)]
pub enum FastTravelType {
    /// Regular fast travel location (horse/cart)
    Location,
    /// Canoe fast travel location (water travel)
    Canoe,
    /// Portal fast travel location (magical portal)
    Portal,
}

impl std::fmt::Display for FastTravelType {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        match self {
            FastTravelType::Location => write!(f, "Location"),
            FastTravelType::Canoe => write!(f, "Canoe"),
            FastTravelType::Portal => write!(f, "Portal"),
        }
    }
}

/// Represents a fast travel location (canoe, portal, or regular location)
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FastTravelLocation {
    /// Unique ID
    pub id: i32,

    /// Display name
    pub name: String,

    /// 3D position in world space (x,y,z)
    pub position: String,

    /// Type of fast travel
    pub travel_type: FastTravelType,

    // ========== Optional Fields ==========
    /// Whether this location is unlocked by default (FastTravelLocations only)
    pub unlocked: bool,

    /// Connected location IDs (FastTravelLocations only, comma-separated)
    pub connections: Option<String>,

    /// Requirement checks (FastTravelCanoe only, e.g., "Quest=70-2-999,HasTrait=273")
    pub checks: Option<String>,
}

impl FastTravelLocation {
    /// Create a new FastTravelLocation with required fields
    pub fn new(id: i32, name: String, position: String, travel_type: FastTravelType) -> Self {
        Self {
            id,
            name,
            position,
            travel_type,
            unlocked: false,
            connections: None,
            checks: None,
        }
    }

    /// Parse position into (x, y, z) coordinates
    pub fn get_position(&self) -> Option<(f32, f32, f32)> {
        let parts: Vec<&str> = self.position.split(',').collect();
        if parts.len() == 3 {
            if let (Ok(x), Ok(y), Ok(z)) = (
                parts[0].parse::<f32>(),
                parts[1].parse::<f32>(),
                parts[2].parse::<f32>(),
            ) {
                return Some((x, y, z));
            }
        }
        None
    }

    /// Get list of connected location IDs
    pub fn get_connections(&self) -> Vec<i32> {
        if let Some(ref connections) = self.connections {
            connections
                .split(',')
                .filter_map(|s| s.trim().parse::<i32>().ok())
                .collect()
        } else {
            Vec::new()
        }
    }

    /// Check if this location has any requirements
    pub fn has_requirements(&self) -> bool {
        self.checks.is_some()
    }

    /// Check if this location has connections to other locations
    pub fn has_connections(&self) -> bool {
        self.connections.is_some() && !self.get_connections().is_empty()
    }

    /// Parse checks into a list of individual requirements
    /// Returns Vec of (check_type, value) tuples
    /// e.g., "Quest=70-2-999,HasTrait=273" -> [("Quest", "70-2-999"), ("HasTrait", "273")]
    pub fn parse_checks(&self) -> Vec<(String, String)> {
        if let Some(ref checks) = self.checks {
            checks
                .split(',')
                .filter_map(|check| {
                    let parts: Vec<&str> = check.trim().split('=').collect();
                    if parts.len() == 2 {
                        Some((parts[0].to_string(), parts[1].to_string()))
                    } else {
                        None
                    }
                })
                .collect()
        } else {
            Vec::new()
        }
    }

    /// Check if this location requires a specific quest
    pub fn requires_quest(&self, quest_id: &str) -> bool {
        self.parse_checks()
            .iter()
            .any(|(check_type, value)| check_type == "Quest" && value.starts_with(quest_id))
    }

    /// Check if this location requires a specific trait
    pub fn requires_trait(&self, trait_id: i32) -> bool {
        self.parse_checks()
            .iter()
            .any(|(check_type, value)| {
                check_type == "HasTrait" && value.parse::<i32>().ok() == Some(trait_id)
            })
    }

    /// Check if location is a canoe location
    pub fn is_canoe(&self) -> bool {
        self.travel_type == FastTravelType::Canoe
    }

    /// Check if location is a portal
    pub fn is_portal(&self) -> bool {
        self.travel_type == FastTravelType::Portal
    }

    /// Check if location is a regular location
    pub fn is_location(&self) -> bool {
        self.travel_type == FastTravelType::Location
    }
}
215
cursebreaker-parser/src/types/cursebreaker/map.rs
Normal file
@@ -0,0 +1,215 @@
use serde::{Deserialize, Serialize};

/// Represents a single map/scene in the game world
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Map {
    /// Scene ID in "x,y" format (e.g., "0,0", "3,10")
    pub scene_id: String,

    /// Display name of the map (can be empty)
    pub name: String,

    /// Music track ID
    pub music: i32,

    /// Ambient sound ID
    pub ambience: i32,

    // ========== Lighting & Atmosphere ==========
    /// Fog color in "r,g,b" format (default from comment: "63,98,106")
    pub fog_color: Option<String>,

    /// Fogginess/fog density (default: 0.6)
    pub fogginess: Option<f32>,

    /// View distance (default: 90)
    pub view_distance: Option<i32>,

    /// NPC view distance (default: 50)
    pub npc_view_distance: Option<i32>,

    /// Sunlight intensity (default: 1)
    pub sunlight: Option<f32>,

    /// Sun color in "r,g,b" format (default: "255,251,230")
    pub sun_color: Option<String>,

    /// Ambient color in "r,g,b" format (default: "128,128,128")
    pub ambient_color: Option<String>,

    /// Indoor sunlight level (default: 0.2)
    pub indoor_sunlight: Option<f32>,

    /// Fog start distance
    pub fog_start: Option<f32>,

    // ========== Map Properties ==========
    /// Whether this is an indoor map
    pub indoors: bool,

    /// Whether to hide this map from the world map
    pub no_world_map: bool,

    /// Whether to hide the minimap
    pub no_minimap: bool,

    /// Whether teleportation is disabled
    pub tp_disabled: bool,

    /// Whether to prevent loading nearby scenes
    pub dont_load_nearby_scenes: bool,

    /// Remove all borders
    pub no_border: bool,

    /// Block movement at left edge
    pub border_left: bool,

    /// Block movement at right edge
    pub border_right: bool,

    /// Block movement at top edge
    pub border_up: bool,

    /// Block movement at bottom edge
    pub border_down: bool,

    // ========== Connectivity ==========
    /// Scene ID to respawn at (e.g., "3,10")
    pub respawn_map: Option<String>,

    /// Connected maps in "x-y,x-y" format (e.g., "5-13,5-14")
    pub connected_maps: Option<String>,

    // ========== Metadata ==========
    /// Developer comment/note
    pub comment: Option<String>,
}

impl Map {
    /// Create a new Map with required fields
    pub fn new(scene_id: String, music: i32, ambience: i32) -> Self {
        Self {
            scene_id,
            name: String::new(),
            music,
            ambience,
            fog_color: None,
            fogginess: None,
            view_distance: None,
            npc_view_distance: None,
            sunlight: None,
            sun_color: None,
            ambient_color: None,
            indoor_sunlight: None,
            fog_start: None,
            indoors: false,
            no_world_map: false,
            no_minimap: false,
            tp_disabled: false,
            dont_load_nearby_scenes: false,
            no_border: false,
            border_left: false,
            border_right: false,
            border_up: false,
            border_down: false,
            respawn_map: None,
            connected_maps: None,
            comment: None,
        }
    }

    /// Parse scene ID into (x, y) coordinates
    pub fn get_coordinates(&self) -> Option<(i32, i32)> {
        let parts: Vec<&str> = self.scene_id.split(',').collect();
        if parts.len() == 2 {
            if let (Ok(x), Ok(y)) = (parts[0].parse::<i32>(), parts[1].parse::<i32>()) {
                return Some((x, y));
            }
        }
        None
    }

    /// Check if the map is named (has a non-empty name)
    pub fn is_named(&self) -> bool {
        !self.name.is_empty()
    }

    /// Check if the map is an indoor location
    pub fn is_indoor(&self) -> bool {
        self.indoors
    }

    /// Check if the map has any border restrictions
    pub fn has_borders(&self) -> bool {
        !self.no_border
            && (self.border_left || self.border_right || self.border_up || self.border_down)
    }

    /// Get list of connected map scene IDs
    pub fn get_connected_map_ids(&self) -> Vec<String> {
        if let Some(ref connected) = self.connected_maps {
            connected
                .split(',')
                .map(|s| s.trim().replace('-', ","))
                .collect()
        } else {
            Vec::new()
        }
    }

    /// Check if this map is isolated (doesn't load nearby scenes)
    pub fn is_isolated(&self) -> bool {
        self.dont_load_nearby_scenes
    }

    /// Parse fog color into RGB values
    pub fn get_fog_color_rgb(&self) -> Option<(u8, u8, u8)> {
        self.fog_color.as_ref().and_then(|color| {
            let parts: Vec<&str> = color.split(',').collect();
            if parts.len() == 3 {
                if let (Ok(r), Ok(g), Ok(b)) = (
                    parts[0].parse::<u8>(),
                    parts[1].parse::<u8>(),
                    parts[2].parse::<u8>(),
                ) {
                    return Some((r, g, b));
                }
            }
            None
        })
    }

    /// Parse sun color into RGB values
    pub fn get_sun_color_rgb(&self) -> Option<(u8, u8, u8)> {
        self.sun_color.as_ref().and_then(|color| {
            let parts: Vec<&str> = color.split(',').collect();
            if parts.len() == 3 {
                if let (Ok(r), Ok(g), Ok(b)) = (
                    parts[0].parse::<u8>(),
                    parts[1].parse::<u8>(),
                    parts[2].parse::<u8>(),
                ) {
                    return Some((r, g, b));
                }
            }
            None
        })
    }

    /// Parse ambient color into RGB values
    pub fn get_ambient_color_rgb(&self) -> Option<(u8, u8, u8)> {
        self.ambient_color.as_ref().and_then(|color| {
            let parts: Vec<&str> = color.split(',').collect();
            if parts.len() == 3 {
                if let (Ok(r), Ok(g), Ok(b)) = (
                    parts[0].parse::<u8>(),
                    parts[1].parse::<u8>(),
                    parts[2].parse::<u8>(),
                ) {
                    return Some((r, g, b));
                }
            }
            None
        })
    }
}
@@ -3,6 +3,11 @@ mod npc;
mod quest;
mod harvestable;
mod loot;
mod map;
mod fast_travel;
mod player_house;
mod r#trait;
mod shop;

pub use item::{
    // Main types
@@ -30,3 +35,8 @@ pub use npc::{Npc, NpcStat, NpcLevel, RightClick, BarkGroup, Bark, QuestMarker,
pub use quest::{Quest, QuestPhase, QuestReward};
pub use harvestable::{Harvestable, HarvestableDrop};
pub use loot::{LootTable, LootDrop};
pub use map::Map;
pub use fast_travel::{FastTravelLocation, FastTravelType};
pub use player_house::PlayerHouse;
pub use r#trait::{Trait, TraitTrainer};
pub use shop::{Shop, ShopItem};
85
cursebreaker-parser/src/types/cursebreaker/player_house.rs
Normal file
@@ -0,0 +1,85 @@
use serde::{Deserialize, Serialize};

/// Represents a player house that can be purchased
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PlayerHouse {
    /// Unique ID
    pub id: i32,

    /// Display name
    pub name: String,

    /// Description text
    pub description: String,

    /// 3D position in world space (x,y,z)
    pub position: String,

    /// Purchase price in gold
    pub price: i32,

    /// Whether this house is hidden (not shown in normal lists)
    pub hidden: bool,
}

impl PlayerHouse {
    /// Create a new PlayerHouse with required fields
    pub fn new(id: i32, name: String, description: String, position: String, price: i32) -> Self {
        Self {
            id,
            name,
            description,
            position,
            price,
            hidden: false,
        }
    }

    /// Parse position into (x, y, z) coordinates
    pub fn get_position(&self) -> Option<(f32, f32, f32)> {
        let parts: Vec<&str> = self.position.split(',').collect();
        if parts.len() == 3 {
            if let (Ok(x), Ok(y), Ok(z)) = (
                parts[0].parse::<f32>(),
                parts[1].parse::<f32>(),
                parts[2].parse::<f32>(),
            ) {
                return Some((x, y, z));
            }
        }
        None
    }

    /// Check if this house is free (price is 0)
    pub fn is_free(&self) -> bool {
        self.price == 0
    }

    /// Check if this house is visible (not hidden)
    pub fn is_visible(&self) -> bool {
        !self.hidden
    }

    /// Check if this house is expensive (price >= 10000)
    pub fn is_expensive(&self) -> bool {
        self.price >= 10000
    }

    /// Check if this house is affordable (price < 5000)
    pub fn is_affordable(&self) -> bool {
        self.price < 5000
    }

    /// Get price tier (0: free, 1: cheap (<5k), 2: moderate (5k-10k), 3: expensive (10k+))
    pub fn get_price_tier(&self) -> u8 {
        if self.price == 0 {
            0
        } else if self.price < 5000 {
            1
        } else if self.price < 10000 {
            2
        } else {
            3
        }
    }
}
155
cursebreaker-parser/src/types/cursebreaker/shop.rs
Normal file
@@ -0,0 +1,155 @@
use serde::{Deserialize, Serialize};

/// Represents an item sold in a shop
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ShopItem {
    /// Item ID (can be numeric or string reference)
    pub item_id: String,

    /// Optional item name
    pub name: Option<String>,

    /// Custom price (if different from item's default price)
    pub price: Option<i32>,

    /// Maximum stock (items available before restocking)
    pub max_stock: Option<i32>,

    /// Restock time in seconds
    pub restock_time: Option<i32>,

    /// Buy price (price shop pays for the item)
    pub buy_price: Option<i32>,

    /// Developer comment
    pub comment: Option<String>,
}

impl ShopItem {
    /// Create a new ShopItem with required fields
    pub fn new(item_id: String) -> Self {
        Self {
            item_id,
            name: None,
            price: None,
            max_stock: None,
            restock_time: None,
            buy_price: None,
            comment: None,
        }
    }

    /// Try to parse item_id as an integer
    pub fn get_item_id_as_int(&self) -> Option<i32> {
        self.item_id.parse().ok()
    }

    /// Check if this item has unlimited stock
    pub fn has_unlimited_stock(&self) -> bool {
        self.max_stock.is_none() || self.max_stock == Some(0)
    }

    /// Check if this item has custom pricing
    pub fn has_custom_price(&self) -> bool {
        self.price.is_some()
    }

    /// Check if shop buys this item (has buy price)
    pub fn is_buyable_by_shop(&self) -> bool {
        self.buy_price.is_some()
    }

    /// Get restock time in minutes
    pub fn get_restock_minutes(&self) -> Option<f32> {
        self.restock_time.map(|seconds| seconds as f32 / 60.0)
    }
}

/// Represents a shop
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Shop {
    /// Shop ID
    pub shop_id: i32,

    /// Shop name
    pub name: String,

    /// Whether this is a general store
    pub is_general_store: bool,

    /// Developer comment
    pub comment: Option<String>,

    /// Items sold in this shop
    pub items: Vec<ShopItem>,
}

impl Shop {
    /// Create a new Shop with required fields
    pub fn new(shop_id: i32, name: String) -> Self {
        Self {
            shop_id,
            name,
            is_general_store: false,
            comment: None,
            items: Vec::new(),
        }
    }

    /// Add an item to the shop
    pub fn add_item(&mut self, item: ShopItem) {
        self.items.push(item);
    }

    /// Get number of items in shop
    pub fn item_count(&self) -> usize {
        self.items.len()
    }

    /// Check if shop is empty
    pub fn is_empty(&self) -> bool {
        self.items.is_empty()
    }

    /// Get all items with unlimited stock
    pub fn get_unlimited_stock_items(&self) -> Vec<&ShopItem> {
        self.items
            .iter()
            .filter(|item| item.has_unlimited_stock())
            .collect()
    }

    /// Get all items with limited stock
    pub fn get_limited_stock_items(&self) -> Vec<&ShopItem> {
        self.items
            .iter()
            .filter(|item| !item.has_unlimited_stock())
            .collect()
    }

    /// Get all items with custom pricing
    pub fn get_custom_priced_items(&self) -> Vec<&ShopItem> {
        self.items
            .iter()
            .filter(|item| item.has_custom_price())
            .collect()
    }

    /// Get all items the shop buys
    pub fn get_buyable_items(&self) -> Vec<&ShopItem> {
        self.items
            .iter()
            .filter(|item| item.is_buyable_by_shop())
            .collect()
    }

    /// Get item by ID
    pub fn get_item_by_id(&self, item_id: &str) -> Option<&ShopItem> {
        self.items.iter().find(|item| item.item_id == item_id)
    }

    /// Get all item IDs
    pub fn get_all_item_ids(&self) -> Vec<String> {
        self.items.iter().map(|item| item.item_id.clone()).collect()
    }
}
155
cursebreaker-parser/src/types/cursebreaker/trait.rs
Normal file
@@ -0,0 +1,155 @@
use serde::{Deserialize, Serialize};

/// Trainer requirement for learning a trait
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TraitTrainer {
    /// Required skill
    pub skill: String,

    /// Required level in the skill
    pub level: i32,

    /// Tier icon indicator (1, 2, 3 for novice, experienced, master)
    pub tier_icon: Option<i32>,
}

impl TraitTrainer {
    pub fn new(skill: String, level: i32) -> Self {
        Self {
            skill,
            level,
            tier_icon: None,
        }
    }

    /// Check if this is a novice tier trait (tier 1)
    pub fn is_novice(&self) -> bool {
        self.tier_icon == Some(1)
    }

    /// Check if this is an experienced tier trait (tier 2)
    pub fn is_experienced(&self) -> bool {
        self.tier_icon == Some(2)
    }

    /// Check if this is a master tier trait (tier 3)
    pub fn is_master(&self) -> bool {
        self.tier_icon == Some(3)
    }
}

/// Represents a character trait/perk
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Trait {
    /// Unique ID
    pub id: i32,

    /// Display name
    pub name: String,

    /// Description text (may contain HTML color tags)
    pub description: String,

    /// Learnability ID (ability that can be learned)
    pub learnability: Option<i32>,

    /// Developer comment
    pub comment: Option<String>,

    /// Trainer requirement (if this trait is learned from a trainer)
    pub trainer: Option<TraitTrainer>,
}

impl Trait {
    /// Create a new Trait with required fields
    pub fn new(id: i32, name: String, description: String) -> Self {
        Self {
            id,
            name,
            description,
            learnability: None,
            comment: None,
            trainer: None,
        }
    }

    /// Check if this trait teaches an ability (has learnability)
    pub fn teaches_ability(&self) -> bool {
        self.learnability.is_some()
    }

    /// Check if this trait is learned from a trainer
    pub fn is_trainer_trait(&self) -> bool {
        self.trainer.is_some()
    }

    /// Check if this trait requires a specific skill
    pub fn requires_skill(&self, skill: &str) -> bool {
        self.trainer
            .as_ref()
            .map(|t| t.skill.eq_ignore_ascii_case(skill))
            .unwrap_or(false)
    }

    /// Get the required skill level, if any
    pub fn get_required_level(&self) -> Option<i32> {
        self.trainer.as_ref().map(|t| t.level)
    }

    /// Get the required skill name, if any
    pub fn get_required_skill(&self) -> Option<&str> {
        self.trainer.as_ref().map(|t| t.skill.as_str())
    }

    /// Check if this is a novice tier trait
    pub fn is_novice(&self) -> bool {
        self.trainer
            .as_ref()
            .map(|t| t.is_novice())
            .unwrap_or(false)
    }

    /// Check if this is an experienced tier trait
    pub fn is_experienced(&self) -> bool {
        self.trainer
            .as_ref()
            .map(|t| t.is_experienced())
            .unwrap_or(false)
    }

    /// Check if this is a master tier trait
    pub fn is_master(&self) -> bool {
        self.trainer
            .as_ref()
            .map(|t| t.is_master())
            .unwrap_or(false)
    }

    /// Check if this trait's description contains HTML color tags
    pub fn has_colored_description(&self) -> bool {
        self.description.contains("<color=")
    }

    /// Strip HTML color tags from description
    pub fn get_plain_description(&self) -> String {
        let mut result = self.description.clone();

        // Remove opening color tags
        while let Some(start) = result.find("<color=") {
            if let Some(end) = result[start..].find('>') {
                result.replace_range(start..start + end + 1, "");
            } else {
                break;
            }
        }

        // Remove closing color tags
        result = result.replace("</color>", "");

        // Unescape HTML entities
        result = result.replace("&lt;", "<");
        result = result.replace("&gt;", ">");

        result
    }
}
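The tag-stripping logic in `get_plain_description` can be exercised standalone. The sketch below mirrors it over a plain `&str`; `strip_color_tags` is a hypothetical free function written for illustration, not part of the crate:

```rust
/// Standalone mirror of Trait::get_plain_description's tag stripping
/// (hypothetical helper, for illustration only).
fn strip_color_tags(description: &str) -> String {
    let mut result = description.to_string();

    // Remove opening <color=...> tags.
    while let Some(start) = result.find("<color=") {
        if let Some(end) = result[start..].find('>') {
            result.replace_range(start..start + end + 1, "");
        } else {
            break;
        }
    }

    // Remove closing tags and unescape HTML entities.
    result = result.replace("</color>", "");
    result = result.replace("&lt;", "<");
    result = result.replace("&gt;", ">");
    result
}

fn main() {
    let raw = "<color=#00ff00>+5</color> Strength &lt;permanent&gt;";
    assert_eq!(strip_color_tags(raw), "+5 Strength <permanent>");
    println!("{}", strip_color_tags(raw));
}
```

Note that the loop only matches the literal `<color=` prefix, so `</color>` is untouched by it and must be stripped separately.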
@@ -0,0 +1,62 @@
/// Interactable_TeleporterTeleporter component from Cursebreaker
///
/// C# definition from Interactable_TeleporterTeleporter.cs:
/// ```csharp
/// public class Interactable_TeleporterTeleporter : MonoBehaviour
/// {
///     public Transform tpTransform;
/// }
/// ```
use unity_parser::{UnityComponent, ComponentContext, EcsInsertable};
use serde_yaml::Mapping;
use sparsey::Entity;

#[derive(Debug, Clone)]
pub struct InteractableTeleporter {
    pub tp_transform: Option<Entity>,
}

impl UnityComponent for InteractableTeleporter {
    fn parse(yaml: &Mapping, ctx: &ComponentContext) -> Option<Self> {
        // Handle transform reference linking if context is available
        if let (Some(entity), Some(linking_ctx_ref)) = (ctx.entity, ctx.linking_ctx) {
            // Extract tpTransform FileRef
            let tp_transform_ref = unity_parser::yaml_helpers::get_file_ref(yaml, "tpTransform");

            // Register callback to resolve the transform reference
            linking_ctx_ref
                .borrow_mut()
                .register_callback(Box::new(move |world, entity_map| {
                    // Get the InteractableTeleporter component
                    if let Some(teleporter) = world.borrow_mut::<InteractableTeleporter>().get_mut(entity) {
                        // Resolve the transform reference (might be None if unresolved)
                        let resolved_transform = tp_transform_ref
                            .and_then(|r| entity_map.get(&r.file_id).copied());
                        teleporter.tp_transform = resolved_transform;
                    }
                }));
        }

        Some(Self {
            tp_transform: None,
        })
    }
}

impl EcsInsertable for InteractableTeleporter {
    fn insert_into_world(self, world: &mut sparsey::World, entity: sparsey::Entity) {
        world.insert(entity, (self,));
    }
}

// Register component with inventory
inventory::submit! {
    unity_parser::ComponentRegistration {
        type_id: 114,
        class_name: "Interactable_TeleporterTeleporter",
        parse_and_insert: |yaml, ctx, world, entity| {
            <InteractableTeleporter as EcsInsertable>::parse_and_insert(yaml, ctx, world, entity)
        },
        register: |builder| builder.register::<InteractableTeleporter>(),
    }
}
@@ -0,0 +1,42 @@
/// Interactable_Workbench component from Cursebreaker
///
/// C# definition from Interactable_Workbench.cs:
/// ```csharp
/// public class Interactable_Workbench : MonoBehaviour
/// {
///     public int workbenchId;
/// }
/// ```
use unity_parser::{UnityComponent, ComponentContext, EcsInsertable};
use serde_yaml::Mapping;

#[derive(Debug, Clone)]
pub struct InteractableWorkbench {
    pub workbench_id: i64,
}

impl UnityComponent for InteractableWorkbench {
    fn parse(yaml: &Mapping, _ctx: &ComponentContext) -> Option<Self> {
        Some(Self {
            workbench_id: unity_parser::yaml_helpers::get_i64(yaml, "workbenchId").unwrap_or(0),
        })
    }
}

impl EcsInsertable for InteractableWorkbench {
    fn insert_into_world(self, world: &mut sparsey::World, entity: sparsey::Entity) {
        world.insert(entity, (self,));
    }
}

// Register component with inventory
inventory::submit! {
    unity_parser::ComponentRegistration {
        type_id: 114,
        class_name: "Interactable_Workbench",
        parse_and_insert: |yaml, ctx, world, entity| {
            <InteractableWorkbench as EcsInsertable>::parse_and_insert(yaml, ctx, world, entity)
        },
        register: |builder| builder.register::<InteractableWorkbench>(),
    }
}
51 cursebreaker-parser/src/types/monobehaviours/loot_spawner.rs Normal file
@@ -0,0 +1,51 @@
/// LootSpawner component from Cursebreaker
///
/// C# definition from LootSpawner.cs:
/// ```csharp
/// public class LootSpawner : MonoBehaviour
/// {
///     public int itemId;
///     public int amount;
///     public int respawnTime;
///     public string visibilityChecks;
/// }
/// ```
use unity_parser::{UnityComponent, ComponentContext, EcsInsertable};
use serde_yaml::Mapping;

#[derive(Debug, Clone)]
pub struct LootSpawner {
    pub item_id: i64,
    pub amount: i64,
    pub respawn_time: i64,
    pub visibility_checks: String,
}

impl UnityComponent for LootSpawner {
    fn parse(yaml: &Mapping, _ctx: &ComponentContext) -> Option<Self> {
        Some(Self {
            item_id: unity_parser::yaml_helpers::get_i64(yaml, "itemId").unwrap_or(0),
            amount: unity_parser::yaml_helpers::get_i64(yaml, "amount").unwrap_or(0),
            respawn_time: unity_parser::yaml_helpers::get_i64(yaml, "respawnTime").unwrap_or(0),
            visibility_checks: unity_parser::yaml_helpers::get_string(yaml, "visibilityChecks").unwrap_or_default(),
        })
    }
}

impl EcsInsertable for LootSpawner {
    fn insert_into_world(self, world: &mut sparsey::World, entity: sparsey::Entity) {
        world.insert(entity, (self,));
    }
}

// Register component with inventory
inventory::submit! {
    unity_parser::ComponentRegistration {
        type_id: 114,
        class_name: "LootSpawner",
        parse_and_insert: |yaml, ctx, world, entity| {
            <LootSpawner as EcsInsertable>::parse_and_insert(yaml, ctx, world, entity)
        },
        register: |builder| builder.register::<LootSpawner>(),
    }
}
@@ -0,0 +1,42 @@
/// MapNameChanger component from Cursebreaker
///
/// C# definition from MapNameChanger.cs:
/// ```csharp
/// public class MapNameChanger : MonoBehaviour
/// {
///     public string mapName;
/// }
/// ```
use unity_parser::{UnityComponent, ComponentContext, EcsInsertable};
use serde_yaml::Mapping;

#[derive(Debug, Clone)]
pub struct MapNameChanger {
    pub map_name: String,
}

impl UnityComponent for MapNameChanger {
    fn parse(yaml: &Mapping, _ctx: &ComponentContext) -> Option<Self> {
        Some(Self {
            map_name: unity_parser::yaml_helpers::get_string(yaml, "mapName").unwrap_or_default(),
        })
    }
}

impl EcsInsertable for MapNameChanger {
    fn insert_into_world(self, world: &mut sparsey::World, entity: sparsey::Entity) {
        world.insert(entity, (self,));
    }
}

// Register component with inventory
inventory::submit! {
    unity_parser::ComponentRegistration {
        type_id: 114,
        class_name: "MapNameChanger",
        parse_and_insert: |yaml, ctx, world, entity| {
            <MapNameChanger as EcsInsertable>::parse_and_insert(yaml, ctx, world, entity)
        },
        register: |builder| builder.register::<MapNameChanger>(),
    }
}
@@ -1,3 +1,11 @@
mod interactable_resource;
mod interactable_teleporter;
mod interactable_workbench;
mod loot_spawner;
mod map_name_changer;

pub use interactable_resource::InteractableResource;
pub use interactable_teleporter::InteractableTeleporter;
pub use interactable_workbench::InteractableWorkbench;
pub use loot_spawner::LootSpawner;
pub use map_name_changer::MapNameChanger;
@@ -4,6 +4,11 @@ use crate::types::{
    Quest, QuestPhase, QuestReward,
    Harvestable, HarvestableDrop,
    LootTable, LootDrop,
    Map,
    FastTravelLocation, FastTravelType,
    PlayerHouse,
    Trait, TraitTrainer,
    Shop, ShopItem,
};
use quick_xml::events::Event;
use quick_xml::reader::Reader;
@@ -713,3 +718,491 @@ pub fn parse_loot_xml<P: AsRef<Path>>(path: P) -> Result<Vec<LootTable>, XmlPars
    Ok(loot_tables)
}

// ============================================================================
// Map Parser
// ============================================================================

pub fn parse_maps_xml<P: AsRef<Path>>(path: P) -> Result<Vec<Map>, XmlParseError> {
    let file = File::open(path)?;
    let buf_reader = BufReader::new(file);
    let mut reader = Reader::from_reader(buf_reader);
    reader.config_mut().trim_text(true);

    let mut maps = Vec::new();
    let mut buf = Vec::new();

    loop {
        match reader.read_event_into(&mut buf) {
            Ok(Event::Start(ref e)) | Ok(Event::Empty(ref e)) => {
                match e.name().as_ref() {
                    b"map" => {
                        let attrs = parse_attributes(e)?;

                        // Get required attributes
                        let scene_id = attrs.get("sceneid")
                            .ok_or_else(|| XmlParseError::MissingAttribute("sceneid".to_string()))?
                            .clone();

                        let music = attrs.get("music")
                            .ok_or_else(|| XmlParseError::MissingAttribute("music".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("music".to_string()))?;

                        let ambience = attrs.get("ambience")
                            .ok_or_else(|| XmlParseError::MissingAttribute("ambience".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("ambience".to_string()))?;

                        let mut map = Map::new(scene_id, music, ambience);

                        // Parse optional attributes
                        if let Some(v) = attrs.get("name") {
                            map.name = v.clone();
                        }
                        if let Some(v) = attrs.get("fogcolor") {
                            map.fog_color = Some(v.clone());
                        }
                        if let Some(v) = attrs.get("fogginess") {
                            map.fogginess = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("viewdistance") {
                            map.view_distance = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("npcviewdistance") {
                            map.npc_view_distance = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("sunlight") {
                            map.sunlight = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("suncolor") {
                            map.sun_color = Some(v.clone());
                        }
                        if let Some(v) = attrs.get("ambientcolor") {
                            map.ambient_color = Some(v.clone());
                        }
                        if let Some(v) = attrs.get("indoorsunlight") {
                            map.indoor_sunlight = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("fogstart") {
                            map.fog_start = v.parse().ok();
                        }
                        if attrs.get("indoors").is_some() {
                            map.indoors = true;
                        }
                        if attrs.get("noworldmap").is_some() {
                            map.no_world_map = true;
                        }
                        if attrs.get("nominimap").is_some() {
                            map.no_minimap = true;
                        }
                        if attrs.get("tpdisabled").is_some() {
                            map.tp_disabled = true;
                        }
                        if attrs.get("dontloadnearbyscenes").is_some() {
                            map.dont_load_nearby_scenes = true;
                        }
                        if attrs.get("noborder").is_some() {
                            map.no_border = true;
                        }
                        if attrs.get("borderleft").is_some() {
                            map.border_left = true;
                        }
                        if attrs.get("borderright").is_some() {
                            map.border_right = true;
                        }
                        if attrs.get("borderup").is_some() {
                            map.border_up = true;
                        }
                        if attrs.get("borderdown").is_some() {
                            map.border_down = true;
                        }
                        if let Some(v) = attrs.get("respawnmap") {
                            map.respawn_map = Some(v.clone());
                        }
                        if let Some(v) = attrs.get("connectedmaps") {
                            map.connected_maps = Some(v.clone());
                        }
                        if let Some(v) = attrs.get("comment") {
                            map.comment = Some(v.clone());
                        }

                        maps.push(map);
                    }
                    _ => {}
                }
            }
            Ok(Event::Eof) => break,
            Err(e) => return Err(XmlParseError::XmlError(e)),
            _ => {}
        }
        buf.clear();
    }

    Ok(maps)
}
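The map parser above distinguishes required attributes (missing or unparseable values are hard errors) from optional ones (failures silently become `None`). That split can be sketched standalone over a plain `HashMap`; `AttrError`, `required_i32`, and `optional_f32` below are hypothetical stand-ins for the crate's own error and helper types:

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the crate's XmlParseError variants.
#[derive(Debug, PartialEq)]
enum AttrError {
    Missing(String),
    Invalid(String),
}

/// Required attribute: absence or a failed parse is an error.
fn required_i32(attrs: &HashMap<String, String>, key: &str) -> Result<i32, AttrError> {
    attrs
        .get(key)
        .ok_or_else(|| AttrError::Missing(key.to_string()))?
        .parse::<i32>()
        .map_err(|_| AttrError::Invalid(key.to_string()))
}

/// Optional attribute: absence or a failed parse just yields None.
fn optional_f32(attrs: &HashMap<String, String>, key: &str) -> Option<f32> {
    attrs.get(key).and_then(|v| v.parse().ok())
}

fn main() {
    let mut attrs = HashMap::new();
    attrs.insert("music".to_string(), "3".to_string());
    attrs.insert("fogginess".to_string(), "0.5".to_string());

    assert_eq!(required_i32(&attrs, "music"), Ok(3));
    assert_eq!(
        required_i32(&attrs, "ambience"),
        Err(AttrError::Missing("ambience".to_string()))
    );
    assert_eq!(optional_f32(&attrs, "fogginess"), Some(0.5));
    assert_eq!(optional_f32(&attrs, "viewdistance"), None);
}
```

The same `ok_or_else`/`parse().ok()` pattern recurs in every parser in this hunk, so errors surface only for attributes the data model genuinely cannot do without.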
// ============================================================================
// Fast Travel Parser
// ============================================================================

/// Parse FastTravelLocations.xml (regular fast travel locations)
pub fn parse_fast_travel_locations_xml<P: AsRef<Path>>(
    path: P,
) -> Result<Vec<FastTravelLocation>, XmlParseError> {
    parse_fast_travel_xml_internal(path, FastTravelType::Location)
}

/// Parse FastTravelCanoe.xml (canoe fast travel locations)
pub fn parse_fast_travel_canoe_xml<P: AsRef<Path>>(
    path: P,
) -> Result<Vec<FastTravelLocation>, XmlParseError> {
    parse_fast_travel_xml_internal(path, FastTravelType::Canoe)
}

/// Parse FastTravelPortals.xml (portal fast travel locations)
pub fn parse_fast_travel_portals_xml<P: AsRef<Path>>(
    path: P,
) -> Result<Vec<FastTravelLocation>, XmlParseError> {
    parse_fast_travel_xml_internal(path, FastTravelType::Portal)
}

/// Internal function to parse any fast travel XML file
fn parse_fast_travel_xml_internal<P: AsRef<Path>>(
    path: P,
    travel_type: FastTravelType,
) -> Result<Vec<FastTravelLocation>, XmlParseError> {
    let file = File::open(path)?;
    let buf_reader = BufReader::new(file);
    let mut reader = Reader::from_reader(buf_reader);
    reader.config_mut().trim_text(true);

    let mut locations = Vec::new();
    let mut buf = Vec::new();

    loop {
        match reader.read_event_into(&mut buf) {
            Ok(Event::Start(ref e)) | Ok(Event::Empty(ref e)) => {
                match e.name().as_ref() {
                    b"location" => {
                        let attrs = parse_attributes(e)?;

                        // Get required attributes
                        let id = attrs
                            .get("id")
                            .ok_or_else(|| XmlParseError::MissingAttribute("id".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("id".to_string()))?;

                        let name = attrs
                            .get("name")
                            .ok_or_else(|| XmlParseError::MissingAttribute("name".to_string()))?
                            .clone();

                        let position = attrs
                            .get("pos")
                            .ok_or_else(|| XmlParseError::MissingAttribute("pos".to_string()))?
                            .clone();

                        let mut location = FastTravelLocation::new(id, name, position, travel_type);

                        // Parse optional attributes based on type
                        match travel_type {
                            FastTravelType::Location => {
                                // Regular locations have unlocked and connections
                                if attrs.get("unlocked").is_some() {
                                    location.unlocked = true;
                                }
                                if let Some(v) = attrs.get("connections") {
                                    location.connections = Some(v.clone());
                                }
                            }
                            FastTravelType::Canoe => {
                                // Canoe locations have checks
                                if let Some(v) = attrs.get("checks") {
                                    location.checks = Some(v.clone());
                                }
                            }
                            FastTravelType::Portal => {
                                // Portals have no additional fields
                            }
                        }

                        locations.push(location);
                    }
                    _ => {}
                }
            }
            Ok(Event::Eof) => break,
            Err(e) => return Err(XmlParseError::XmlError(e)),
            _ => {}
        }
        buf.clear();
    }

    Ok(locations)
}
// ============================================================================
// Player House Parser
// ============================================================================

pub fn parse_player_houses_xml<P: AsRef<Path>>(path: P) -> Result<Vec<PlayerHouse>, XmlParseError> {
    let file = File::open(path)?;
    let buf_reader = BufReader::new(file);
    let mut reader = Reader::from_reader(buf_reader);
    reader.config_mut().trim_text(true);

    let mut houses = Vec::new();
    let mut buf = Vec::new();

    loop {
        match reader.read_event_into(&mut buf) {
            Ok(Event::Start(ref e)) | Ok(Event::Empty(ref e)) => {
                match e.name().as_ref() {
                    b"playerhouse" => {
                        let attrs = parse_attributes(e)?;

                        // Get required attributes
                        let id = attrs
                            .get("id")
                            .ok_or_else(|| XmlParseError::MissingAttribute("id".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("id".to_string()))?;

                        let name = attrs
                            .get("name")
                            .ok_or_else(|| XmlParseError::MissingAttribute("name".to_string()))?
                            .clone();

                        let description = attrs
                            .get("description")
                            .cloned()
                            .unwrap_or_default();

                        let position = attrs
                            .get("pos")
                            .ok_or_else(|| XmlParseError::MissingAttribute("pos".to_string()))?
                            .clone();

                        let price = attrs
                            .get("price")
                            .ok_or_else(|| XmlParseError::MissingAttribute("price".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("price".to_string()))?;

                        let mut house = PlayerHouse::new(id, name, description, position, price);

                        // Parse optional attributes
                        if attrs.get("hidden").is_some() {
                            house.hidden = true;
                        }

                        houses.push(house);
                    }
                    _ => {}
                }
            }
            Ok(Event::Eof) => break,
            Err(e) => return Err(XmlParseError::XmlError(e)),
            _ => {}
        }
        buf.clear();
    }

    Ok(houses)
}
// ============================================================================
// Trait Parser
// ============================================================================

pub fn parse_traits_xml<P: AsRef<Path>>(path: P) -> Result<Vec<Trait>, XmlParseError> {
    let file = File::open(path)?;
    let buf_reader = BufReader::new(file);
    let mut reader = Reader::from_reader(buf_reader);
    reader.config_mut().trim_text(true);

    let mut traits = Vec::new();
    let mut buf = Vec::new();
    let mut current_trait: Option<Trait> = None;

    loop {
        match reader.read_event_into(&mut buf) {
            Ok(Event::Start(ref e)) | Ok(Event::Empty(ref e)) => {
                match e.name().as_ref() {
                    b"trait" => {
                        let attrs = parse_attributes(e)?;

                        // Get required attributes
                        let id = attrs
                            .get("id")
                            .ok_or_else(|| XmlParseError::MissingAttribute("id".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("id".to_string()))?;

                        let name = attrs.get("name").cloned().unwrap_or_default();

                        let description = attrs.get("description").cloned().unwrap_or_default();

                        let mut trait_obj = Trait::new(id, name, description);

                        // Parse optional attributes
                        if let Some(v) = attrs.get("learnability") {
                            trait_obj.learnability = v.parse().ok();
                        }
                        if let Some(v) = attrs.get("comment") {
                            trait_obj.comment = Some(v.clone());
                        }

                        current_trait = Some(trait_obj);
                    }
                    b"trainer" => {
                        if let Some(ref mut trait_obj) = current_trait {
                            let attrs = parse_attributes(e)?;

                            // Parse trainer requirements
                            if let (Some(skill), Some(level_str)) =
                                (attrs.get("skill"), attrs.get("level"))
                            {
                                if let Ok(level) = level_str.parse::<i32>() {
                                    let mut trainer = TraitTrainer::new(skill.clone(), level);

                                    // Parse optional tier icon
                                    if let Some(v) = attrs.get("tiericon") {
                                        trainer.tier_icon = v.parse().ok();
                                    }

                                    trait_obj.trainer = Some(trainer);
                                }
                            }
                        }
                    }
                    _ => {}
                }
            }
            Ok(Event::End(ref e)) => {
                if e.name().as_ref() == b"trait" {
                    if let Some(trait_obj) = current_trait.take() {
                        traits.push(trait_obj);
                    }
                }
            }
            Ok(Event::Eof) => break,
            Err(e) => return Err(XmlParseError::XmlError(e)),
            _ => {}
        }
        buf.clear();
    }

    Ok(traits)
}
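Both the trait and shop parsers handle nested elements the same way: the open parent is held in an `Option`, child elements are attached to it while it is open, and the `End` event moves it into the output via `Option::take`. A minimal standalone sketch of that accumulation pattern, with a simplified event enum (`Ev`, `SimpleTrait`, and `collect` are hypothetical, stand-ins for the quick-xml event loop):

```rust
// Hypothetical simplified event stream standing in for quick-xml events.
enum Ev {
    StartTrait(i32),
    Trainer(String, i32),
    EndTrait,
}

#[derive(Debug, PartialEq)]
struct SimpleTrait {
    id: i32,
    trainer: Option<(String, i32)>,
}

/// Hold the open element in an Option, attach children to it,
/// and move it into the output on the matching end event.
fn collect(events: Vec<Ev>) -> Vec<SimpleTrait> {
    let mut out = Vec::new();
    let mut current: Option<SimpleTrait> = None;
    for ev in events {
        match ev {
            Ev::StartTrait(id) => current = Some(SimpleTrait { id, trainer: None }),
            Ev::Trainer(skill, level) => {
                // Child events are ignored unless a parent is open.
                if let Some(ref mut t) = current {
                    t.trainer = Some((skill, level));
                }
            }
            Ev::EndTrait => {
                // take() empties `current` so a stray second EndTrait is a no-op.
                if let Some(t) = current.take() {
                    out.push(t);
                }
            }
        }
    }
    out
}

fn main() {
    let traits = collect(vec![
        Ev::StartTrait(1),
        Ev::Trainer("alchemy".to_string(), 25),
        Ev::EndTrait,
    ]);
    assert_eq!(traits.len(), 1);
    assert_eq!(traits[0].trainer, Some(("alchemy".to_string(), 25)));
}
```

Using `Option::take` on the end event is what keeps the state machine safe against malformed input: a duplicate end tag or a child outside any parent simply does nothing.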
// ============================================================================
// Shop Parser
// ============================================================================

pub fn parse_shops_xml<P: AsRef<Path>>(path: P) -> Result<Vec<Shop>, XmlParseError> {
    let file = File::open(path)?;
    let buf_reader = BufReader::new(file);
    let mut reader = Reader::from_reader(buf_reader);
    reader.config_mut().trim_text(true);

    let mut shops = Vec::new();
    let mut buf = Vec::new();
    let mut current_shop: Option<Shop> = None;

    loop {
        match reader.read_event_into(&mut buf) {
            Ok(Event::Start(ref e)) | Ok(Event::Empty(ref e)) => {
                match e.name().as_ref() {
                    b"shop" => {
                        let attrs = parse_attributes(e)?;

                        // Get required attributes
                        let shop_id = attrs
                            .get("shopid")
                            .ok_or_else(|| XmlParseError::MissingAttribute("shopid".to_string()))?
                            .parse::<i32>()
                            .map_err(|_| XmlParseError::InvalidAttribute("shopid".to_string()))?;

                        let name = attrs.get("name").cloned().unwrap_or_default();

                        let mut shop = Shop::new(shop_id, name);

                        // Parse optional attributes
                        if attrs.get("isgeneralstore").is_some() {
                            shop.is_general_store = true;
                        }
                        if let Some(v) = attrs.get("comment") {
                            shop.comment = Some(v.clone());
                        }

                        current_shop = Some(shop);
                    }
                    b"item" => {
                        if let Some(ref mut shop) = current_shop {
                            let attrs = parse_attributes(e)?;

                            // Get item ID (can be numeric or string)
                            if let Some(item_id) = attrs.get("id") {
                                let mut item = ShopItem::new(item_id.clone());

                                // Parse optional attributes
                                if let Some(v) = attrs.get("name") {
                                    item.name = Some(v.clone());
                                }
                                if let Some(v) = attrs.get("price") {
                                    item.price = v.parse().ok();
                                }
                                if let Some(v) = attrs.get("maxstock") {
                                    item.max_stock = v.parse().ok();
                                }
                                if let Some(v) = attrs.get("restocktime") {
                                    item.restock_time = v.parse().ok();
                                }
                                if let Some(v) = attrs.get("buyprice") {
                                    item.buy_price = v.parse().ok();
                                }
                                if let Some(v) = attrs.get("comment") {
                                    item.comment = Some(v.clone());
                                }

                                shop.add_item(item);
                            }
                        }
                    }
                    _ => {}
                }
            }
            Ok(Event::End(ref e)) => {
                if e.name().as_ref() == b"shop" {
                    if let Some(shop) = current_shop.take() {
                        shops.push(shop);
                    }
                }
            }
            Ok(Event::Eof) => break,
            Err(e) => return Err(XmlParseError::XmlError(e)),
            _ => {}
        }
        buf.clear();
    }

    Ok(shops)
}