Compare commits: 0.4.6...fc13c465c0 (13 commits)
Commits in this compare:

- fc13c465c0
- 1ce3162fc0
- 3092e97d41
- 8029fa82b0
- 6d8a22459c
- 20f0f4b9a1
- 5a88befac9
- e94fc0f5de
- b51800eb77
- 87951ab12f
- 00b0d4df26
- 6ca53e8959
- 8dfdad3f34
.github/copilot-instructions.md (vendored, new file, 230 lines)
@@ -0,0 +1,230 @@
# Slang Language Compiler - AI Agent Instructions

## Project Overview

**Slang** is a high-level programming language that compiles to IC10 assembly for the game Stationeers. The compiler is a multi-stage Rust system with a C# BepInEx mod integration layer.

**Key Goal:** Reduce manual IC10 assembly writing by providing C-like syntax with automatic register allocation and device abstraction.

## Architecture Overview

### Compilation Pipeline

The compiler follows a strict 4-stage pipeline (in [rust_compiler/libs/compiler/src/v1.rs](rust_compiler/libs/compiler/src/v1.rs)):

1. **Tokenizer** (libs/tokenizer/src/lib.rs) - Lexical analysis using the `logos` crate (see the sketch after this list)

   - Converts source text into tokens
   - Tracks line/span information for error reporting
   - Supports temperature literals (c/f/k suffixes)

2. **Parser** (libs/parser/src/lib.rs) - AST construction

   - Recursive descent parser producing an `Expression` tree
   - Validates syntax, handles device declarations and function definitions
   - Output: `Expression` enum containing tree nodes

3. **Compiler (v1)** (libs/compiler/src/v1.rs) - Semantic analysis & code generation

   - Variable scope management and register allocation via `VariableManager`
   - Emits IL instructions to `il::Instructions`
   - Error types use `lsp_types::Diagnostic` for editor integration

4. **Optimizer** (libs/optimizer/src/lib.rs) - Post-generation optimization

   - Currently optimizes leaf functions
   - Optional pass before final output
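
A minimal, self-contained illustration of the stage-1 pattern, assuming only the `logos` crate; the real `TokenType` enum in libs/tokenizer/src/token.rs is much larger and these variants are simplified stand-ins:

```rust
use logos::Logos;

// Simplified stand-in for the real TokenType enum. The skip pattern mirrors
// the one used in the tokenizer (`\n` is not skipped, so line tracking stays
// possible).
#[derive(Logos, Debug, PartialEq)]
#[logos(skip r"[ \r\t\f]+")]
enum Tok {
    #[token("let")]
    Let,
    #[token("=")]
    Assign,
    #[token(";")]
    Semicolon,
    #[regex(r"[0-9]+")]
    Number,
    #[regex(r"[A-Za-z_][A-Za-z0-9_]*")]
    Identifier,
}

fn main() {
    let tokens: Vec<_> = Tok::lexer("let x = 5;").collect();
    // [Ok(Let), Ok(Identifier), Ok(Assign), Ok(Number), Ok(Semicolon)]
    println!("{tokens:?}");
}
```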

### Cross-Language Integration

- **Rust Library** (`slang.dll`/`.so`): Core compiler logic exposed via `safer-ffi` C FFI bindings
- **C# Mod** (`StationeersSlang.dll`): BepInEx plugin integrating with the game UI
- **Generated Headers** (via the `generate-headers` binary): Auto-generated C# bindings from Rust

### Key Types & Data Flow

- `Expression` tree (parser) → processed by `v1::Compiler` → `il::Instructions` output
- `InstructionNode` wraps an IC10 assembly line with an optional source span for debugging
- `VariableManager` tracks scopes and the const/device/let distinctions
- `Operand` enum represents register/literal/device-property values (illustrative stand-ins sketched below)
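
The following are purely illustrative stand-ins for those types, based only on the bullets above; the real definitions in the compiler/IL crates will differ:

```rust
// Hypothetical, simplified shapes for the types named above.
#[derive(Debug, Clone)]
pub enum Operand {
    Register(u8),                                        // r0..r15
    Literal(f64),                                        // numeric literal
    DeviceProperty { device: String, property: String }, // e.g. d0.Temperature
}

#[derive(Debug, Clone, Copy)]
pub struct Span {
    pub start_line: u32,
    pub start_col: u32,
    pub end_line: u32,
    pub end_col: u32,
}

#[derive(Debug)]
pub struct InstructionNode {
    pub text: String,       // one emitted IC10 line, e.g. "move r0 5"
    pub span: Option<Span>, // source span kept for diagnostics/debugging
}
```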

## Critical Workflows

### Building

```bash
cd rust_compiler

# Build for both Linux and Windows targets
cargo build --release --target=x86_64-unknown-linux-gnu
cargo build --release --target=x86_64-pc-windows-gnu

# Generate C# FFI headers (requires the "headers" feature)
cargo run --features headers --bin generate-headers

# Full build (run from the repository root)
./build.sh
```

### Testing

```bash
cd rust_compiler

# Run all tests
cargo test --package compiler --lib

# Run a specific test file
cargo test --package compiler --lib tuple_literals

# Run a single test
cargo test --package compiler --lib -- test::tuple_literals::test::test_tuple_literal_size_mismatch --exact --nocapture
```

### Quick Compilation

```bash
cd rust_compiler

# Compile Slang code to IC10 using the current compiler changes
echo 'let x = 5;' | cargo run --bin slang -

# Or from a file
cargo run --bin slang -- input.slang -o output.ic10

# Optimize the output with the -z flag
cargo run --bin slang -- input.slang -o output.ic10 -z
```

## Codebase Patterns

### Test Structure

Tests follow a macro pattern in [libs/compiler/src/test/mod.rs](rust_compiler/libs/compiler/src/test/mod.rs):

```rust
#[test]
fn test_name() -> Result<()> {
    let output = compile!("slang code here");
    assert_eq!(expected_ic10, output);
    Ok(())
}
```

- `compile!()` macro: full pipeline from source to IC10
- `compile!(result ...)`: for error checking
- `compile!(debug ...)`: for intermediate IR inspection
- Test files are organized by feature: `binary_expression.rs`, `syscall.rs`, `tuple_literals.rs`, etc.

### Error Handling

All stages return custom error types that convert into `lsp_types::Diagnostic` (see the sketch below):

- `tokenizer::Error` - Lexical errors
- `parser::Error<'a>` - Syntax errors
- `compiler::Error<'a>` - Semantic errors (unknown identifier, type mismatch)
- Device assignment prevention: a `DeviceAssignment` error is raised when reassigning a device const
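
A minimal sketch of this pattern, assuming `thiserror` and `lsp-types` (both listed under Dependencies to Know); the variant and field names here are illustrative, not the real API:

```rust
use lsp_types::{Diagnostic, Position, Range};

// Hypothetical span type standing in for the compiler's own `Span`.
#[derive(Debug, Clone, Copy)]
struct Span { start_line: u32, start_col: u32, end_line: u32, end_col: u32 }

// Illustrative error enum; the real enums live in each stage's crate.
#[derive(Debug, thiserror::Error)]
enum Error {
    #[error("unknown identifier `{name}`")]
    UnknownIdentifier { name: String, span: Span },
    #[error("cannot reassign a device constant")]
    DeviceAssignment { span: Span },
}

impl From<Error> for Diagnostic {
    fn from(err: Error) -> Self {
        let span = match &err {
            Error::UnknownIdentifier { span, .. } | Error::DeviceAssignment { span } => *span,
        };
        let range = Range::new(
            Position::new(span.start_line, span.start_col),
            Position::new(span.end_line, span.end_col),
        );
        // new_simple fills in only the range and the message.
        Diagnostic::new_simple(range, err.to_string())
    }
}
```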

### Variable Scope Management

[variable_manager.rs](rust_compiler/libs/compiler/src/variable_manager.rs) handles (a simplified model is sketched after this list):

- Tracking the const vs. mutable (`let`) distinction
- Device declarations as special scope items
- Function-local scopes with parameter handling
- Register allocation via `VariableLocation`
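
A simplified, hypothetical model of the scope-stack idea (the real `VariableManager`/`VariableLocation` API differs):

```rust
use std::collections::HashMap;

// Hypothetical stand-ins; only the scope-stack idea is the point here.
#[derive(Debug, Clone, Copy)]
enum Binding {
    Let(u8),    // mutable, lives in register rN
    Const(u8),  // immutable value
    Device(u8), // device pin dN, never reassignable
}

struct Scopes {
    stack: Vec<HashMap<String, Binding>>,
    next_register: u8,
}

impl Scopes {
    fn new() -> Self {
        Self { stack: vec![HashMap::new()], next_register: 0 }
    }

    fn push(&mut self) { self.stack.push(HashMap::new()); }
    fn pop(&mut self) { self.stack.pop(); }

    fn declare_let(&mut self, name: &str) -> u8 {
        let reg = self.next_register;
        self.next_register += 1;
        self.stack.last_mut().unwrap().insert(name.to_owned(), Binding::Let(reg));
        reg
    }

    // Innermost scope wins, mirroring lexical shadowing.
    fn lookup(&self, name: &str) -> Option<Binding> {
        self.stack.iter().rev().find_map(|scope| scope.get(name).copied())
    }
}
```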

### LSP Integration

Error types implement conversion to `lsp_types::Diagnostic` for IDE feedback:

```rust
impl<'a> From<Error<'a>> for lsp_types::Diagnostic { ... }
```

This enables real-time error reporting in the Stationeers IC10 Editor mod.

## Project-Specific Conventions

### Tuple Destructuring

The compiler supports tuple returns and multi-assignment:

```rust
let (x, y) = func();     // TupleDeclarationExpression
(x, y) = another_func(); // TupleAssignmentExpression
```

The compiler validates that both sides have matching sizes and reports a `TupleSizeMismatch` error otherwise.
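
For example, a declaration like the following (illustrative) is rejected:

```rust
let (x, y) = (1, 2, 3); // error: TupleSizeMismatch (2 names, 3 values)
```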

### Device Property Access

Devices are first-class values with property access:

```rust
device ac = "d0";
ac.On = true;
ac.Temperature > 20c;
```

Property access is parsed as a `MemberAccessExpression` and compiled to device I/O syscalls.
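
The generated IC10 for the snippet above would look roughly like this (illustrative only; register choices and exact instruction selection depend on the compiler):

```asm
s d0 On 1           # ac.On = true
l r0 d0 Temperature # read ac.Temperature
sgt r1 r0 293.15    # ac.Temperature > 20c, i.e. > 293.15 K
```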

### Temperature Literals

A unique language feature - automatic unit conversion at compile time:

```rust
20c → 293.15k // Celsius to Kelvin
68f → 293.15k // Fahrenheit to Kelvin
```

The tokenizer produces `Literal::Number(Number(decimal, Some(Unit::Celsius)))` for a Celsius literal.
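
A hedged sketch of that conversion using `rust_decimal` (the real implementation lives in the tokenizer/compiler and may differ; `Unit` mirrors the enum referenced above):

```rust
use rust_decimal::Decimal;

#[derive(Debug, Clone, Copy)]
enum Unit {
    Celsius,
    Fahrenheit,
    Kelvin,
}

// Convert a temperature literal to Kelvin at compile time.
fn to_kelvin(value: Decimal, unit: Unit) -> Decimal {
    let offset = Decimal::new(27315, 2); // 273.15
    match unit {
        Unit::Kelvin => value,
        Unit::Celsius => value + offset,
        // (F - 32) * 5 / 9 + 273.15
        Unit::Fahrenheit => (value - Decimal::new(32, 0)) * Decimal::new(5, 0) / Decimal::new(9, 0) + offset,
    }
}

fn main() {
    assert_eq!(to_kelvin(Decimal::new(20, 0), Unit::Celsius), Decimal::new(29315, 2));    // 20c → 293.15
    assert_eq!(to_kelvin(Decimal::new(68, 0), Unit::Fahrenheit), Decimal::new(29315, 2)); // 68f → 293.15
}
```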

### Constants are Immutable

Once declared with `const`, reassignment is a compile error. Device assignment prevention is critical, since reassigning a device would introduce game-logic bugs.
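
For example (illustrative snippets; the exact error messages differ):

```rust
const LIMIT = 10;
LIMIT = 20;        // compile error: constants cannot be reassigned

device ac = "d0";
ac = "d1";         // compile error: DeviceAssignment
```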

## Integration Points

### C# FFI (`csharp_mod/FfiGlue.cs`)

- Calls the Rust compiler via marshaled FFI
- Passes source code in, receives IC10 output back
- Marshals errors as `Diagnostic` objects

### BepInEx Plugin Lifecycle

[csharp_mod/Plugin.cs](csharp_mod/Plugin.cs):

- Harmony patches for IC10 Editor integration
- Cleanup code for live-reload support (mod destruction)
- Logger integration for debug output

### CI/Build Target Matrix

- Linux: `x86_64-unknown-linux-gnu`
- Windows: `x86_64-pc-windows-gnu` (cross-compiled from Linux)
- Both produce dynamic libraries plus the CLI binary

## Debugging Tips

1. **Print source spans:** the `Span` type tracks line/column information for error reporting
2. **IL inspection:** use `compile!(debug source)` to view the intermediate instructions
3. **Register allocation:** `VariableManager` logs scope changes; check for conflicts
4. **Syscall validation:** [parser/src/sys_call.rs](rust_compiler/libs/parser/src/sys_call.rs) lists all valid syscalls
5. **Tokenizer issues:** check [tokenizer/src/token.rs](rust_compiler/libs/tokenizer/src/token.rs) for supported keywords/symbols

## Key Files for Common Tasks

| Task                 | File                                                                                                                                       |
| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------ |
| Add language feature | [libs/parser/src/lib.rs](rust_compiler/libs/parser/src/lib.rs) + test in [libs/compiler/src/test/](rust_compiler/libs/compiler/src/test/) |
| Fix codegen bug      | [libs/compiler/src/v1.rs](rust_compiler/libs/compiler/src/v1.rs) (~3500 lines)                                                            |
| Add syscall          | [libs/parser/src/sys_call.rs](rust_compiler/libs/parser/src/sys_call.rs)                                                                  |
| Optimize output      | [libs/optimizer/src/lib.rs](rust_compiler/libs/optimizer/src/lib.rs)                                                                      |
| Mod integration      | [csharp_mod/](csharp_mod/)                                                                                                                 |
| Language docs        | [docs/language-reference.md](docs/language-reference.md)                                                                                   |

## Dependencies to Know

- `logos` - Tokenizer with derive macros
- `rust_decimal` - Precise decimal arithmetic for temperature conversion
- `safer-ffi` - Safe C FFI between Rust and C#
- `lsp-types` - Standard diagnostics types for editor integration
- `thiserror` - Error type derivation
- `clap` - CLI argument parsing
- `anyhow` - Error handling in the main binary
@@ -1,5 +1,9 @@
# Changelog

[0.4.7]

- Added support for Windows CRLF endings

[0.4.6]

- Fixed bug in compiler where you were unable to assign a `const` value to
@@ -2,7 +2,7 @@
<ModMetadata xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Name>Slang</Name>
  <Author>JoeDiertay</Author>
  <Version>0.4.6</Version>
  <Version>0.4.7</Version>
  <Description>
    [h1]Slang: High-Level Programming for Stationeers[/h1]
@@ -39,7 +39,7 @@ namespace Slang
{
    public const string PluginGuid = "com.biddydev.slang";
    public const string PluginName = "Slang";
    public const string PluginVersion = "0.4.6";
    public const string PluginVersion = "0.4.7";

    private static Harmony? _harmony;
rust_compiler/Cargo.lock (generated, 2 lines changed)
@@ -930,7 +930,7 @@ checksum = "e3a9fe34e3e7a50316060351f37187a3f546bce95496156754b601a5fa71b76e"

[[package]]
name = "slang"
version = "0.4.6"
version = "0.4.7"
dependencies = [
 "anyhow",
 "clap",
@@ -1,6 +1,6 @@
|
||||
[package]
|
||||
name = "slang"
|
||||
version = "0.4.6"
|
||||
version = "0.4.7"
|
||||
edition = "2021"
|
||||
|
||||
[workspace]
|
||||
|
||||
@@ -47,3 +47,4 @@ mod logic_expression;
mod loops;
mod math_syscall;
mod syscall;
mod tuple_literals;
rust_compiler/libs/compiler/src/test/tuple_literals.rs (new file, 1009 lines)

File diff suppressed because it is too large
@@ -441,7 +441,13 @@ impl<'a> Parser<'a> {
            ));
        }

        TokenType::Keyword(Keyword::Let) => Some(self.spanned(|p| p.declaration())?),
        TokenType::Keyword(Keyword::Let) => {
            if self_matches_peek!(self, TokenType::Symbol(Symbol::LParen)) {
                Some(self.spanned(|p| p.tuple_declaration())?)
            } else {
                Some(self.spanned(|p| p.declaration())?)
            }
        }

        TokenType::Keyword(Keyword::Device) => {
            let spanned_dev = self.spanned(|p| p.device())?;
@@ -561,9 +567,7 @@ impl<'a> Parser<'a> {
            })
        }

        TokenType::Symbol(Symbol::LParen) => {
            self.spanned(|p| p.priority())?.node.map(|node| *node)
        }
        TokenType::Symbol(Symbol::LParen) => self.parenthesized_or_tuple()?,

        TokenType::Symbol(Symbol::Minus) => {
            let start_span = self.current_span();
@@ -642,8 +646,8 @@ impl<'a> Parser<'a> {
            }
        }
        TokenType::Symbol(Symbol::LParen) => *self
            .spanned(|p| p.priority())?
            .node
            .parenthesized_or_tuple()?
            .map(Box::new)
            .ok_or(Error::UnexpectedEOF)?,

        TokenType::Identifier(ref id) if SysCall::is_syscall(id) => {
@@ -774,7 +778,8 @@ impl<'a> Parser<'a> {
            | Expression::Ternary(_)
            | Expression::Negation(_)
            | Expression::MemberAccess(_)
            | Expression::MethodCall(_) => {}
            | Expression::MethodCall(_)
            | Expression::Tuple(_) => {}
            _ => {
                return Err(Error::InvalidSyntax(
                    self.current_span(),
@@ -1081,19 +1086,39 @@ impl<'a> Parser<'a> {
                end_col: right.span.end_col,
            };

            expressions.insert(
                i,
                Spanned {
            // Check if the left side is a tuple, and if so, create a TupleAssignment
            let node = if let Expression::Tuple(tuple_expr) = &left.node {
                // Extract variable names from the tuple, handling underscores
                let mut names = Vec::new();
                for item in &tuple_expr.node {
                    if let Expression::Variable(var) = &item.node {
                        names.push(var.clone());
                    } else {
                        return Err(Error::InvalidSyntax(
                            item.span,
                            String::from("Tuple assignment can only contain variable names"),
                        ));
                    }
                }

                Expression::TupleAssignment(Spanned {
                    span,
                    node: Expression::Assignment(Spanned {
                        span,
                        node: AssignmentExpression {
                            assignee: boxed!(left),
                            expression: boxed!(right),
                        },
                    }),
                },
            );
                    node: TupleAssignmentExpression {
                        names,
                        value: boxed!(right),
                    },
                })
            } else {
                Expression::Assignment(Spanned {
                    span,
                    node: AssignmentExpression {
                        assignee: boxed!(left),
                        expression: boxed!(right),
                    },
                })
            };

            expressions.insert(i, Spanned { span, node });
        }
    }
    operators.retain(|symbol| !matches!(symbol, Symbol::Assign));
@@ -1117,8 +1142,12 @@ impl<'a> Parser<'a> {
        expressions.pop().ok_or(Error::UnexpectedEOF)
    }

    fn priority(&mut self) -> Result<Option<Box<Spanned<Expression<'a>>>>, Error<'a>> {
    fn parenthesized_or_tuple(
        &mut self,
    ) -> Result<Option<Spanned<tree_node::Expression<'a>>>, Error<'a>> {
        let start_span = self.current_span();
        let current_token = self.current_token.as_ref().ok_or(Error::UnexpectedEOF)?;

        if !token_matches!(current_token, TokenType::Symbol(Symbol::LParen)) {
            return Err(Error::UnexpectedToken(
                self.current_span(),
@@ -1127,17 +1156,112 @@ impl<'a> Parser<'a> {
        }

        self.assign_next()?;
        let expression = self.expression()?.ok_or(Error::UnexpectedEOF)?;

        let current_token = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
        if !token_matches!(current_token, TokenType::Symbol(Symbol::RParen)) {
            return Err(Error::UnexpectedToken(
                Self::token_to_span(&current_token),
                current_token,
            ));
        // Handle empty tuple '()'
        if self_matches_peek!(self, TokenType::Symbol(Symbol::RParen)) {
            self.assign_next()?;
            let end_span = self.current_span();
            let span = Span {
                start_line: start_span.start_line,
                start_col: start_span.start_col,
                end_line: end_span.end_line,
                end_col: end_span.end_col,
            };
            return Ok(Some(Spanned {
                span,
                node: Expression::Tuple(Spanned { span, node: vec![] }),
            }));
        }

        Ok(Some(boxed!(expression)))
        let first_expression = self.expression()?.ok_or(Error::UnexpectedEOF)?;

        if self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
            // It is a tuple
            let mut items = vec![first_expression];
            while self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
                // Next token is a comma, we need to consume it and advance 1 more time.
                self.assign_next()?;
                self.assign_next()?;
                items.push(self.expression()?.ok_or(Error::UnexpectedEOF)?);
            }

            let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
            if !token_matches!(next, TokenType::Symbol(Symbol::RParen)) {
                return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
            }

            let end_span = Self::token_to_span(&next);
            let span = Span {
                start_line: start_span.start_line,
                start_col: start_span.start_col,
                end_line: end_span.end_line,
                end_col: end_span.end_col,
            };

            Ok(Some(Spanned {
                span,
                node: Expression::Tuple(Spanned { span, node: items }),
            }))
        } else {
            // It is just priority
            let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
            if !token_matches!(next, TokenType::Symbol(Symbol::RParen)) {
                return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
            }

            Ok(Some(Spanned {
                span: first_expression.span,
                node: Expression::Priority(boxed!(first_expression)),
            }))
        }
    }

    fn tuple_declaration(&mut self) -> Result<Expression<'a>, Error<'a>> {
        // 'let' is consumed before this call
        // expect '('
        let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
        if !token_matches!(next, TokenType::Symbol(Symbol::LParen)) {
            return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
        }

        let mut names = Vec::new();
        while !self_matches_peek!(self, TokenType::Symbol(Symbol::RParen)) {
            let token = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
            let span = Self::token_to_span(&token);
            if let TokenType::Identifier(id) = token.token_type {
                names.push(Spanned { span, node: id });
            } else {
                return Err(Error::UnexpectedToken(span, token));
            }

            if self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
                self.assign_next()?;
            }
        }
        self.assign_next()?; // consume ')'

        let assign = self.get_next()?.ok_or(Error::UnexpectedEOF)?;

        if !token_matches!(assign, TokenType::Symbol(Symbol::Assign)) {
            return Err(Error::UnexpectedToken(Self::token_to_span(&assign), assign));
        }

        self.assign_next()?; // Consume the `=`

        let value = self.expression()?.ok_or(Error::UnexpectedEOF)?;

        let semi = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
        if !token_matches!(semi, TokenType::Symbol(Symbol::Semicolon)) {
            return Err(Error::UnexpectedToken(Self::token_to_span(&semi), semi));
        }

        Ok(Expression::TupleDeclaration(Spanned {
            span: names.first().map(|n| n.span).unwrap_or(value.span),
            node: TupleDeclarationExpression {
                names,
                value: boxed!(value),
            },
        }))
    }

    fn invocation(&mut self) -> Result<InvocationExpression<'a>, Error<'a>> {
@@ -112,7 +112,7 @@ fn test_function_invocation() -> Result<()> {
#[test]
fn test_priority_expression() -> Result<()> {
    let input = r#"
        let x = (4);
        let x = (4 + 3);
    "#;

    let tokenizer = Tokenizer::from(input);
@@ -120,7 +120,7 @@ fn test_priority_expression() -> Result<()> {

    let expression = parser.parse()?.unwrap();

    assert_eq!("(let x = 4)", expression.to_string());
    assert_eq!("(let x = ((4 + 3)))", expression.to_string());

    Ok(())
}
@@ -137,7 +137,7 @@ fn test_binary_expression() -> Result<()> {
    assert_eq!("(((45 * 2) - (15 / 5)) + (5 ** 2))", expr.to_string());

    let expr = parser!("(5 - 2) * 10;").parse()?.unwrap();
    assert_eq!("((5 - 2) * 10)", expr.to_string());
    assert_eq!("(((5 - 2)) * 10)", expr.to_string());

    Ok(())
}
@@ -170,7 +170,7 @@ fn test_ternary_expression() -> Result<()> {
fn test_complex_binary_with_ternary() -> Result<()> {
    let expr = parser!("let i = (x ? 1 : 3) * 2;").parse()?.unwrap();

    assert_eq!("(let i = ((x ? 1 : 3) * 2))", expr.to_string());
    assert_eq!("(let i = (((x ? 1 : 3)) * 2))", expr.to_string());

    Ok(())
}
@@ -191,3 +191,99 @@ fn test_nested_ternary_right_associativity() -> Result<()> {
    assert_eq!("(let i = (a ? b : (c ? d : e)))", expr.to_string());
    Ok(())
}

#[test]
fn test_tuple_declaration() -> Result<()> {
    let expr = parser!("let (x, _) = (1, 2);").parse()?.unwrap();

    assert_eq!("(let (x, _) = (1, 2))", expr.to_string());

    Ok(())
}
#[test]
fn test_tuple_assignment() -> Result<()> {
    let expr = parser!("(x, y) = (1, 2);").parse()?.unwrap();

    assert_eq!("((x, y) = (1, 2))", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_assignment_with_underscore() -> Result<()> {
    let expr = parser!("(x, _) = (1, 2);").parse()?.unwrap();

    assert_eq!("((x, _) = (1, 2))", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_declaration_with_function_call() -> Result<()> {
    let expr = parser!("let (x, y) = doSomething();").parse()?.unwrap();

    assert_eq!("(let (x, y) = doSomething())", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_declaration_with_function_call_with_underscore() -> Result<()> {
    let expr = parser!("let (x, _) = doSomething();").parse()?.unwrap();

    assert_eq!("(let (x, _) = doSomething())", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_assignment_with_function_call() -> Result<()> {
    let expr = parser!("(x, y) = doSomething();").parse()?.unwrap();

    assert_eq!("((x, y) = doSomething())", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_assignment_with_function_call_with_underscore() -> Result<()> {
    let expr = parser!("(x, _) = doSomething();").parse()?.unwrap();

    assert_eq!("((x, _) = doSomething())", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_declaration_with_complex_expressions() -> Result<()> {
    let expr = parser!("let (x, y) = (1 + 1, doSomething());")
        .parse()?
        .unwrap();

    assert_eq!("(let (x, y) = ((1 + 1), doSomething()))", expr.to_string());

    Ok(())
}

#[test]
fn test_tuple_assignment_with_complex_expressions() -> Result<()> {
    let expr = parser!("(x, y) = (doSomething(), 123 / someValue.Setting);")
        .parse()?
        .unwrap();

    assert_eq!(
        "((x, y) = (doSomething(), (123 / someValue.Setting)))",
        expr.to_string()
    );

    Ok(())
}

#[test]
fn test_tuple_declaration_all_complex_expressions() -> Result<()> {
    let expr = parser!("let (x, y) = (a + b, c * d);").parse()?.unwrap();

    assert_eq!("(let (x, y) = ((a + b), (c * d)))", expr.to_string());

    Ok(())
}
@@ -245,6 +245,42 @@ impl<'a> std::fmt::Display for DeviceDeclarationExpression<'a> {
    }
}

#[derive(Debug, PartialEq, Eq)]
pub struct TupleDeclarationExpression<'a> {
    pub names: Vec<Spanned<Cow<'a, str>>>,
    pub value: Box<Spanned<Expression<'a>>>,
}

impl<'a> std::fmt::Display for TupleDeclarationExpression<'a> {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let names = self
            .names
            .iter()
            .map(|n| n.node.to_string())
            .collect::<Vec<_>>()
            .join(", ");
        write!(f, "(let ({}) = {})", names, self.value)
    }
}

#[derive(Debug, PartialEq, Eq)]
pub struct TupleAssignmentExpression<'a> {
    pub names: Vec<Spanned<Cow<'a, str>>>,
    pub value: Box<Spanned<Expression<'a>>>,
}

impl<'a> std::fmt::Display for TupleAssignmentExpression<'a> {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let names = self
            .names
            .iter()
            .map(|n| n.node.to_string())
            .collect::<Vec<_>>()
            .join(", ");
        write!(f, "(({}) = {})", names, self.value)
    }
}

#[derive(Debug, PartialEq, Eq)]
pub struct IfExpression<'a> {
    pub condition: Box<Spanned<Expression<'a>>>,
@@ -348,6 +384,9 @@ pub enum Expression<'a> {
    Return(Option<Box<Spanned<Expression<'a>>>>),
    Syscall(Spanned<SysCall<'a>>),
    Ternary(Spanned<TernaryExpression<'a>>),
    Tuple(Spanned<Vec<Spanned<Expression<'a>>>>),
    TupleAssignment(Spanned<TupleAssignmentExpression<'a>>),
    TupleDeclaration(Spanned<TupleDeclarationExpression<'a>>),
    Variable(Spanned<Cow<'a, str>>),
    While(Spanned<WhileExpression<'a>>),
}
@@ -384,8 +423,20 @@ impl<'a> std::fmt::Display for Expression<'a> {
            ),
            Expression::Syscall(e) => write!(f, "{}", e),
            Expression::Ternary(e) => write!(f, "{}", e),
            Expression::Tuple(e) => {
                let items = e
                    .node
                    .iter()
                    .map(|x| x.to_string())
                    .collect::<Vec<_>>()
                    .join(", ");
                write!(f, "({})", items)
            }
            Expression::TupleAssignment(e) => write!(f, "{}", e),
            Expression::TupleDeclaration(e) => write!(f, "{}", e),
            Expression::Variable(id) => write!(f, "{}", id),
            Expression::While(e) => write!(f, "{}", e),
        }
    }
}
@@ -115,7 +115,7 @@ macro_rules! keyword {
}

#[derive(Debug, PartialEq, Hash, Eq, Clone, Logos)]
#[logos(skip r"[ \t\f]+")]
#[logos(skip r"[ \r\t\f]+")]
#[logos(extras = Extras)]
#[logos(error(LexError, LexError::from_lexer))]
pub enum TokenType<'a> {
@@ -843,3 +843,20 @@ documented! {
    }
}

#[cfg(test)]
mod tests {
    use super::TokenType;
    use logos::Logos;

    #[test]
    fn test_windows_crlf_endings() -> anyhow::Result<()> {
        let src = "let i = 0;\r\n";

        let lexer = TokenType::lexer(src);

        let tokens = lexer.collect::<Vec<_>>();

        assert!(!tokens.iter().any(|res| res.is_err()));
        Ok(())
    }
}