15 Commits

Author SHA1 Message Date
fc13c465c0 Extract logic into reusable functions for better DRY 2025-12-30 11:21:44 -07:00
1ce3162fc0 Refactor Compiler struct to hold FunctionMetadata struct instead of flattening all that information directly onto the Compiler 2025-12-30 11:15:49 -07:00
3092e97d41 Minor DRY refactor. Added more tuple tests 2025-12-30 02:47:39 -07:00
8029fa82b0 complex tuple expressions supported 2025-12-30 02:38:32 -07:00
6d8a22459c wip 2025-12-30 02:31:21 -07:00
20f0f4b9a1 working tuple types 2025-12-30 00:58:02 -07:00
5a88befac9 tuple return types just about implemented 2025-12-30 00:32:55 -07:00
e94fc0f5de Functions returning tuples somewhat working, but they clobber the popped ra 2025-12-29 23:55:00 -07:00
b51800eb77 wip -- tuples compiling. need more work on function invocations 2025-12-29 23:17:18 -07:00
87951ab12f Support tuple assignment expressions and tuple assignments and declarations with function invocations 2025-12-29 22:33:16 -07:00
00b0d4df26 Create new tuple expression types 2025-12-29 22:17:19 -07:00
6ca53e8959 Merge pull request 'Added support for CRLF windows line endings' (#11) from 41-support-windows-crlf-line-endings into master
Reviewed-on: #11
2025-12-29 12:32:59 -07:00
8dfdad3f34 Added support for CRLF windows line endings
2025-12-29 12:29:01 -07:00
e272737ea2 Merge pull request 'Fixed const -> let bug' (#10) from declaration_const_as_let into master
Reviewed-on: #10
2025-12-29 02:44:50 -07:00
f679601818 Fixed const -> let bug
2025-12-29 02:31:23 -07:00
14 changed files with 2458 additions and 110 deletions

.github/copilot-instructions.md vendored Normal file

@@ -0,0 +1,230 @@
# Slang Language Compiler - AI Agent Instructions
## Project Overview
**Slang** is a high-level programming language that compiles to IC10 assembly for the game Stationeers. The compiler is a multi-stage Rust system with a C# BepInEx mod integration layer.
**Key Goal:** Reduce manual IC10 assembly writing by providing C-like syntax with automatic register allocation and device abstraction.
## Architecture Overview
### Compilation Pipeline
The compiler follows a strict 4-stage pipeline (in [rust_compiler/libs/compiler/src/v1.rs](rust_compiler/libs/compiler/src/v1.rs)):
1. **Tokenizer** (libs/tokenizer/src/lib.rs) - Lexical analysis using `logos` crate
- Converts source text into tokens
- Tracks line/span information for error reporting
- Supports temperature literals (c/f/k suffixes)
2. **Parser** (libs/parser/src/lib.rs) - AST construction
- Recursive descent parser producing `Expression` tree
- Validates syntax, handles device declarations, function definitions
- Output: `Expression` enum containing tree nodes
3. **Compiler (v1)** (libs/compiler/src/v1.rs) - Semantic analysis & code generation
- Variable scope management and register allocation via `VariableManager`
- Emits IL instructions to `il::Instructions`
- Error types use `lsp_types::Diagnostic` for editor integration
4. **Optimizer** (libs/optimizer/src/lib.rs) - Post-generation optimization
- Currently optimizes leaf functions
- Optional pass before final output
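To make the data flow concrete, here is a minimal sketch of how the four stages chain together. Only `Tokenizer::from` and `.parse()` appear in the test suite; the other constructors and method names below are assumptions, not the project's real API.
```rust
// Hypothetical glue code -- stage order matches the pipeline above, but the
// Compiler/optimizer constructors and signatures are assumed for illustration.
fn compile_to_ic10(source: &str) -> Result<String, String> {
    let tokens = tokenizer::Tokenizer::from(source);   // 1. lexical analysis
    let ast = parser::Parser::new(tokens)
        .parse()
        .map_err(|e| e.to_string())?;                  // 2. AST construction
    let il = compiler::v1::Compiler::default()
        .compile(ast)
        .map_err(|e| e.to_string())?;                  // 3. scope/register handling + IL
    let il = optimizer::optimize(il);                  // 4. optional optimization pass
    Ok(il.to_string())                                 // rendered IC10 text
}
```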
### Cross-Language Integration
- **Rust Library** (`slang.dll`/`.so`): Core compiler logic via `safer-ffi` C FFI bindings
- **C# Mod** (`StationeersSlang.dll`): BepInEx plugin integrating with game UI
- **Generated Headers** (via `generate-headers` binary): Auto-generated C# bindings from Rust
### Key Types & Data Flow
- `Expression` tree (parser) → `v1::Compiler` processes → `il::Instructions` output
- `InstructionNode` wraps IC10 assembly with optional source span for debugging
- `VariableManager` tracks scopes and the const/device/let distinctions
- `Operand` enum represents register/literal/device-property values
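As an illustration only (the actual definition in the compiler crate will differ in variants and field types), `Operand` can be pictured as:
```rust
// Illustrative shape, not the real definition in libs/compiler.
enum Operand {
    Register(u8),                                          // an allocated IC10 register, e.g. r8
    Literal(f64),                                          // a numeric literal folded into the instruction
    DeviceProperty { device: String, property: String },   // e.g. d0.On
}
```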
## Critical Workflows
### Building
```bash
cd rust_compiler
# Build for both Linux and Windows targets
cargo build --release --target=x86_64-unknown-linux-gnu
cargo build --release --target=x86_64-pc-windows-gnu
# Generate C# FFI headers (requires "headers" feature)
cargo run --features headers --bin generate-headers
# Full build (run from root)
./build.sh
```
### Testing
```bash
cd rust_compiler
# Run all tests
cargo test --package compiler --lib
# Run specific test file
cargo test --package compiler --lib tuple_literals
# Run single test
cargo test --package compiler --lib -- test::tuple_literals::test::test_tuple_literal_size_mismatch --exact --nocapture
```
### Quick Compilation
```bash
cd rust_compiler
# Compile Slang code to IC10 using current compiler changes
echo 'let x = 5;' | cargo run --bin slang -
# Or from file
cargo run --bin slang -- input.slang -o output.ic10
# Optimize the output with -z flag
cargo run --bin slang -- input.slang -o output.ic10 -z
```
## Codebase Patterns
### Test Structure
Tests follow a macro pattern in [libs/compiler/src/test/mod.rs](rust_compiler/libs/compiler/src/test/mod.rs):
```rust
#[test]
fn test_name() -> Result<()> {
    let output = compile!("slang code here");
    assert_eq!(expected_ic10, output);
    Ok(())
}
```
- `compile!()` macro: full pipeline from source to IC10
- `compile!(result ...)` for error checking
- `compile!(debug ...)` for intermediate IR inspection
- Test files are organized by feature: `binary_expression.rs`, `syscall.rs`, `tuple_literals.rs`, etc.
### Error Handling
All stages return custom Error types implementing `From<lsp_types::Diagnostic>`:
- `tokenizer::Error` - Lexical errors
- `parser::Error<'a>` - Syntax errors
- `compiler::Error<'a>` - Semantic errors (unknown identifier, type mismatch)
- Device assignment prevention: `DeviceAssignment` error if reassigning device const
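A hedged sketch of the general shape of these error types (only `DeviceAssignment` and `TupleSizeMismatch` are named above; the other variant and field names here are invented for illustration):
```rust
use thiserror::Error;

// Sketch only: the real enums live in each crate and carry Span information
// that later feeds the lsp_types::Diagnostic conversion.
#[derive(Debug, Error)]
enum CompileError {
    #[error("unknown identifier `{0}`")]
    UnknownIdentifier(String),
    #[error("cannot reassign device constant `{0}`")]
    DeviceAssignment(String),
    #[error("tuple size mismatch: expected {expected} values, found {found}")]
    TupleSizeMismatch { expected: usize, found: usize },
}
```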
### Variable Scope Management
[variable_manager.rs](rust_compiler/libs/compiler/src/variable_manager.rs) handles:
- Tracking const vs mutable (let) distinction
- Device declarations as special scope items
- Function-local scopes with parameter handling
- Register allocation via `VariableLocation`
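Conceptually (a sketch with assumed names, not the real `VariableManager`/`VariableLocation` definitions), the manager is a stack of scopes plus a pool of free registers:
```rust
use std::collections::HashMap;

// Conceptual sketch; field shapes and names are assumptions.
enum Location {
    Register(u8),      // allocated IC10 register
    ConstValue(f64),   // constant folded at compile time
    Device(String),    // device alias such as "d0"
}

struct Binding {
    location: Location,
    is_const: bool,
}

struct ScopeStack {
    scopes: Vec<HashMap<String, Binding>>, // innermost scope last
    free_registers: Vec<u8>,               // registers still available
}

impl ScopeStack {
    fn lookup(&self, name: &str) -> Option<&Binding> {
        // Search from the innermost scope outward, mirroring lexical scoping.
        self.scopes.iter().rev().find_map(|scope| scope.get(name))
    }
}
```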
### LSP Integration
Error types implement conversion to `lsp_types::Diagnostic` for IDE feedback:
```rust
impl<'a> From<Error<'a>> for lsp_types::Diagnostic { ... }
```
This enables real-time error reporting in the Stationeers IC10 Editor mod.
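Filling in the sketch with `lsp_types` helpers (the mapping from an error's span to a range is assumed; the real impl converts each error variant):
```rust
use lsp_types::{Diagnostic, Position, Range};

// Assumed example of the conversion shape; not the project's actual impl.
fn to_diagnostic(message: String, line: u32, start_col: u32, end_col: u32) -> Diagnostic {
    let range = Range::new(
        Position::new(line, start_col),
        Position::new(line, end_col),
    );
    Diagnostic::new_simple(range, message)
}
```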
## Project-Specific Conventions
### Tuple Destructuring
The compiler supports tuple returns and multi-assignment:
```rust
let (x, y) = func(); // TupleDeclarationExpression
(x, y) = another_func(); // TupleAssignmentExpression
```
Compiler validates size matching with `TupleSizeMismatch` error.
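For example, a declaration whose name count does not match the value count is rejected (hypothetical snippet):
```rust
let (x, y) = (1, 2, 3);   // compile error: TupleSizeMismatch
```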
### Device Property Access
Devices are first-class with property access:
```rust
device ac = "d0";
ac.On = true;
ac.Temperature > 20c;
```
Parsed as `MemberAccessExpression`, compiled to device I/O syscalls.
### Temperature Literals
Unique language feature - automatic unit conversion at compile time:
```rust
20c → 293.15k   // Celsius to Kelvin
68f → 293.15k   // Fahrenheit to Kelvin
```
Tokenizer produces `Literal::Number(Number(decimal, Some(Unit::Celsius)))`.
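The conversion itself is ordinary arithmetic performed at compile time; an illustrative version (the compiler uses `rust_decimal` rather than `f64` for exactness):
```rust
// Illustration of the unit math only; not the compiler's actual code.
fn to_kelvin(value: f64, unit: char) -> f64 {
    match unit {
        'c' => value + 273.15,                        // 20c -> 293.15
        'f' => (value - 32.0) * 5.0 / 9.0 + 273.15,   // 68f -> 293.15
        _ => value,                                   // 'k' is already Kelvin
    }
}
```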
### Constants are Immutable
Once declared with `const`, reassignment is a compile error. Device assignment prevention is critical (prevents game logic bugs).
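A short example of the rule (and of the 0.4.6 fix that allows reading a `const` into a `let`):
```rust
const MAX = 100;
let max = MAX;   // fine: a const value can initialize a let binding (0.4.6 fix)
MAX = 200;       // compile error: constants cannot be reassigned
```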
## Integration Points
### C# FFI (`csharp_mod/FfiGlue.cs`)
- Calls Rust compiler via marshaled FFI
- Passes source code, receives IC10 output
- Marshals errors as `Diagnostic` objects
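The call shape is roughly "pass a source string in, get IC10 text (or diagnostics) back". A generic C-ABI sketch of that shape (the project actually generates its surface with `safer-ffi`, so the real export names and types differ):
```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

// Generic sketch of the FFI call shape; not the project's real safer-ffi exports.
#[no_mangle]
pub extern "C" fn slang_compile(source: *const c_char) -> *mut c_char {
    let src = unsafe { CStr::from_ptr(source) }.to_string_lossy();
    let ic10 = format!("# would compile {} bytes of Slang here", src.len());
    // The caller (the C# side) is responsible for freeing the returned string.
    CString::new(ic10).unwrap().into_raw()
}
```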
### BepInEx Plugin Lifecycle
[csharp_mod/Plugin.cs](csharp_mod/Plugin.cs):
- Harmony patches for IC10 Editor integration
- Cleanup code for live-reload support (mod destruction)
- Logger integration for debug output
### CI/Build Target Matrix
- Linux: `x86_64-unknown-linux-gnu`
- Windows: `x86_64-pc-windows-gnu` (cross-compile from Linux)
- Both produce dynamic libraries + CLI binary
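Cross-compiling the Windows target from Linux typically needs the rustup target plus a MinGW linker installed (exact package names depend on the distribution):
```bash
rustup target add x86_64-pc-windows-gnu
# Debian/Ubuntu example; other distros package MinGW differently
sudo apt install mingw-w64
```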
## Debugging Tips
1. **Print source spans:** `Span` type tracks line/column for error reporting
2. **IL inspection:** Use `compile!(debug source)` to view intermediate instructions
3. **Register allocation:** `VariableManager` logs scope changes; check for conflicts
4. **Syscall validation:** [parser/src/sys_call.rs](rust_compiler/libs/parser/src/sys_call.rs) lists all valid syscalls
5. **Tokenizer issues:** Check [tokenizer/src/token.rs](rust_compiler/libs/tokenizer/src/token.rs) for supported keywords/symbols
## Key Files for Common Tasks
| Task | File |
| -------------------- | ----------------------------------------------------------------------------------------------------------------------------------------- |
| Add language feature | [libs/parser/src/lib.rs](rust_compiler/libs/parser/src/lib.rs) + test in [libs/compiler/src/test/](rust_compiler/libs/compiler/src/test/) |
| Fix codegen bug | [libs/compiler/src/v1.rs](rust_compiler/libs/compiler/src/v1.rs) (~3500 lines) |
| Add syscall | [libs/parser/src/sys_call.rs](rust_compiler/libs/parser/src/sys_call.rs) |
| Optimize output | [libs/optimizer/src/lib.rs](rust_compiler/libs/optimizer/src/lib.rs) |
| Mod integration | [csharp_mod/](csharp_mod/) |
| Language docs | [docs/language-reference.md](docs/language-reference.md) |
## Dependencies to Know
- `logos` - Tokenizer with derive macros
- `rust_decimal` - Precise decimal arithmetic for temperature conversion
- `safer-ffi` - Safe C FFI between Rust and C#
- `lsp-types` - Standard for editor diagnostics
- `thiserror` - Error type derivation
- `clap` - CLI argument parsing
- `anyhow` - Error handling in main binary


@@ -1,5 +1,14 @@
 # Changelog
+
+[0.4.7]
+- Added support for Windows CRLF endings
+
+[0.4.6]
+- Fixed bug in compiler where you were unable to assign a `const` value to
+  a `let` variable
+
 [0.4.5]
 - Fixed issue where after clicking "Cancel" on the IC10 Editor, the side-by-side


@@ -2,7 +2,7 @@
 <ModMetadata xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
   <Name>Slang</Name>
   <Author>JoeDiertay</Author>
-  <Version>0.4.5</Version>
+  <Version>0.4.7</Version>
   <Description>
     [h1]Slang: High-Level Programming for Stationeers[/h1]


@@ -39,7 +39,7 @@ namespace Slang
     {
         public const string PluginGuid = "com.biddydev.slang";
         public const string PluginName = "Slang";
-        public const string PluginVersion = "0.4.5";
+        public const string PluginVersion = "0.4.7";
         private static Harmony? _harmony;


@@ -930,7 +930,7 @@ checksum = "e3a9fe34e3e7a50316060351f37187a3f546bce95496156754b601a5fa71b76e"
 [[package]]
 name = "slang"
-version = "0.4.5"
+version = "0.4.7"
 dependencies = [
  "anyhow",
  "clap",


@@ -1,6 +1,6 @@
 [package]
 name = "slang"
-version = "0.4.5"
+version = "0.4.7"
 edition = "2021"

 [workspace]


@@ -168,3 +168,28 @@ fn test_const_hash_expr() -> anyhow::Result<()> {
     );
     Ok(())
 }
+
+#[test]
+fn test_declaration_is_const() -> anyhow::Result<()> {
+    let compiled = compile! {
+        debug
+        r#"
+        const MAX = 100;
+        let max = MAX;
+        "#
+    };
+    assert_eq!(
+        compiled,
+        indoc! {
+            "
+            j main
+            main:
+            move r8 100
+            "
+        }
+    );
+    Ok(())
+}


@@ -47,3 +47,4 @@ mod logic_expression;
 mod loops;
 mod math_syscall;
 mod syscall;
+mod tuple_literals;

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -441,7 +441,13 @@ impl<'a> Parser<'a> {
                 ));
             }
-            TokenType::Keyword(Keyword::Let) => Some(self.spanned(|p| p.declaration())?),
+            TokenType::Keyword(Keyword::Let) => {
+                if self_matches_peek!(self, TokenType::Symbol(Symbol::LParen)) {
+                    Some(self.spanned(|p| p.tuple_declaration())?)
+                } else {
+                    Some(self.spanned(|p| p.declaration())?)
+                }
+            }
             TokenType::Keyword(Keyword::Device) => {
                 let spanned_dev = self.spanned(|p| p.device())?;
@@ -561,9 +567,7 @@
                 })
             }
-            TokenType::Symbol(Symbol::LParen) => {
-                self.spanned(|p| p.priority())?.node.map(|node| *node)
-            }
+            TokenType::Symbol(Symbol::LParen) => self.parenthesized_or_tuple()?,
             TokenType::Symbol(Symbol::Minus) => {
                 let start_span = self.current_span();
@@ -642,8 +646,8 @@
                 }
             }
             TokenType::Symbol(Symbol::LParen) => *self
-                .spanned(|p| p.priority())?
-                .node
+                .parenthesized_or_tuple()?
+                .map(Box::new)
                 .ok_or(Error::UnexpectedEOF)?,
             TokenType::Identifier(ref id) if SysCall::is_syscall(id) => {
@@ -774,7 +778,8 @@
                 | Expression::Ternary(_)
                 | Expression::Negation(_)
                 | Expression::MemberAccess(_)
-                | Expression::MethodCall(_) => {}
+                | Expression::MethodCall(_)
+                | Expression::Tuple(_) => {}
                 _ => {
                     return Err(Error::InvalidSyntax(
                         self.current_span(),
@@ -1081,19 +1086,39 @@
                     end_col: right.span.end_col,
                 };
-                expressions.insert(
-                    i,
-                    Spanned {
-                        span,
-                        node: Expression::Assignment(Spanned {
-                            span,
-                            node: AssignmentExpression {
-                                assignee: boxed!(left),
-                                expression: boxed!(right),
-                            },
-                        }),
-                    },
-                );
+                // Check if the left side is a tuple, and if so, create a TupleAssignment
+                let node = if let Expression::Tuple(tuple_expr) = &left.node {
+                    // Extract variable names from the tuple, handling underscores
+                    let mut names = Vec::new();
+                    for item in &tuple_expr.node {
+                        if let Expression::Variable(var) = &item.node {
+                            names.push(var.clone());
+                        } else {
+                            return Err(Error::InvalidSyntax(
+                                item.span,
+                                String::from("Tuple assignment can only contain variable names"),
+                            ));
+                        }
+                    }
+                    Expression::TupleAssignment(Spanned {
+                        span,
+                        node: TupleAssignmentExpression {
+                            names,
+                            value: boxed!(right),
+                        },
+                    })
+                } else {
+                    Expression::Assignment(Spanned {
+                        span,
+                        node: AssignmentExpression {
+                            assignee: boxed!(left),
+                            expression: boxed!(right),
+                        },
+                    })
+                };
+                expressions.insert(i, Spanned { span, node });
             }
         }
         operators.retain(|symbol| !matches!(symbol, Symbol::Assign));
@@ -1117,8 +1142,12 @@
         expressions.pop().ok_or(Error::UnexpectedEOF)
     }

-    fn priority(&mut self) -> Result<Option<Box<Spanned<Expression<'a>>>>, Error<'a>> {
+    fn parenthesized_or_tuple(
+        &mut self,
+    ) -> Result<Option<Spanned<tree_node::Expression<'a>>>, Error<'a>> {
+        let start_span = self.current_span();
         let current_token = self.current_token.as_ref().ok_or(Error::UnexpectedEOF)?;
         if !token_matches!(current_token, TokenType::Symbol(Symbol::LParen)) {
             return Err(Error::UnexpectedToken(
                 self.current_span(),
@@ -1127,17 +1156,112 @@
         }
         self.assign_next()?;
-        let expression = self.expression()?.ok_or(Error::UnexpectedEOF)?;
-        let current_token = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
-        if !token_matches!(current_token, TokenType::Symbol(Symbol::RParen)) {
-            return Err(Error::UnexpectedToken(
-                Self::token_to_span(&current_token),
-                current_token,
-            ));
-        }
-        Ok(Some(boxed!(expression)))
+        // Handle empty tuple '()'
+        if self_matches_peek!(self, TokenType::Symbol(Symbol::RParen)) {
+            self.assign_next()?;
+            let end_span = self.current_span();
+            let span = Span {
+                start_line: start_span.start_line,
+                start_col: start_span.start_col,
+                end_line: end_span.end_line,
+                end_col: end_span.end_col,
+            };
+            return Ok(Some(Spanned {
+                span,
+                node: Expression::Tuple(Spanned { span, node: vec![] }),
+            }));
+        }
+
+        let first_expression = self.expression()?.ok_or(Error::UnexpectedEOF)?;
+        if self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
+            // It is a tuple
+            let mut items = vec![first_expression];
+            while self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
+                // Next toekn is a comma, we need to consume it and advance 1 more time.
+                self.assign_next()?;
+                self.assign_next()?;
+                items.push(self.expression()?.ok_or(Error::UnexpectedEOF)?);
+            }
+            let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+            if !token_matches!(next, TokenType::Symbol(Symbol::RParen)) {
+                return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
+            }
+            let end_span = Self::token_to_span(&next);
+            let span = Span {
+                start_line: start_span.start_line,
+                start_col: start_span.start_col,
+                end_line: end_span.end_line,
+                end_col: end_span.end_col,
+            };
+            Ok(Some(Spanned {
+                span,
+                node: Expression::Tuple(Spanned { span, node: items }),
+            }))
+        } else {
+            // It is just priority
+            let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+            if !token_matches!(next, TokenType::Symbol(Symbol::RParen)) {
+                return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
+            }
+            Ok(Some(Spanned {
+                span: first_expression.span,
+                node: Expression::Priority(boxed!(first_expression)),
+            }))
+        }
+    }
+
+    fn tuple_declaration(&mut self) -> Result<Expression<'a>, Error<'a>> {
+        // 'let' is consumed before this call
+        // expect '('
+        let next = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+        if !token_matches!(next, TokenType::Symbol(Symbol::LParen)) {
+            return Err(Error::UnexpectedToken(Self::token_to_span(&next), next));
+        }
+
+        let mut names = Vec::new();
+        while !self_matches_peek!(self, TokenType::Symbol(Symbol::RParen)) {
+            let token = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+            let span = Self::token_to_span(&token);
+            if let TokenType::Identifier(id) = token.token_type {
+                names.push(Spanned { span, node: id });
+            } else {
+                return Err(Error::UnexpectedToken(span, token));
+            }
+            if self_matches_peek!(self, TokenType::Symbol(Symbol::Comma)) {
+                self.assign_next()?;
+            }
+        }
+        self.assign_next()?; // consume ')'
+
+        let assign = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+        if !token_matches!(assign, TokenType::Symbol(Symbol::Assign)) {
+            return Err(Error::UnexpectedToken(Self::token_to_span(&assign), assign));
+        }
+        self.assign_next()?; // Consume the `=`
+
+        let value = self.expression()?.ok_or(Error::UnexpectedEOF)?;
+        let semi = self.get_next()?.ok_or(Error::UnexpectedEOF)?;
+        if !token_matches!(semi, TokenType::Symbol(Symbol::Semicolon)) {
+            return Err(Error::UnexpectedToken(Self::token_to_span(&semi), semi));
+        }
+
+        Ok(Expression::TupleDeclaration(Spanned {
+            span: names.first().map(|n| n.span).unwrap_or(value.span),
+            node: TupleDeclarationExpression {
+                names,
+                value: boxed!(value),
+            },
+        }))
     }

     fn invocation(&mut self) -> Result<InvocationExpression<'a>, Error<'a>> {


@@ -112,7 +112,7 @@ fn test_function_invocation() -> Result<()> {
 #[test]
 fn test_priority_expression() -> Result<()> {
     let input = r#"
-        let x = (4);
+        let x = (4 + 3);
     "#;
     let tokenizer = Tokenizer::from(input);
@@ -120,7 +120,7 @@ fn test_priority_expression() -> Result<()> {
     let expression = parser.parse()?.unwrap();
-    assert_eq!("(let x = 4)", expression.to_string());
+    assert_eq!("(let x = ((4 + 3)))", expression.to_string());
     Ok(())
 }
@@ -137,7 +137,7 @@ fn test_binary_expression() -> Result<()> {
     assert_eq!("(((45 * 2) - (15 / 5)) + (5 ** 2))", expr.to_string());
     let expr = parser!("(5 - 2) * 10;").parse()?.unwrap();
-    assert_eq!("((5 - 2) * 10)", expr.to_string());
+    assert_eq!("(((5 - 2)) * 10)", expr.to_string());
     Ok(())
 }
@@ -170,7 +170,7 @@ fn test_ternary_expression() -> Result<()> {
 fn test_complex_binary_with_ternary() -> Result<()> {
     let expr = parser!("let i = (x ? 1 : 3) * 2;").parse()?.unwrap();
-    assert_eq!("(let i = ((x ? 1 : 3) * 2))", expr.to_string());
+    assert_eq!("(let i = (((x ? 1 : 3)) * 2))", expr.to_string());
     Ok(())
 }
@@ -191,3 +191,99 @@ fn test_nested_ternary_right_associativity() -> Result<()> {
     assert_eq!("(let i = (a ? b : (c ? d : e)))", expr.to_string());
     Ok(())
 }
+
+#[test]
+fn test_tuple_declaration() -> Result<()> {
+    let expr = parser!("let (x, _) = (1, 2);").parse()?.unwrap();
+    assert_eq!("(let (x, _) = (1, 2))", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_assignment() -> Result<()> {
+    let expr = parser!("(x, y) = (1, 2);").parse()?.unwrap();
+    assert_eq!("((x, y) = (1, 2))", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_assignment_with_underscore() -> Result<()> {
+    let expr = parser!("(x, _) = (1, 2);").parse()?.unwrap();
+    assert_eq!("((x, _) = (1, 2))", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_declaration_with_function_call() -> Result<()> {
+    let expr = parser!("let (x, y) = doSomething();").parse()?.unwrap();
+    assert_eq!("(let (x, y) = doSomething())", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_declaration_with_function_call_with_underscore() -> Result<()> {
+    let expr = parser!("let (x, _) = doSomething();").parse()?.unwrap();
+    assert_eq!("(let (x, _) = doSomething())", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_assignment_with_function_call() -> Result<()> {
+    let expr = parser!("(x, y) = doSomething();").parse()?.unwrap();
+    assert_eq!("((x, y) = doSomething())", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_assignment_with_function_call_with_underscore() -> Result<()> {
+    let expr = parser!("(x, _) = doSomething();").parse()?.unwrap();
+    assert_eq!("((x, _) = doSomething())", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_declaration_with_complex_expressions() -> Result<()> {
+    let expr = parser!("let (x, y) = (1 + 1, doSomething());")
+        .parse()?
+        .unwrap();
+    assert_eq!("(let (x, y) = ((1 + 1), doSomething()))", expr.to_string());
+    Ok(())
+}
+
+#[test]
+fn test_tuple_assignment_with_complex_expressions() -> Result<()> {
+    let expr = parser!("(x, y) = (doSomething(), 123 / someValue.Setting);")
+        .parse()?
+        .unwrap();
+    assert_eq!(
+        "((x, y) = (doSomething(), (123 / someValue.Setting)))",
+        expr.to_string()
+    );
+    Ok(())
+}
+
+#[test]
+fn test_tuple_declaration_all_complex_expressions() -> Result<()> {
+    let expr = parser!("let (x, y) = (a + b, c * d);").parse()?.unwrap();
+    assert_eq!("(let (x, y) = ((a + b), (c * d)))", expr.to_string());
+    Ok(())
+}


@@ -245,6 +245,42 @@ impl<'a> std::fmt::Display for DeviceDeclarationExpression<'a> {
     }
 }

+#[derive(Debug, PartialEq, Eq)]
+pub struct TupleDeclarationExpression<'a> {
+    pub names: Vec<Spanned<Cow<'a, str>>>,
+    pub value: Box<Spanned<Expression<'a>>>,
+}
+
+impl<'a> std::fmt::Display for TupleDeclarationExpression<'a> {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        let names = self
+            .names
+            .iter()
+            .map(|n| n.node.to_string())
+            .collect::<Vec<_>>()
+            .join(", ");
+        write!(f, "(let ({}) = {})", names, self.value)
+    }
+}
+
+#[derive(Debug, PartialEq, Eq)]
+pub struct TupleAssignmentExpression<'a> {
+    pub names: Vec<Spanned<Cow<'a, str>>>,
+    pub value: Box<Spanned<Expression<'a>>>,
+}
+
+impl<'a> std::fmt::Display for TupleAssignmentExpression<'a> {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        let names = self
+            .names
+            .iter()
+            .map(|n| n.node.to_string())
+            .collect::<Vec<_>>()
+            .join(", ");
+        write!(f, "(({}) = {})", names, self.value)
+    }
+}
+
 #[derive(Debug, PartialEq, Eq)]
 pub struct IfExpression<'a> {
     pub condition: Box<Spanned<Expression<'a>>>,
@@ -348,6 +384,9 @@ pub enum Expression<'a> {
     Return(Option<Box<Spanned<Expression<'a>>>>),
     Syscall(Spanned<SysCall<'a>>),
     Ternary(Spanned<TernaryExpression<'a>>),
+    Tuple(Spanned<Vec<Spanned<Expression<'a>>>>),
+    TupleAssignment(Spanned<TupleAssignmentExpression<'a>>),
+    TupleDeclaration(Spanned<TupleDeclarationExpression<'a>>),
     Variable(Spanned<Cow<'a, str>>),
     While(Spanned<WhileExpression<'a>>),
 }
@@ -384,8 +423,20 @@ impl<'a> std::fmt::Display for Expression<'a> {
             ),
             Expression::Syscall(e) => write!(f, "{}", e),
             Expression::Ternary(e) => write!(f, "{}", e),
+            Expression::Tuple(e) => {
+                let items = e
+                    .node
+                    .iter()
+                    .map(|x| x.to_string())
+                    .collect::<Vec<_>>()
+                    .join(", ");
+                write!(f, "({})", items)
+            }
+            Expression::TupleAssignment(e) => write!(f, "{}", e),
+            Expression::TupleDeclaration(e) => write!(f, "{}", e),
             Expression::Variable(id) => write!(f, "{}", id),
             Expression::While(e) => write!(f, "{}", e),
         }
     }
 }


@@ -115,7 +115,7 @@ macro_rules! keyword {
 }

 #[derive(Debug, PartialEq, Hash, Eq, Clone, Logos)]
-#[logos(skip r"[ \t\f]+")]
+#[logos(skip r"[ \r\t\f]+")]
 #[logos(extras = Extras)]
 #[logos(error(LexError, LexError::from_lexer))]
 pub enum TokenType<'a> {
@@ -843,3 +843,20 @@ documented! {
     }
 }
+
+#[cfg(test)]
+mod tests {
+    use super::TokenType;
+    use logos::Logos;
+
+    #[test]
+    fn test_windows_crlf_endings() -> anyhow::Result<()> {
+        let src = "let i = 0;\r\n";
+        let lexer = TokenType::lexer(src);
+        let tokens = lexer.collect::<Vec<_>>();
+
+        assert!(!tokens.iter().any(|res| res.is_err()));
+
+        Ok(())
+    }
+}
}