Compare commits

...

1634 Commits

Author SHA1 Message Date
greg b23b8ebdc7 Infrastructure for separate-object block handler
Requires one unstable feature
2019-12-15 11:56:49 -08:00
greg 8d3bbd7069 Fix annotation 2019-12-13 18:40:14 -08:00
greg afcb10bb72 Add random idea 2019-11-18 03:11:00 -08:00
greg 8de625e540 Got rid of symbol table from eval 2019-11-10 03:28:31 -08:00
greg a2bd9a3985 Remove symbol table from evaluator 2019-11-09 19:52:05 -08:00
greg e4a1a23f4d Moved sym lookup logic from eval to ast reducer 2019-11-09 19:49:02 -08:00
greg 2cd325ba12 Add plan of attack notes 2019-11-08 18:56:15 -08:00
greg 8218007f1c Commit this temporary fix 2019-11-08 18:53:38 -08:00
greg 040ab11873 Move reduction of values into separate method 2019-11-07 03:28:18 -08:00
greg b967fa1911 to_repl() doesn't need symbol table handle 2019-11-07 02:42:17 -08:00
greg 4c718ed977 Add TODO for symbol resolver 2019-11-06 18:41:37 -08:00
greg d20acf7166 Add tokenization for string literal prefixes 2019-11-05 02:22:11 -08:00
greg efc8497235 Rearchitect parser
To ensure that the prelude gets parsed with the same ItemId context as
normal REPL input
2019-10-25 01:49:15 -07:00
greg d824b8d6ef Idea for pattern matching 2019-10-24 03:09:17 -07:00
greg 4a1987b5a2 Test for modules in symbol table 2019-10-24 03:02:52 -07:00
greg c96644ddce Modules in symbol table 2019-10-24 02:13:07 -07:00
greg cc0ac83709 Refactor a lot of symbol table in prep for modules 2019-10-24 01:34:13 -07:00
greg d6019e6f9a Improve REPL help message
Show help strings for children of a directive
2019-10-23 21:41:25 -07:00
greg 3344f6827d Clear out some compiler warnings 2019-10-23 16:07:10 -07:00
greg b38c4b3298 SymbolTable passing, fix test for duplicate line 2019-10-23 14:47:18 -07:00
greg a2f30b6136 Refactored symbol_table test 2019-10-23 14:47:18 -07:00
greg 11a9a60a34 Rejiggering some things with the SourceMap pointer in Parser 2019-10-23 14:47:18 -07:00
greg 5bb1a245c4 Have Parser accept SourceMap reference 2019-10-23 14:47:18 -07:00
greg 1ffe61cf5f Partway there in terms of implementing source map lookup 2019-10-23 14:47:18 -07:00
greg 7495f30e16 Pass SourceMapHandle to SymbolTable 2019-10-23 14:47:18 -07:00
greg 82520aa28d Start to add source map insertions 2019-10-23 14:47:18 -07:00
greg 129d9ec673 A bunch of infrastructure for keeping track of AST node locations
Plus a failing test to illustrate the reason we care
2019-10-23 14:47:18 -07:00
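The source-map commits above revolve around one idea: every AST node gets a fresh ItemId at parse time, and a side table maps each id back to a source location so failures can point at real positions. A minimal sketch of that shape (names like `IdStore` and `Location` are hypothetical, not the project's actual types):

```rust
use std::collections::HashMap;

// A fresh, copyable id handed to each AST node as it is parsed.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
struct ItemId(u32);

#[derive(Debug, Clone, Copy, PartialEq)]
struct Location {
    line: u32,
    column: u32,
}

// Issues monotonically increasing ItemIds.
#[derive(Default)]
struct IdStore {
    next: u32,
}

impl IdStore {
    fn fresh(&mut self) -> ItemId {
        let id = ItemId(self.next);
        self.next += 1;
        id
    }
}

// Side table from node id to where in the source the node came from.
#[derive(Default)]
struct SourceMap {
    locations: HashMap<ItemId, Location>,
}

impl SourceMap {
    fn record(&mut self, id: ItemId, loc: Location) {
        self.locations.insert(id, loc);
    }
    fn lookup(&self, id: ItemId) -> Option<Location> {
        self.locations.get(&id).copied()
    }
}

fn main() {
    let mut ids = IdStore::default();
    let mut map = SourceMap::default();
    let id = ids.fresh();
    map.record(id, Location { line: 3, column: 7 });
    assert_eq!(map.lookup(id), Some(Location { line: 3, column: 7 }));
}
```

Keeping the location data outside the AST (keyed by id) is what lets later passes stay ignorant of source positions while error reporting can still recover them.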
greg 7825ef1eb9 Partial module work 2019-10-23 14:47:18 -07:00
greg f3ecdc61cb Remove old TODO 2019-10-23 02:22:10 -07:00
greg bf59e6cc63 Just import all of AST in parse tests 2019-10-22 03:15:41 -07:00
greg c560c29b2d Start to add module syntax 2019-10-22 03:15:14 -07:00
greg 4dcd9d0198 Some more parse trace improvements 2019-10-22 02:11:49 -07:00
greg 7ac63160c5 Remove extraneous debug print 2019-10-21 19:19:48 -07:00
greg 8656992945 Made parse trace output a bit nicer
Used ... instead of whitespace, removed extraneous "Production"
2019-10-21 19:18:47 -07:00
greg bb87a87848 Remove this TODO; default args are parsed 2019-10-21 10:53:17 -07:00
greg 2f467702e3 Use common scope resolver
So that if you import something at the repl, it stays imported
2019-10-21 04:19:26 -07:00
greg 5ac5425fac Use symbol table handle in resolver 2019-10-21 04:17:30 -07:00
greg 944916d6af Alias for symbol table handle type 2019-10-21 04:09:43 -07:00
greg 3906210db8 Fix prelude 2019-10-21 03:26:38 -07:00
greg f7357d4498 Add explicit panic for prelude errors
Maybe I want to handle this better in the future, but for now just panic
if the prelude is bad for some reason.
2019-10-21 03:25:45 -07:00
greg 1493d12a22 Reduce unused imports 2019-10-21 03:02:11 -07:00
greg 016d8fc900 Fixed tests
but I think importing is still not working properly
2019-10-21 02:56:21 -07:00
greg 86dc5eca02 Get rid of symbol segment kind
I don't think I need this after all
2019-10-18 18:24:57 -07:00
greg e75958c2a2 Currently broken import all commit 2019-10-18 09:55:26 -07:00
greg 7a56b6dfc0 Add some more methods around this 2019-10-18 09:54:56 -07:00
greg f9633ebe55 Add (broken) import all test 2019-10-18 09:53:44 -07:00
greg 854740a63f SymbolTrie 2019-10-17 03:15:39 -07:00
greg ca10481d7c Symbol table test - multiple values 2019-10-16 22:46:58 -07:00
greg 26fa4a29ec Put type names into symbol table 2019-10-16 20:22:40 -07:00
greg 97b59d7e70 Symbol table tests to separate file 2019-10-16 19:51:43 -07:00
greg 92ad4767c8 Remove some extraneous code 2019-10-16 10:39:48 -07:00
greg 7cabca2987 Got all tests passing with visitor scope-resolver 2019-10-16 02:46:32 -07:00
greg 98e53a6d0f Start porting ScopeResolution to use Visitor pattern 2019-10-15 19:06:07 -07:00
greg 77cc1f3824 ASTVisitor imports 2019-10-15 19:03:27 -07:00
greg 9e64a22328 Invocation argument in visitor 2019-10-15 18:58:51 -07:00
greg 5afdc16f2e Still more visitor work 2019-10-15 03:51:36 -07:00
greg f818e86f48 More visitor work 2019-10-15 00:53:21 -07:00
greg 5a01b12d9b Add note about pattern synonyms 2019-10-13 16:50:54 -07:00
greg 7c75f9b2a8 Extraneous comment 2019-10-11 18:45:52 -07:00
greg 2c34ab52c4 Make this test conform to new if syntax 2019-10-11 09:13:09 -07:00
greg 44d1f4692f Add back parser restrictions 2019-10-11 09:11:14 -07:00
greg 3cf3fce72d Fixed some code in scope resolver 2019-10-10 18:33:34 -07:00
greg ddea470ba8 Parsing tests pass, eval ones fail 2019-10-10 18:17:59 -07:00
greg 745afe981a Got compilation working again 2019-10-10 17:50:20 -07:00
greg a6c86d6447 Some work 2019-10-10 17:06:41 -07:00
greg 8d3639ab8e Fix everything if-refactor-related save reduced_ast 2019-10-10 14:38:48 -07:00
greg 3bca82a8c8 Still more refactoring work 2019-10-10 10:34:54 -07:00
greg 811c52c8d3 More if-expr refactoring work
Think I finished all the parsing stuff, just need to fix the types
everywhere else
2019-10-10 03:56:35 -07:00
greg 95e278d1b5 Chunk of work on if-expr AST
don't expect this to compile yet
2019-10-10 03:29:28 -07:00
greg 61b757313d Alter grammar of if-blocks 2019-10-10 02:34:56 -07:00
greg 24b48551dc More playing around with syntax for if 2019-10-09 02:32:41 -07:00
greg 2ed84de641 Introduce bare else clause in if exprs
With a non-passing test
2019-10-09 01:50:32 -07:00
greg 22efd39114 Change if-expr syntax
use else instead of ->
2019-10-08 18:23:16 -07:00
greg a48bb61eb3 Get rid of this test
need to rethink how if-expressions should work
2019-10-05 16:41:51 -07:00
greg 904d5c4431 Add "production" line to parse debug output
And also add a .next() in the parser that should've been there
2019-10-04 03:12:09 -07:00
greg 28056b1f89 Add production name in ParseError
for debugging
2019-10-04 03:12:00 -07:00
greg f9a59838b0 Get rid of .into()'s in parser 2019-10-01 02:19:12 -07:00
greg f02d7cb924 Add test for failing if expression 2019-09-28 17:42:22 -07:00
greg 489819a28e Multiline prompt 2019-09-28 17:31:37 -07:00
greg c427646e75 Change type alias 2019-09-28 02:42:18 -07:00
greg f06b5922de Visitor cleanup 2019-09-28 02:37:36 -07:00
greg 253b5d88f0 Finish cleaning up visitor logic 2019-09-28 01:58:22 -07:00
greg f654cd6b50 Start moving all walking logic out of visitor 2019-09-28 01:01:56 -07:00
greg 89649273d8 Still more visitor stuff 2019-09-27 22:34:00 -07:00
greg 9fa4e3797c More visitor stuff 2019-09-27 09:54:24 -07:00
greg c8804eeefb More visitor stuff 2019-09-26 03:26:37 -07:00
greg d80a0036b1 Enough of ASTVisitor to test something 2019-09-26 02:29:35 -07:00
greg 7533c69c49 Add note on visitors 2019-09-26 01:32:33 -07:00
greg 39bb175722 Initial WIP code 2019-09-26 01:31:39 -07:00
greg ae65455374 Add type alias for name scope data structure 2019-09-25 03:26:31 -07:00
greg 1fc028c9fc Make lookup_name_in_scope a method 2019-09-25 03:18:54 -07:00
greg 031ff9fe7e Add top-level variable to schala prelude 2019-09-25 02:54:56 -07:00
greg 5a9f3c1850 Sort symbols in debug 2019-09-25 02:43:07 -07:00
greg 58251d3f28 Use colored in symbol table debug 2019-09-25 02:28:24 -07:00
greg 2e42313991 add_new_symbol clarification 2019-09-25 02:18:36 -07:00
greg 355604d911 Cargo.lock should be version-controlled 2019-09-25 01:54:14 -07:00
greg 0b57561114 Use block in scope resolution 2019-09-25 01:45:02 -07:00
greg dbd81ca83d names 2019-09-24 19:24:07 -07:00
greg 6368d10d92 Rename Symbol.name -> Symbol.local_name
to make it clearer what this means
2019-09-24 18:56:53 -07:00
greg 9cd64d97a5 Isolate import handling code 2019-09-24 18:42:01 -07:00
greg 41cad61e34 Start work on name resolution 2019-09-24 03:28:59 -07:00
greg a054de56a2 Import statement syntax 2019-09-21 02:30:28 -07:00
greg 603ea89b98 Start adding import keyword 2019-09-20 18:19:29 -07:00
greg 06026604cc Fix test 2019-09-20 12:14:15 -07:00
greg 03f8abac6a Remove Meta type 2019-09-20 12:03:42 -07:00
greg fd3922d866 Get rid of Meta from tests 2019-09-20 10:10:57 -07:00
greg 71b3365de2 Remove all the rest of the instances of Meta from the AST
Still need to do tests
2019-09-20 02:21:39 -07:00
greg cf9ce74394 still more meta's 2019-09-20 02:05:57 -07:00
greg f5d1c89574 Kill more Meta's 2019-09-20 02:03:10 -07:00
greg 8d1e0ebdea Start to get rid of Meta 2019-09-20 01:57:48 -07:00
greg 69c215eac9 Get rid of Meta elsewhere 2019-09-20 01:44:20 -07:00
greg 8a34034819 Symbol table map for NamedStruct 2019-09-20 01:36:58 -07:00
greg 403b171c72 remove another meta-use 2019-09-20 01:08:00 -07:00
greg e5a09a6ee8 Get rid of Meta use in reduce_named_struct 2019-09-19 18:38:15 -07:00
greg e1a83b5de3 Start to use table lookups instead of Meta
For fqsn
2019-09-19 03:34:09 -07:00
greg 8b1dd561f2 Add get_fqsn_from_id opposite lookup method 2019-09-19 03:06:49 -07:00
greg 6ebe893acb Add id_to_fqsn table on symbol table 2019-09-19 02:58:52 -07:00
greg c9052e0a3b QualifiedName with id 2019-09-19 01:34:21 -07:00
greg 56e6eb44f9 Finish adding ItemId to Expression 2019-09-18 14:15:05 -07:00
greg 642f21d298 WIP commit - adding ItemId to Expression 2019-09-18 10:09:33 -07:00
greg c12cb99b24 ItemId on statement 2019-09-18 10:07:20 -07:00
greg 8dc8833eb3 Item Id store 2019-09-18 09:56:11 -07:00
greg b517bc2366 Add ItemId type to AST 2019-09-18 02:15:45 -07:00
greg 73519d5be5 Add derivative crate 2019-09-18 01:58:38 -07:00
greg 8b6de6961f ItemId type 2019-09-18 01:52:43 -07:00
greg 3eaeeb5509 Begin deprecating Meta in favor of an ItemId 2019-09-17 14:32:15 -07:00
greg b91c3c9da5 Change design of Statement AST node 2019-09-17 02:25:11 -07:00
greg 08da787aae Make AST a struct 2019-09-11 19:25:12 -07:00
greg d6f2fe6e02 Mark TODO done 2019-09-11 01:28:33 -07:00
greg a85d3c46bd Finish conversion of AST Reducer 2019-09-11 01:27:52 -07:00
greg 25f51a314d Start transitioning design of ast reduction
to method-on-struct based system
2019-09-10 09:27:33 -07:00
greg 6c3a4f907b Warning cleanup, TODOs 2019-09-10 03:40:41 -07:00
greg 22887678bd Remove lookup_by_name 2019-09-10 03:35:11 -07:00
greg 1ecf1e506c Update more notes 2019-09-10 03:33:28 -07:00
greg 72944ded1b Fixed all broken tests 2019-09-10 03:31:23 -07:00
greg b65779fb93 Add symbol_table to scope_resolution 2019-09-09 18:12:14 -07:00
greg 418d77770f Start adding symbol_table to scope resolution 2019-09-09 17:45:34 -07:00
greg 5572e0eebb Make some notes about what to do next 2019-09-09 10:17:46 -07:00
greg 65bc32b033 Fixed many of the broken tests 2019-09-09 01:04:46 -07:00
greg 29f4060a71 VarOrName fix in reduced ast 2019-09-08 17:01:07 -07:00
greg 09dbe5b736 Rename function 2019-09-08 04:27:04 -07:00
greg cfa65e5339 Wire up all the qualified names 2019-09-08 02:11:15 -07:00
greg 9a28ccfd85 Tests compile again 2019-09-07 19:08:50 -07:00
greg ea542192be Temp qualified names work 2019-09-06 17:19:41 -07:00
greg 79635f2f86 Add Meta annotation to QualifiedName 2019-09-06 10:03:50 -07:00
greg 2b5b1589b0 tests compile, 15 fail 2019-09-06 02:30:18 -07:00
greg 44c073320b Code builds, tests don't 2019-09-06 02:23:04 -07:00
greg c04e4356a1 Changing how patterns work
to support qualified names in patterns
2019-09-04 10:53:52 -07:00
greg 24e0ecbe73 partial work 2019-09-03 21:14:12 -07:00
greg fd66a9711d More work on fully-qualified names 2019-09-03 10:23:38 -07:00
greg a5c9aca4d7 Halfway done with fqsn lookup pass initial work 2019-09-03 03:20:17 -07:00
greg cefaeb1180 Make ScopeResolver struct 2019-09-03 02:59:19 -07:00
greg 724237545f Start work on scope resolver 2019-09-03 02:19:37 -07:00
greg 0f7f5cb416 Add new stage scope-resolution 2019-09-03 01:42:28 -07:00
greg b4da57f5c5 Make Meta<Expression> exist everywhere it needs to 2019-09-02 14:41:09 -07:00
greg 8b87945bee Wrap remaining Expressions in Meta 2019-09-02 14:13:53 -07:00
greg f96469178d Tests for qualified names 2019-09-01 01:07:00 -07:00
greg 34abb9b081 Start work on qualified names 2019-08-31 23:39:01 -07:00
greg 89d967aee4 FullyQualifiedSymbolName string representation 2019-08-30 22:55:59 -07:00
greg 0540df4024 Rename Val -> Sym 2019-08-30 19:10:16 -07:00
greg 61182a847f Rename lookup_by_path -> lookup_by_fqsn 2019-08-30 19:05:01 -07:00
greg f6dcd7f0b8 Use proper symbol_table lookup in eval 2019-08-30 19:03:52 -07:00
greg 16dc973aa6 Remove one use of symbol_table.lookup_by_name
Should aim to remove it entirely
2019-08-30 18:56:16 -07:00
greg 611e46938d Make symbol names better
Refactor of symbol table to make name lookups
more precise, necessary for struct member lookups
2019-08-30 18:41:47 -07:00
greg 3d6447abb4 Start work on symbol table lookup by type name 2019-08-21 10:10:57 -07:00
greg a74027bb1f Start adding object access 2019-08-20 00:20:07 -07:00
greg 583e87c19a Make apply_builtin compatible with Node 2019-08-19 21:49:46 -07:00
greg 12ed2f5c8e Pass symbol table reference to `to_repl` 2019-08-19 19:38:24 -07:00
greg 3caf9c763c Move eval tests 2019-08-16 10:39:21 -07:00
greg cd20afc3c7 Add note about Nodes 2019-08-15 08:07:52 -07:00
greg 063a13f7ff Move BinOp into ast subcrate
now builtins is only builtin semantics and has nothing to do with
operators
2019-08-15 06:28:40 -07:00
greg b0a1f3337c Clean up some operator code 2019-08-14 10:31:07 -07:00
greg 2e147e141e Update a bunch of schala-lang libraries 2019-08-14 10:18:35 -07:00
greg 44938aa4e6 Starting to refactor binop 2019-08-14 09:26:08 -07:00
greg 44ae10b7ae Add todo note 2019-08-14 07:54:39 -07:00
greg fa1544c71f Fix eval of negatives 2019-08-14 07:31:59 -07:00
greg fde169b623 Make operators live in a submodule of ast
Starting with PrefixOp, BinOp happens next
2019-08-14 07:25:45 -07:00
greg 6e92b03f81 Add types for (some) builtins 2019-08-13 04:28:21 -07:00
greg 0dd6b26e5a Move where PrefixOp lives 2019-08-13 04:17:17 -07:00
greg a3bb3ee514 Note a bug 2019-08-12 14:13:20 -07:00
greg 7ae41e717d Switch away from string builtins 2019-08-12 14:10:07 -07:00
greg 24089da788 Mapping names to builtins 2019-08-12 13:49:39 -07:00
greg bfb36b90e4 Start refactoring how builtins work
Create an enum of builtin operations to start with
2019-08-12 13:10:22 -07:00
greg e750247134 Successfully constructing a record
Not yet destructing it
2019-08-12 12:46:18 -07:00
greg a8efe40b57 Add some documentation for the reduced AST 2019-08-12 11:55:35 -07:00
greg dae619c6fa Add notes 2019-08-12 11:40:31 -07:00
greg c9bfa2b540 More named struct reduction work 2019-08-12 11:40:16 -07:00
greg e708c728d2 Add a type to the prelude to test records 2019-08-12 11:33:03 -07:00
greg b65d6e4c8e Symbol table notes to self 2019-08-12 11:27:16 -07:00
greg d9eca8ffb3 Handle records more properly in symbol table 2019-08-12 11:18:03 -07:00
greg a600d34712 More work on named struct
commented for now because I need to fix things in the symbol table
2019-08-12 10:59:04 -07:00
greg aae2ee53cd More parsing debugging changes 2019-08-12 09:51:36 -07:00
greg bf3dcc18d0 Fixed trace parsing debug output 2019-08-12 09:34:36 -07:00
greg baf499ee5a Fix symbol-table debugging 2019-08-05 03:37:37 -07:00
greg 3b19fc5aa9 Barest beginning of named struct implementation 2019-08-05 03:35:10 -07:00
greg 16bf166fa9 Fix bug with debug specifications 2019-08-05 03:31:10 -07:00
greg d832583ed9 Fix pluralization wording 2019-08-05 01:11:01 -07:00
greg 87ecc6f0cb Don't print out bare constructor
Instead convert to PrimObject
2019-08-05 01:07:48 -07:00
greg ee87695626 Simplify Alternative data structure
Most of the subfields are duplicated on Subpattern so just use that
directly
2019-07-30 01:33:09 -07:00
greg 37c77d93d7 Fix off-by-one error in show-immediate parsing 2019-07-28 11:26:13 -07:00
greg b62968379a Replace matches with functional constructs 2019-07-28 11:15:28 -07:00
greg aa705b4eee Break out actual lib.rs functionality
To minimize the amount of meaningful text in files with generic names
2019-07-11 19:21:23 -07:00
greg d67ccf5c7a Refactor Expression struct
to have explicit kind and type_anno fields, to make it clearer
that this represents source-code level annotation and not any kind
of type inference intermediate product
2019-07-10 18:52:25 -07:00
greg d9330bed26 Upgrade linefeed version 2019-07-09 01:49:07 -07:00
greg efe65edfe6 Put color into debug output 2019-07-09 01:32:38 -07:00
greg 7c9154de53 Refactor computation responses 2019-07-08 21:02:07 -07:00
greg 10e40669b5 Fix parsing debug options again 2019-06-22 12:33:28 -07:00
greg ca37e006b9 Fix some dyn's 2019-06-21 02:01:46 -07:00
greg 6d3f5f4b81 Got things compiling again
But this is a bad design for the DebugAsk
2019-06-19 10:41:20 -07:00
greg e3bd108e6c Debug stuff 2019-06-19 03:27:18 -07:00
greg 2ec3b21ebf Make output_wrapper more concise 2019-06-18 18:11:17 -07:00
greg b6e3469573 Default argument to function 2019-06-16 21:36:59 -07:00
greg 32fe7430a4 Equals should be a token type 2019-06-16 16:07:27 -07:00
greg c332747c3e Move parse test code into separate module 2019-06-16 15:03:34 -07:00
greg 33c2786ea1 More complicated FormalParam type 2019-06-16 14:56:52 -07:00
greg 30498d5c98 Add reference work 2019-06-16 00:22:18 -07:00
greg bc01a5ded8 Make reduced ast call handler be a separate method 2019-06-16 00:21:39 -07:00
greg 71386be80e Make tests pass by using multiple-k lookahead 2019-06-14 02:28:14 -07:00
greg ccdc02bbd0 Peek multiple tokens ahead 2019-06-14 01:30:53 -07:00
greg 3a207cf7a7 Make TokenHandler use an array and index
Instead of a peekable iterator, so I can implement LL(k) parsing
2019-06-14 00:44:54 -07:00
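The array-and-index approach in the commit above is what makes LL(k) parsing cheap: with a `Vec` plus a cursor, peeking k tokens ahead never consumes anything. A minimal sketch of the idea (this is an illustration, not the project's actual TokenHandler; the names are hypothetical):

```rust
#[derive(Debug, Clone, PartialEq)]
enum Token {
    Identifier(String),
    LParen,
    RParen,
    Eof,
}

// Tokens live in a Vec; a cursor index replaces the old peekable
// iterator, so the parser can look arbitrarily far ahead.
struct TokenHandler {
    tokens: Vec<Token>,
    idx: usize,
}

impl TokenHandler {
    fn new(tokens: Vec<Token>) -> Self {
        TokenHandler { tokens, idx: 0 }
    }

    // Peek n tokens past the cursor (0 = the next token), without consuming.
    fn peek_n(&self, n: usize) -> Token {
        self.tokens.get(self.idx + n).cloned().unwrap_or(Token::Eof)
    }

    // Consume and return the next token.
    fn next(&mut self) -> Token {
        let tok = self.peek_n(0);
        self.idx += 1;
        tok
    }
}

fn main() {
    let mut handler = TokenHandler::new(vec![
        Token::Identifier("f".into()),
        Token::LParen,
        Token::RParen,
    ]);
    // Two-token lookahead distinguishes `f(...)` (a call) before consuming.
    assert_eq!(handler.peek_n(1), Token::LParen);
    assert_eq!(handler.next(), Token::Identifier("f".into()));
}
```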
greg 66f71606ef Add back some debugging for parsing 2019-06-14 00:23:47 -07:00
greg 53ce31ea8c Start creating new TokenHandler infra
on top of old stuff
2019-06-14 07:21:32 +00:00
greg 4c688ce8b2 Lol grammar is no longer LL(1)
need to fix
2019-06-13 02:27:11 -07:00
greg 40579d80ce More work on args
not quite done
2019-06-12 03:28:46 -07:00
greg fa1257e2cd Starting work on more complicated call expressions
Probably won't build yet
2019-06-12 00:20:20 +00:00
greg e9fd20bfe5 A few more fixes to EBNF 2019-06-09 01:12:19 -07:00
greg dfbd951aaf Some fixes to the EBNF grammar 2019-06-09 01:08:32 -07:00
greg 6b47ecf2d7 First pass at putting EBNF grammar into rustdoc 2019-06-09 00:01:11 -07:00
greg a8b9f5046e Mark that I changed trait to interface 2019-06-07 18:57:55 +00:00
greg 83e05fe382 Remove unneeded directives field 2019-06-06 23:52:41 -07:00
greg 5271429715 Make help a bit nicer 2019-06-06 23:50:08 -07:00
greg f88f2e8550 More help cleanup 2019-06-06 22:36:44 -07:00
greg 7097775a4a :help command working 2019-06-06 22:21:50 -07:00
greg 32d082e119 Kill duplicate code 2019-06-05 02:54:13 -07:00
greg 376fa1d1d1 Tab completion for help 2019-06-05 02:48:45 -07:00
greg 6fb9b4c2d3 Actually the Top variant is doing something useful 2019-06-05 02:42:34 -07:00
greg f1d1042916 Add help text 2019-06-05 02:36:08 -07:00
greg 207f73d607 Moving help code around 2019-06-04 22:02:51 +00:00
greg 8dc0ad2348 Help function work 2019-06-03 22:18:53 +00:00
greg bb39c59db2 directive-related cleanup 2019-06-02 00:48:59 -07:00
greg 10bfeab7e4 Switch over everything to new directive paradigm 2019-06-02 00:43:55 -07:00
greg fe08e64860 More DirectiveAction conversion work 2019-06-02 00:27:12 -07:00
greg fd517351de Start converting over directives to new format 2019-06-02 00:19:26 -07:00
greg e12ff6f30b Start adding infrastructure to pay attention to actions 2019-06-01 22:17:20 -07:00
greg 176b286332 Add new ReplAction type 2019-06-01 18:41:55 -07:00
greg 3987360f8e Working on directives 2019-06-01 16:56:56 -07:00
greg 78d1e93e4b Put back rudimentary debug output 2019-05-28 03:41:49 -07:00
greg 856c0f95ce Wrap schala pass inputs in token struct 2019-05-28 03:07:35 -07:00
greg 3fa624bef4 Parameterize debugging steps 2019-05-27 15:06:50 -07:00
greg f27a65018d Fix prompt 2019-05-26 15:22:36 -07:00
greg 548a7b5f36 DebugRequests should be set 2019-05-26 04:16:40 -07:00
greg d80d0d0904 Add some notes 2019-05-26 00:23:54 -07:00
greg 6162bae1ac Per-stage computation times 2019-05-25 22:21:52 -07:00
greg fe7ba339b5 Per-stage timing output 2019-05-25 20:09:11 -07:00
greg 6a232907c5 Kill useless DebugRequest type 2019-05-25 19:31:41 -07:00
greg a8583f6bc4 Separate command_tree module 2019-05-22 03:32:00 -07:00
greg bdee4fe7c6 Fix commandtree debug processing 2019-05-22 03:19:12 -07:00
greg 5cdc2f3d07 Some total-time stuff 2019-05-21 02:52:26 -07:00
greg eb2adb5b79 Moving options around
Showing time
2019-05-21 02:46:07 -07:00
greg 2b407a4a83 Total duration Timing 2019-05-21 02:06:34 -07:00
greg 6da6f6312d Remove more unused code 2019-05-20 22:04:14 -07:00
greg ce2a65b044 Clean up some unused code 2019-05-20 16:10:50 -07:00
greg ffdae14a88 Hook up docs 2019-05-17 18:23:03 -07:00
greg 94ea7bcd09 Starting to use more advanced error output 2019-05-15 03:32:57 -07:00
greg 4ebf7fe879 Remove more unused variables 2019-05-14 11:15:12 -07:00
greg efbeff916a Allow this unused macro
for tests only
2019-05-14 10:51:32 -07:00
greg e9ea7811df Kill unused 2019-05-14 02:12:18 -07:00
greg 198f93c533 Make non-interactive code work again 2019-05-14 01:57:31 -07:00
greg 694c152fcd Kill webapp for now
I might add this back in later but for now I'd have to catch up to so
much rocket that it's easier to just leave it out
2019-05-14 01:51:41 -07:00
greg f8f3095f89 Remove dead code 2019-05-14 00:45:45 -07:00
greg c68c23ed68 Restore option-saving 2019-05-14 00:40:38 -07:00
greg 4f972f20a7 Remove some unused code 2019-05-13 19:55:18 -07:00
greg 9d2e5918af Add some more hindley-milner-related blogs to README 2019-05-08 16:24:13 -07:00
greg 14fc2a5d10 Update TODO 2019-04-29 23:57:59 -07:00
greg 2b8e2749a4 Get rid of unneeded mut's 2019-04-29 23:57:38 -07:00
greg 6c369b072f Debug immediate working for symbol table 2019-03-31 01:13:40 -07:00
greg 938c0401d1 Some various work 2019-03-27 02:20:43 -07:00
greg a829fb6cd8 Initial debug-handler function 2019-03-26 19:55:47 -07:00
greg 004b056232 Rearchitect CommandTree
- CommandTree::Term is now terminal only with respect to searching for
the debug-handling function to execute; it can have children that only
the tab-completion code cares about

- CommandTree::NonTerminal never has a function associated with it; if the
processing of a list of repl debug commands gets to a NonTerminal leaf,
there's just nothing to do
2019-03-26 19:43:11 -07:00
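The CommandTree design described in the commit message above can be sketched roughly as follows — a Term carries a handler (and may still have children, which only tab-completion would consult), while a NonTerminal never does, so dispatch at a NonTerminal leaf simply finds nothing. This is a hedged reconstruction from the message, not the project's actual code:

```rust
type Handler = fn(&[&str]) -> String;

enum CommandTree {
    // Terminal with respect to dispatch: has a handler; children (if any)
    // exist only for tab-completion purposes.
    Term { name: &'static str, children: Vec<CommandTree>, handler: Handler },
    // Never has a handler; purely a branch point.
    NonTerminal { name: &'static str, children: Vec<CommandTree> },
}

impl CommandTree {
    fn name(&self) -> &'static str {
        match self {
            CommandTree::Term { name, .. } | CommandTree::NonTerminal { name, .. } => *name,
        }
    }

    // Walk the command path; dispatch stops at the first Term found,
    // passing along only the arguments after that Term's own path.
    // Reaching a NonTerminal leaf yields None: nothing to do.
    fn dispatch(&self, path: &[&str]) -> Option<String> {
        match self {
            CommandTree::Term { handler, .. } => Some(handler(path)),
            CommandTree::NonTerminal { children, .. } => {
                let (head, rest) = path.split_first()?;
                children.iter().find(|c| c.name() == *head)?.dispatch(rest)
            }
        }
    }
}

fn main() {
    let tree = CommandTree::NonTerminal {
        name: "debug",
        children: vec![CommandTree::Term {
            name: "ast",
            children: vec![],
            handler: |args| format!("showing ast, extra args: {:?}", args),
        }],
    };
    assert!(tree.dispatch(&["ast"]).is_some());
    assert!(tree.dispatch(&["nope"]).is_none());
}
```

Note how truncating the path at each step also matches the later commit about passing a command function only the arguments after its own path.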
greg 8e9b410e02 Clean up TODO list 2019-03-20 01:15:35 -07:00
greg b82eebdeec Fix up readme 2019-03-20 01:09:38 -07:00
greg 153e7977d3 Make function more concise 2019-03-20 00:24:46 -07:00
greg 5b5368ce6f Delete a bunch of comments 2019-03-20 00:18:02 -07:00
greg 7a67890227 List-passes 2019-03-20 00:04:02 -07:00
greg 04253543e9 Move where help is computed 2019-03-19 21:12:10 -07:00
greg 3a98096b61 Add back debug passes command completion support 2019-03-19 19:37:29 -07:00
greg 9476e7039b Doc requests in type system 2019-03-19 19:26:05 -07:00
greg c767402865 Remove some no-longer-necessary indirection 2019-03-19 19:16:41 -07:00
greg 61972410ea Functionality to request/respond to meta items 2019-03-19 19:12:32 -07:00
greg d3f9430a18 Avoid unnecessary String 2019-03-19 19:01:04 -07:00
greg 81323cafd4 Change wording of default repl_request handler 2019-03-19 18:46:24 -07:00
greg 14c08bbcdb Get rid of EvalOptions
and associated types
2019-03-19 18:40:21 -07:00
greg 4319c802f5 Add nonterminal with function
For making tab completion work properly
2019-03-19 04:28:54 -07:00
greg 9e58e3d7de Remove some warnings 2019-03-16 18:45:40 -07:00
greg ac0050e5d1 Truncate command list passed to command function
Only pass it the arguments after its own path, if any exist
2019-03-16 18:33:31 -07:00
greg d06cf90fce Help message 2019-03-16 11:34:52 -07:00
greg 712da62d35 Make new CommandTree paradigm work 2019-03-16 11:10:39 -07:00
greg 57f3d39ea1 Start adding commandtree abstraction 2019-03-14 21:25:37 -07:00
greg 6d88447458 Add func to command 2019-03-14 04:08:32 -07:00
greg 0451676ba7 start adding functions to command data structure 2019-03-14 03:47:56 -07:00
greg 2929362046 Change NewRepl -> Repl 2019-03-14 03:42:39 -07:00
greg 375db28ebb Remove support for non-Schala languages
I may come back to these, but not until after Schala is much better
developed
2019-03-14 01:04:46 -07:00
greg 1622a6ce44 Grand culling
Deleting a bunch of old code related to the old way the interpreter
worked
2019-03-14 00:51:33 -07:00
greg 7e899246e9 More refactoring in main Schala driver 2019-03-14 00:15:13 -07:00
greg 8610bd7a87 Port Schala to new framework
Evaluating a Schala function in the REPL works again with no debug info
2019-03-13 22:43:44 -07:00
greg 70f715fbb2 Fix bugs 2019-03-13 20:07:41 -07:00
greg 7360e698dd More work 2019-03-13 10:10:42 -07:00
greg 5b35c2a036 Add new types for ProgrammingLanguageInterface 2019-03-13 00:13:39 -07:00
greg 8d8d7d8bf8 More misc changes including edition 2018 2019-03-12 02:39:25 -07:00
greg 981d4f88bf Changes 2019-03-12 01:14:41 -07:00
greg 42aa316a23 Fix custom attribute thing
Upon updating rust version, the unrestricted_attribute_token thing
broke, but I'm changing this anyway so whatever
2019-03-12 01:05:10 -07:00
greg 58b37e56ae Correct typo in TODO 2019-03-11 20:05:45 -07:00
greg 2bf777f37b Add this note to self 2019-03-11 19:36:10 -07:00
greg bdcae36b60 More cleaning up of how scopes are stored
on Symbol
2019-03-11 02:47:47 -07:00
greg dbcd2278a6 Renamings 2019-03-11 02:35:42 -07:00
greg 2490aaf3f4 Add types necessary for refactor of Symbol table 2019-03-11 01:36:11 -07:00
greg d4ad97b39a start preparing to get rid of symbol_table.lookup_by_name 2019-03-10 17:32:47 -07:00
greg 24213070a3 Delete useless comment 2019-03-10 17:29:02 -07:00
greg 051669b4cc Stuff pertaining to variant scoping 2019-03-10 17:24:58 -07:00
greg c64f53a050 Detect duplicate variable declarations correctly
Later I'll probably want to make it so that you can explicitly override
the value of a declared variable
2019-03-10 17:02:01 -07:00
greg 8f176543c7 Nested scopes in symbol table 2019-03-10 16:04:20 -07:00
greg 9716b5e55b Symbol table detects some duplicate symbols 2019-03-08 03:57:32 -08:00
greg 956353cd80 Move rc! macro to util
So it can be used anywhere
2019-03-08 01:15:19 -08:00
greg 98db60498a Add very basic symbol table test shim 2019-03-07 23:51:31 -08:00
greg 7694afc9e2 Add type for talking about symbol paths
to symbol table
2019-03-07 20:45:12 -08:00
greg 0bcd7e6f41 Add new_env method
This is basically the same as the one on the evaluator and makes use of
the ScopeStack - maybe need to generalize this more?
2019-02-27 02:15:19 -08:00
greg d515b1658a Some fixes 2019-02-24 16:24:45 -08:00
greg e501f4bd10 Various cleanup of comments, stringifying types 2019-02-23 09:59:41 -08:00
greg 5bac01cf20 More boilerplate for apply 2019-02-23 03:55:46 -08:00
greg 0e9b3229e9 Refactor Arrow; add general handle_apply 2019-02-23 03:33:56 -08:00
greg b709cfd51a Start adding call 2019-02-23 02:50:11 -08:00
greg e34295a6f7 Starting on lambda typechecking 2019-02-23 02:45:11 -08:00
greg 8dc34e4b49 Fresh type var 2019-02-23 01:27:32 -08:00
greg 2cc3367666 Unify var-var 2019-02-23 01:20:19 -08:00
greg 452f2ab188 Unify var-const 2019-02-23 01:18:15 -08:00
greg be175a2b75 Add more infrastructure for unify 2019-02-23 00:59:58 -08:00
greg 00a0de4431 Add ena crate for unification 2019-02-23 00:34:44 -08:00
greg f041cc17d2 Wrap all Expression nodes in Meta<Expression> 2019-02-21 23:35:18 -08:00
greg 95fe1941a1 Kill some unused items 2019-02-21 18:39:41 -08:00
greg b35262c444 Rename Node -> Meta 2019-02-21 01:49:15 -08:00
greg 9bb3a2be88 Add type data handle on Node
I think the way I want to handle this is a two-step process: first infer and
fill in variables, then unify in a separate step. Storing the data in
the AST is handy.
2019-02-21 01:46:27 -08:00
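The two-step process in the commit message above (introduce inference variables first, unify afterward) — together with the earlier "Unify var-var" / "Unify var-const" commits — can be illustrated with a tiny substitution-map unifier. This is a minimal sketch under my own type names, not the project's typechecker (which uses the ena crate), and it omits an occurs check:

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, PartialEq)]
enum Type {
    Const(&'static str), // e.g. "Int", "Bool"
    Var(u32),            // an inference variable introduced in step one
}

// Step two: unify two types, extending the substitution map.
// Var-var unification links one variable to the other; var-const binds it.
fn unify(a: &Type, b: &Type, subst: &mut HashMap<u32, Type>) -> Result<(), String> {
    match (a, b) {
        (Type::Const(x), Type::Const(y)) if x == y => Ok(()),
        (Type::Var(v), other) | (other, Type::Var(v)) => match subst.get(v).cloned() {
            // Already bound: unify against its binding.
            Some(bound) => unify(&bound, other, subst),
            // Unbound: bind it to the other side.
            None => {
                subst.insert(*v, other.clone());
                Ok(())
            }
        },
        _ => Err(format!("cannot unify {:?} with {:?}", a, b)),
    }
}

fn main() {
    let mut subst = HashMap::new();
    // var 0 unifies with Int; a later var-var step links var 1 to var 0.
    unify(&Type::Var(0), &Type::Const("Int"), &mut subst).unwrap();
    unify(&Type::Var(1), &Type::Var(0), &mut subst).unwrap();
    assert_eq!(subst.get(&0), Some(&Type::Const("Int")));
}
```

Storing inferred type data on the AST node (as the commit suggests) then just means keeping a `Var` id per node and resolving it through the substitution after unification finishes.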
greg 9fa0576547 Rename ExpressionType -> ExpressionKind 2019-02-21 01:26:51 -08:00
greg 6fba0cc5b4 Add variables 2019-02-21 01:17:34 -08:00
greg a6eb2b4020 Allow type annotations in let expressions 2019-02-20 22:44:45 -08:00
greg 03793e08d3 Typechecking operators 2019-02-20 03:27:46 -08:00
greg 2be55958f4 add Into<String> arg for error constructors 2019-02-20 02:06:58 -08:00
greg bcf48d0ecb First tests for typechecking 2019-02-20 01:33:45 -08:00
greg f0ed63ccf3 Basic if-statement checking 2019-02-19 23:00:41 -08:00
greg 6012bd1087 Variables 2019-02-19 21:41:07 -08:00
greg 866c9211f9 Add resources to README 2019-02-18 23:49:34 -08:00
greg df7e74c79d Types with arguments 2019-02-17 04:31:02 -08:00
greg abbd02eaef Use ty! macro 2019-02-17 04:25:38 -08:00
greg 993741e67f Get rid of typecheck_ 2019-02-17 04:08:49 -08:00
greg fbb7b995b8 Rename mk_type! to ty!
Doesn't seem to conflict with the same macro in the parser tests, so should be
ok
2019-02-17 03:38:15 -08:00
greg 9d4f086a04 Put mk_type! in typechecking module 2019-02-17 03:36:12 -08:00
greg e38ae1c3f1 Fix type to make it compile 2019-02-15 21:11:50 -08:00
greg d969d573fa Starting work on values 2019-02-12 21:14:13 -08:00
greg 35da1748f0 Some more type work 2019-02-10 12:21:12 -08:00
greg 5e1799268d Unification works with bad annotations 2019-02-10 07:32:12 -08:00
greg 42a801d346 Rename Order -> Ordering 2019-02-10 07:06:30 -08:00
greg a80e1bd706 Type name infra 2019-02-10 07:05:01 -08:00
greg afd9aa52c5 More infra around unify 2019-02-10 06:53:11 -08:00
greg 5a70784346 Adding unify stub 2019-02-10 06:48:25 -08:00
greg 0dff177e8f Add more literals kill errors 2019-02-10 05:33:55 -08:00
greg cf91f74912 Pass through type info to repl 2019-02-10 05:31:58 -08:00
greg 06e9452718 More type infrastructure
From here on out, I can start playing with concrete code that attempts
to actually typecheck limited syntactic constructs, and see what I end
up with.
2019-02-10 05:24:11 -08:00
greg 7d3ae36058 AST-walking infrastructure 2019-02-10 04:42:30 -08:00
greg e8f1f51639 Move (most of) the type definitions back to typechecking module
Still need to figure out the macro export thing
2019-02-10 04:30:37 -08:00
greg 170cf349d7 Starting typechecking work again 2019-02-09 00:25:12 -08:00
greg f3f1dcc0a4 Add some notes 2019-01-27 22:38:20 -08:00
greg c0111e30bc SymbolTable: Add Record type 2019-01-25 00:57:01 -08:00
greg c225e469ee Change Record variant representation 2019-01-24 20:47:20 -08:00
greg 1ce06bc0ef More symbol-table refactoring 2019-01-20 22:32:58 -08:00
greg 10c3a60515 Some symbol-table refactoring 2019-01-20 00:23:08 -08:00
greg ff73ce7b36 Get rid of LLVM wrapper
Not using this code right now
2019-01-10 20:57:13 -08:00
greg ede8a9076a Simplify some struct definitions 2019-01-10 20:57:13 -08:00
greg a63dcf91b0 Replace // with `quot` 2019-01-10 20:57:13 -08:00
greg 479a098e0f Make note to fix parsing bug 2019-01-09 17:37:51 -08:00
greg 1085b528fe Don't care about case-sensitivity 2019-01-08 02:38:10 -08:00
greg 9b3b5c5541 Token offsets 2019-01-08 02:11:19 -08:00
greg ab8e24a276 ParseError always has token now 2019-01-08 01:04:46 -08:00
greg 09e2d8579d Remove all ParseErrors that don't return the failed token 2019-01-08 01:00:40 -08:00
greg ee7861cbd0 Fix how impl blocks work
This gets rid of a token-less parseerror
2019-01-08 00:51:56 -08:00
greg b88def8a2e Make error msg better 2019-01-07 16:52:46 -08:00
greg 30676722a3 Transition to edition 2018 2019-01-07 13:00:37 -08:00
greg 801c90aaa7 Reorder parameters in pass functions 2019-01-07 02:43:31 -08:00
greg 02667b018c Kill most LLVM references
I'm probably going to refactor this so much, there's no point in keeping
this around
2019-01-07 02:38:15 -08:00
greg 1032c7c7a2 Add note 2019-01-07 01:59:12 -08:00
greg fa295aab28 Show location of error in parse error 2019-01-07 01:52:54 -08:00
greg a0f4abb9a3 Add SourceReference 2019-01-06 01:58:16 -08:00
greg 78b454fb32 Delete this line 2019-01-05 20:41:49 -08:00
greg 5491169d55 Refactor parsing structure
TokenHandler should contain all the methods for actually manipulating
tokens, Parser should only contain the recursive descent methods
2019-01-05 20:23:07 -08:00
greg 2b338fd3c9 Move .next() onto token_handler 2019-01-05 18:29:24 -08:00
greg 821f321261 More Node-wrapping of Expression 2019-01-05 18:11:51 -08:00
greg 846eeae04c More use of Token instead of TokenKind 2019-01-05 17:50:54 -08:00
greg 22f2750853 More use of Token in error messages 2019-01-05 17:40:05 -08:00
greg f2f8ac7ee8 Make ParseErrors have token 2019-01-05 17:32:49 -08:00
greg d0c5dce92b Use get_kind() 2019-01-05 17:28:35 -08:00
greg 8eda74c9a5 Starting to keep track of locations of errors in file 2019-01-05 17:18:10 -08:00
greg 2efac109ef Node for TupleLiteral 2019-01-05 16:06:55 -08:00
greg 215e2bbb0d PrefixOp have Node 2019-01-05 16:02:30 -08:00
greg 2590def3be More Node-wrapping 2019-01-05 15:54:03 -08:00
greg 879a7de83d Wrap Expression in Node 2019-01-05 15:47:44 -08:00
greg 4c2e0b8a21 Get rid of old code from old ideas 2019-01-05 15:35:51 -08:00
greg 282c42da3c Adding Node intermediate type to AST 2019-01-05 01:44:32 -08:00
greg 87e68988c8 Update some dependencies 2019-01-01 02:22:12 -08:00
greg 7ac97ca6e8 Kill ast_visitor 2018-11-26 01:57:44 -08:00
greg 26a8ff307f Add generic Node type 2018-11-20 03:21:10 -08:00
greg 6be208b51d Minor fix for parsing error messages 2018-11-20 03:03:08 -08:00
greg e00948cad9 Add ast_visitor mod 2018-11-17 02:09:16 -08:00
greg 0af6fed505 Clear up some code a bit 2018-11-17 01:10:23 -08:00
greg 1f527f7949 Rename TokenType -> TokenKind 2018-11-16 23:17:34 -08:00
greg 8680c4faf6 Just some notes for myself about how to redesign the AST type 2018-11-16 15:53:27 -08:00
greg b198984fc5 implement From for Expression-types 2018-11-16 14:06:04 -08:00
greg 58779f8470 Rename method, make sourcemap optional 2018-11-16 12:58:10 -08:00
greg a0fa50392c Fix compile error 2018-11-16 12:46:29 -08:00
greg d357876b16 WIP source map stuff 2018-11-16 04:12:07 -08:00
greg e42f0c644c Introduce source map 2018-11-16 03:56:55 -08:00
greg 2ec7bf3b9a Some initial work on passing token metadata to AST 2018-11-16 03:51:03 -08:00
greg 5147e1a3eb Handle underscores in identifiers 2018-11-15 16:19:53 -08:00
greg 955c073174 Got typechecker unused errors down to one 2018-11-13 02:39:02 -08:00
greg 7c46a29141 Start adding doc comments 2018-11-11 18:04:44 -08:00
greg 0adc761e72 Kill an unimplemented!() 2018-11-11 02:48:51 -08:00
greg b2039a7b67 Parameterize Type type over existential/universal 2018-11-10 16:33:42 -08:00
greg b4c4531e4d Rename for more concision 2018-11-10 14:11:29 -08:00
greg 2d36ad44d6 Converting over types
WIP
2018-11-09 02:50:29 -08:00
greg 21132a369c Parameterize Type 2018-11-09 02:05:59 -08:00
greg ff0294c56e Typechecking shouldn't fail yet 2018-11-09 02:02:08 -08:00
greg bc80c8f9ad Updated readme some 2018-11-09 01:51:25 -08:00
greg e39356c0e5 Even more type work 2018-11-09 00:21:34 -08:00
greg d44bb02d61 Even more types 2018-11-08 20:30:17 -08:00
greg 9056e9b0e1 More type work2 2018-11-08 02:29:54 -08:00
greg e9b90412ce More type work 2018-11-08 02:12:01 -08:00
greg 65c47c20fc Change name of monad in which type inference happens 2018-11-07 17:01:07 -08:00
greg fab3fb8ec2 More basic types + test 2018-11-07 16:39:32 -08:00
greg 0d5ccd21fe TConst 2018-11-07 15:39:40 -08:00
greg 69b7b9f528 Print out types to REPL 2018-11-07 13:44:28 -08:00
greg 9a09f40222 More typing work 2018-11-07 03:39:31 -08:00
greg 020819550b More typechecking infrastructure 2018-11-06 16:47:34 -08:00
greg 15f9dbe7a6 Typechecking infrastructure 2018-11-06 13:44:52 -08:00
greg 836bed1207 Added janky map to prelude 2018-11-06 03:02:32 -08:00
greg cee5b085d5 Simpler syntax for single param in lambdas
This kind of implies that I might want -> for function types after all,
instead of :
2018-11-06 02:58:57 -08:00
greg 837a55c718 Test for nested function call 2018-11-06 02:42:28 -08:00
greg f4f89b39b6 Handle nested function calls 2018-11-06 02:40:10 -08:00
greg c6b4ed7ee4 Basic lambdas 2018-11-06 01:19:16 -08:00
greg be425860af Starting on lambdas 2018-11-05 21:13:31 -08:00
greg 17e88b33f2 Eval test doesn't need to be a macro
Can be a fn
2018-11-05 21:07:06 -08:00
greg 47f7eb1ef6 Make prelude be separate file 2018-11-05 20:55:03 -08:00
greg 72d0cfe466 More macro test consolidation 2018-11-05 20:52:18 -08:00
greg cea2f63b44 Use macros to make types more concise 2018-11-05 20:12:10 -08:00
greg eec315dd58 Get rid of exprstatement! macro
For shorter exst! one
2018-11-05 19:58:55 -08:00
greg 1e9aa91c5d More concise test macros 2018-11-05 19:57:11 -08:00
greg 9813609ad7 Minor test refactoring 2018-11-05 19:17:53 -08:00
greg 5953d9d815 type annotations on lambdas 2018-11-05 19:10:34 -08:00
greg a74e09c761 Change lambda syntax 2018-11-05 18:51:01 -08:00
greg ad53d4394b Get rid of println 2018-11-05 14:52:51 -08:00
greg 151246e1c5 Test for pattern-matching 2018-11-05 14:11:49 -08:00
greg 77d2826918 Pattern-match on structured objects 2018-11-05 14:01:14 -08:00
greg 1bd48ed5db Fix problem with parsing commas
I should probably rethink how delimited block expressions like if-blocks
(and eventually for-blocks) work
2018-11-05 13:07:08 -08:00
greg c394b81746 More pattern-matching 2018-11-05 04:02:04 -08:00
greg ec29077247 More tuple-matching
Also discovered parser bug
2018-11-05 03:41:03 -08:00
greg 62043ac2d1 Starting on pattern-matching tuples
Lots of duplicated code here
2018-11-05 03:17:03 -08:00
greg bada386979 More work on subpattern matching 2018-11-03 12:53:09 -07:00
greg e71d404071 Finished this refactor 2018-11-02 19:54:04 -07:00
greg cab4702bd6 Refactoring matching - WIP
doesn't work yet
2018-11-01 02:43:47 -07:00
greg ec5a9d457e String patterns 2018-10-31 01:45:16 -07:00
greg bfbc1580aa Make tag optional 2018-10-30 23:36:55 -07:00
greg 2d6c9010b9 More work here 2018-10-30 18:53:34 -07:00
greg f4ff92302f Use subpattern abstraction 2018-10-30 18:46:06 -07:00
greg e88ed97b06 Add subpattern struct 2018-10-30 18:39:25 -07:00
greg b8df09e956 Change eval strategy to use conditional sigil 2018-10-29 01:50:43 -07:00
greg d7f0147a4f Add conditional target placeholder expr 2018-10-28 12:45:45 -07:00
greg f883512882 New abstraction layer in Schala-lang parser
Just for manipulating tokens
2018-10-21 16:33:21 -07:00
greg 37070a6b3e Move pass chain generation from macro to codegen 2018-10-20 18:00:05 -07:00
greg ffe7deb00a Starting to move pass_chain logic into codegen 2018-10-20 15:54:46 -07:00
greg d7baf065fb Changing what method to call to start parsing 2018-10-20 15:41:09 -07:00
greg 6b42f8b8de Change how parsing works 2018-10-20 14:27:00 -07:00
greg d9e67a6341 Delete this old grammar file 2018-10-20 11:43:12 -07:00
greg 7de536ade0 Install failure crate 2018-10-20 11:17:18 -07:00
greg f62b4c6906 Change format of error msg 2018-10-20 11:14:40 -07:00
greg 4679a9fc7f Remove compiler warnings 2018-10-20 00:55:37 -07:00
greg c25354b2c7 Get rid of typechecking code (for now)
I'm tired of seeing the errors. See branch last_commit_with_typechecking
2018-10-20 00:41:56 -07:00
greg 5f8b842bf2 Delete newline 2018-10-20 00:22:41 -07:00
greg fef66e345b Subpattern field 2018-10-19 17:43:22 -07:00
greg e57d33eae7 More work on more patterns
-need to convert guard into a possibly-empty vec
2018-10-19 17:27:06 -07:00
greg dca9ad06c3 Handle HalfExpr closer to correct 2018-10-19 11:02:10 -07:00
greg 354148c5ba rename codegen -> schala-lang-codegen 2018-10-19 09:57:35 -07:00
greg 6219a06d6f Converted all parser methods to use the annotation 2018-10-19 02:56:11 -07:00
greg 3b20b40eb7 Proc macro generated code for parsing seems to work 2018-10-19 02:45:35 -07:00
greg 4ecf63c54d Okay the proc_macro is actually doing something
At the cost of breaking code
2018-10-19 02:36:23 -07:00
greg 3d00667caf Add test for ignored pattern 2018-10-18 15:55:24 -07:00
greg 4b9c7e38dd Rename TypeName -> TypeIdentifier 2018-10-18 13:27:09 -07:00
greg 03317233c6 Don't need this import 2018-10-18 10:31:55 -07:00
greg dff204069f Starting to implement Ignored pattern 2018-10-18 01:54:36 -07:00
greg f2282f0101 case_match_expression split out into its own method 2018-10-18 01:49:42 -07:00
greg 40ccea8c05 Separate assign_expression method 2018-10-18 01:46:30 -07:00
greg cae6f2f768 Rename schala-codegen -> schala-repl-codegen 2018-10-18 01:09:29 -07:00
greg 1be6991f55 Making eval expression method a bit less complex
by splitting it into submethods
2018-10-17 20:46:16 -07:00
greg 1b60bd38ff Add codegen crate for schala-lang 2018-10-17 15:29:32 -07:00
greg 3b20b9e209 Put schala-lang crates into a subdirectory 2018-10-17 14:51:48 -07:00
greg de0e150536 Fix if-block parsing to handle newlines 2018-10-17 13:44:29 -07:00
greg baf51fb147 Boolean patterns 2018-10-17 12:43:09 -07:00
greg dc9e493fa1 Handle more patterns at reduce_ast level 2018-10-16 17:18:03 -07:00
greg d57a8045a9 Rename test helper 2018-10-16 04:11:18 -07:00
greg 50d5176b45 Fix bug add test 2018-10-16 04:10:28 -07:00
greg 501eaeee87 Implement numeric pattern matching 2018-10-16 03:54:08 -07:00
greg 8619c94217 Start handling numeric patterns
Still need to add eval support for this
2018-10-16 01:38:41 -07:00
greg fc7c86be1a nonterminal() constructor function 2018-10-15 21:46:27 -07:00
greg 77e0d639c2 Make repl mod structure more complex 2018-10-15 20:52:34 -07:00
greg 9927a6b1fd Implement custom interpreter directives - and a wtf?
See the comment about &mut self vs &self
2018-10-15 20:29:23 -07:00
greg e8dfc2be34 Refactor codegen some 2018-10-15 20:14:56 -07:00
greg abe2db25b2 full if matching working with basic patterns 2018-10-15 19:54:17 -07:00
greg a99a5f10a4 Indicate index explicitly in SymbolTable debug 2018-10-15 19:37:02 -07:00
greg c52bc65bb9 Change NonTerminal format 2018-10-02 01:37:43 -07:00
greg 819c422cee Start making CommandTree abstraction include implementation 2018-10-02 00:46:23 -07:00
greg 1c11fec803 Move hardcoded string file names into vars 2018-10-01 20:53:41 -07:00
greg a5c3c383dc Reduce some String clones 2018-10-01 20:46:58 -07:00
greg 76046b134a Make timing toggle-able 2018-10-01 02:05:07 -07:00
greg c24223f28e Got rid of some old code 2018-09-29 12:46:52 -07:00
greg 79e02b0999 Tightened code some in codegen 2018-09-29 03:23:48 -07:00
greg 80eb703f5e Finally got get_doc hookup in codegen macro working 2018-09-29 01:20:31 -07:00
greg 4fccff5e27 Working on improved proc_macro handling 2018-09-27 04:07:42 -07:00
greg e934d7bdc5 Update syn, quote libs 2018-09-26 01:41:58 -07:00
greg de199e785a Update versions 2018-09-26 01:33:19 -07:00
greg 81ac918a59 Add some type annos to make this easier 2018-09-25 13:44:06 -07:00
greg 5d4505241a get rid of completed todo 2018-09-22 00:26:38 -07:00
greg f67793308e Part of the work for a doc handler 2018-09-22 00:24:27 -07:00
greg 693766fa59 Proc macros are stable now 2018-09-21 19:46:31 -07:00
greg f9f29dd0dd Move repl stuff to separate file 2018-09-21 19:43:50 -07:00
greg 3c1823510f use get_cur_language() 2018-09-21 19:32:39 -07:00
greg 92078ef7d8 Add :doc interpreter directive 2018-09-21 19:25:58 -07:00
greg 1abbe2e448 Add guard to Alternative
The semantics are:
    - if tag is Some(_), assume the condition is a constructor,
    and compare tags
    - if guard is Some(_), evaluate true/false *after* having
    applied any bound variables

With this, I can technically get rid of bare conditionals now, since
they are the same as an Alternative with a None tag
2018-08-27 12:45:08 -07:00
greg 065bdd6bda Starting custom operators
Can now parse custom operators. Maybe I want to make it so that you
have to put explicit backticks if you define a custom operator,
currently you can just do: fn +() { .. }
2018-08-24 16:49:59 -07:00
greg e125e8b440 Add spaceship operator for getting an ord 2018-08-24 16:29:28 -07:00
greg 8565c7dfb3 Some work on reduced ast pattern 2018-08-24 16:04:18 -07:00
greg f885d5dfb6 Remove type alias 2018-08-22 23:22:08 -07:00
greg b85725125c Start using HalfExp 2018-08-22 16:41:31 -07:00
greg 2d961d6402 Fix other pattern parsing bugs 2018-08-21 20:02:10 -07:00
greg fa7b6ce96b Handle negatives in patterns correctly 2018-08-21 19:57:45 -07:00
greg 5c9180efc2 Some updates to Schala source files 2018-08-20 19:03:55 -07:00
greg 1d5e5aa735 Some type renaming in builtins
Builtins will remain entirely separate from the actual type
representation, whatever that ends up being
2018-08-19 22:00:20 -07:00
greg 2c298c7247 Add warning for undefined operator
In practice this will probably always not typecheck, but it's a valid
parse
2018-08-19 21:40:30 -07:00
greg f00fee0e37 Rename StateStack -> ScopeStack 2018-08-19 21:31:45 -07:00
greg 0d13b5e3bc Preliminary support for binops in if-discriminators
The BNF grammar is a bit more liberal than any successfully-compiled
schala program should be, in that it allows things like `if x < is
pattern`. It's okay if that parses successfully and then is an error at
typechecking.
2018-08-19 21:25:07 -07:00
greg 98f597f00a Implement comparison operators correctly 2018-08-19 21:11:43 -07:00
greg fb71881409 Refactor binop parsing 2018-08-19 20:33:50 -07:00
greg d1c3b4a81b Starting on halfexprs / binops 2018-08-19 18:44:54 -07:00
greg f9181b5786 use expr_or_block where appropriate 2018-08-19 15:58:31 -07:00
greg 0e914cf057 Error message for parsing guards 2018-08-19 15:12:34 -07:00
greg 04ea8c5ebc More unused code removal 2018-08-19 15:06:01 -07:00
greg 492ef4ae19 Clear up some unused code to reduce compile noise
And add some notes to the README
2018-08-19 15:03:41 -07:00
greg 75a7a4499d Added some more cases to the match handling 2018-08-19 10:53:43 -07:00
greg 99e6668c9a Add rusty-tags to .gitignore 2018-08-18 23:28:06 -07:00
greg 1d38a07cf8 Add timing debugging print 2018-08-16 01:47:21 -07:00
greg 0fa844bcf9 Print timing in debug info 2018-08-16 01:43:42 -07:00
greg 97bee58fbe More work with guards 2018-08-15 22:34:04 -07:00
greg 34c2b43371 More work on if matching 2018-08-15 18:32:44 -07:00
greg 88b617de52 More alternatives work 2018-08-15 11:44:55 -07:00
greg 482674b19a Start on expr_or_block
WIP doesn't work yet
2018-08-15 09:34:00 -07:00
greg a72b387ceb Remove some more dead code warnings 2018-08-14 23:19:27 -07:00
greg 864e932e9f Getting rid of more unused items 2018-08-14 23:09:11 -07:00
greg d7e73be44c Getting rid of some unused warnings 2018-08-14 23:07:00 -07:00
greg 6a548c9086 Keep track of durations of each pipeline stage 2018-08-14 22:56:22 -07:00
greg 0c0690e86e Provide error message here 2018-08-14 21:53:57 -07:00
greg 6d18f80185 Use Result in test 2018-08-14 21:46:48 -07:00
greg 6825de3916 new_frame -> new_scope 2018-08-14 21:45:45 -07:00
greg 1b78fbff82 Tests for basic pattern matching 2018-08-14 21:39:33 -07:00
greg 897c1181a9 Basic pattern matching working 2018-08-14 21:17:43 -07:00
greg 6833bc4f00 Start on CaseMatch eval 2018-08-14 12:43:06 -07:00
greg f2ded78776 ReducedAST: Match -> CaseMatch
makes it easier to grep for
2018-08-14 12:37:18 -07:00
greg 9debdd8d66 Primitive tuple 2018-08-14 02:03:05 -07:00
greg 8067c862f3 Switch out types for evaluator 2018-08-14 00:11:13 -07:00
greg f9c2fc3f9d Make code more concise 2018-08-07 17:09:53 -07:00
greg 5ead1e5d44 NewConstructor -> Constructor 2018-08-05 19:14:02 -07:00
greg 348a6f7c76 More work on pattern-matching
I think I need to entirely change the types in the evaluator.
ReducedAST should only care about NewConstructor (which I gotta rename),
and the evaluator is the only place that an implementation of a
primitive constructed type should live (see Peyton-Jones implementing a
functional language p. 70)
2018-08-05 19:11:42 -07:00
greg 5f336ec1a9 Add lookup_by_name to symbol table 2018-08-05 18:19:48 -07:00
greg da59fae0d3 More work on pattern-matching 2018-08-05 18:01:42 -07:00
greg 5b5689accf Changing representation of primitive objects 2018-08-05 17:15:58 -07:00
greg 32acf89814 New Constructor 2018-08-05 16:04:52 -07:00
greg c637a922a9 Start implementing constructors/matches
as per Implementing Functional Programming Languages by Peyton-Jones
2018-08-05 14:23:08 -07:00
greg 42d0aba21c Add index of variants to symbol table
Also new prelude type, just for testing
2018-08-05 13:59:13 -07:00
greg 7548bdbb78 Add note 2018-07-31 01:59:49 -07:00
greg bc6d4d19b5 reduced ast match 2018-07-26 00:52:46 -07:00
greg a2b1b0f953 Pattern-matching in reduced AST 2018-07-26 00:52:46 -07:00
greg 75bf4b5697 Fill out variants to be reduced 2018-07-26 00:52:46 -07:00
greg 35f5a9623a Check out cranelift 2018-07-26 00:52:46 -07:00
greg 98e812968b Fix parsing additional options 2018-07-26 00:52:46 -07:00
greg 250c486143 Fix derive code 2018-07-26 00:52:46 -07:00
greg 38eb065511 Broken proc macro custom derive code 2018-07-26 00:52:46 -07:00
greg 9e24c3b336 update rocket version 2018-07-26 00:52:46 -07:00
greg 1d2f1624a1 Some codegen work to make pass options work 2018-07-26 00:52:46 -07:00
greg 4ca57e4aea Change name of debug options struct 2018-07-26 00:52:46 -07:00
greg 82502ad0ad Move some parsing code around 2018-07-26 00:52:46 -07:00
greg 7d68b2a05a Get rid of code related to old match stuff 2018-07-26 00:52:46 -07:00
greg c2db212c78 Some more guard arm stuff + dealing with split binexp
... in if-blocks. Need to do some re-architecture I think
2018-07-26 00:52:46 -07:00
greg 837a180b09 Starting work on guard arms 2018-07-26 00:52:46 -07:00
greg 5ecd298e6a Record Pattern 2018-07-26 00:52:46 -07:00
greg 8aa33d0872 Starting on guards 2018-07-26 00:52:46 -07:00
greg 21a8868bcf Fixed test 2018-07-26 00:52:46 -07:00
greg 5a91957fa1 Some incomplete parse work 2018-07-26 00:52:46 -07:00
greg 176d43e56f Remove old comment 2018-07-26 00:52:46 -07:00
greg 90ecde89c9 Mutable types 2018-07-26 00:52:46 -07:00
greg 65c2cd521b Mutable types
This bit of syntax is meant for extendable enum types
2018-07-26 00:52:46 -07:00
greg f7dbbddad1 Let and let mut syntax 2018-07-26 00:52:46 -07:00
greg 43ff08b04c Add some debugging info around parse error 2018-07-26 00:52:46 -07:00
greg 00692aa89e Support for underscores 2018-07-26 00:52:46 -07:00
greg 0ec29f6dd0 Fix repl 2018-07-26 00:52:46 -07:00
greg 5e48eb2dee Broken - some pass abstraction work 2018-07-26 00:52:46 -07:00
greg 3597ad4eef Compact parsing 2018-07-26 00:52:46 -07:00
greg 072eab1a80 Thread debug opts around where they need to be 2018-07-26 00:52:46 -07:00
greg 1761d11d36 Infrastructure for adding more debug options 2018-07-26 00:52:46 -07:00
greg d075f613f9 Hook help messages into command data structure 2018-07-26 00:52:46 -07:00
greg ee55729d5f Halfway through implementing help text on CommandTree 2018-07-26 00:52:46 -07:00
greg 17a4028185 Generate commandtree on repl 2018-07-26 00:52:46 -07:00
greg 947f4f2ea6 Command hierarchy for tab completion 2018-07-26 00:52:46 -07:00
greg 41c9dfae06 Some more tab completion work 2018-07-26 00:52:46 -07:00
greg fe1a508e25 Switch back to line feed for better tab completion 2018-07-26 00:52:46 -07:00
greg 55a8cabd7c Some basic pattern stuff 2018-07-26 00:52:46 -07:00
greg 46b6aeb4db Add note to check out Gluon 2018-07-26 00:52:46 -07:00
greg 3c022fc4ef Clarified BNF 2018-07-26 00:52:46 -07:00
greg 0a02c21e70 Some more Patterns work
-at first brush, a pattern is like a single Variant with a list of free
vars
2018-07-26 00:52:46 -07:00
greg 927f427a86 Starting work on patterns 2018-07-26 00:52:46 -07:00
greg 005aba7a10 Test alt form 2018-07-26 00:52:46 -07:00
greg 7882e92ab5 Fix old style if 2018-07-26 00:52:46 -07:00
greg f582ab4eaa Test for new style parsing 2018-07-26 00:52:46 -07:00
greg f2dce38647 Broken, but compiling, move to new if paradigm 2018-07-26 00:52:46 -07:00
greg ba4cd9da39 Kill match keyword + data structures
And add new unified keywords
2018-07-26 00:52:46 -07:00
greg ba319a7bc3 Finally figured out unified condition syntax 2018-07-26 00:52:46 -07:00
greg 2d052d34f7 More thinking 2018-07-26 00:52:46 -07:00
greg 654eeef428 Clarified grammar BNF some 2018-07-26 00:52:46 -07:00
greg a96fbc9592 Fix match expression parsing 2018-07-26 00:52:46 -07:00
greg 196954326e Print bare data constructor 2018-07-26 00:52:46 -07:00
greg 3f2fff276c Constructor eval 2018-07-26 00:52:46 -07:00
greg e6679ff523 Playing around with conditional syntax 2018-07-26 00:52:46 -07:00
greg ebcea685f3 Fix looking up functions 2018-07-26 00:52:46 -07:00
greg 3b9084810e Add constructor reduced ast node; fix test 2018-07-26 00:52:46 -07:00
greg 7809cda240 Pass symbol_table to ast reduce
To distinguish between values and data constructors
2018-07-26 00:52:46 -07:00
greg f1679e83b7 Start trying to fix tests 2018-07-26 00:52:46 -07:00
greg f98d8e2bb0 Move AST into its own module 2018-07-26 00:52:46 -07:00
greg d0a0cc8209 Rename ast_reducing -> reduced_ast 2018-07-26 00:52:46 -07:00
greg 5aa0e10e7a Some ADT work 2018-07-26 00:52:46 -07:00
greg 27729cefdf Some improvements to the thing 2018-07-26 00:52:46 -07:00
greg df76e7c120 Pretty-print type table 2018-07-26 00:52:46 -07:00
greg 889610f0b0 Pretty-print Symbol Table 2018-07-26 00:52:46 -07:00
greg 3beabf4678 Start eval-ing data constructors 2018-07-26 00:52:46 -07:00
greg 25790f8643 Added super-janky prelude capability 2018-07-26 00:52:46 -07:00
greg ff5446af3f Add symbols from symbol table into global type context 2018-07-26 00:52:46 -07:00
greg 5d84153c9e Want bin expressions typed soon 2018-07-26 00:52:46 -07:00
greg 0b0f6b6b50 Symbol table handles functions better 2018-07-26 00:52:46 -07:00
greg 856a360aba Types need handle to symbol table 2018-07-26 00:52:46 -07:00
greg 81ca9ee20f Add some data structures back 2018-07-26 00:52:46 -07:00
greg 4caf8096b3 Add Scheme, TypeEnv, Substitution data structs 2018-07-26 00:52:46 -07:00
greg c65907388d Some ADT work 2018-07-26 00:52:46 -07:00
greg 33c22c8bbc Typecheck values 2018-07-26 00:52:46 -07:00
greg 4a27af2136 Add a super-basic test 2018-07-26 00:52:46 -07:00
greg 07af54b78a More work 2018-07-26 00:52:46 -07:00
greg c4666b82ec Basics 2018-07-26 00:52:46 -07:00
greg 274dd1ccb0 Basic stuff 2018-07-26 00:52:46 -07:00
greg 70ec79c4b3 Lol starting over from scratch again
H-M is hard :/
2018-07-26 00:52:46 -07:00
greg f88d2331e3 printf debugs for problems with function typing 2018-07-26 00:52:46 -07:00
greg c8f961abbf Functions 2018-07-26 00:52:46 -07:00
greg d040d76bfa Start handling function case 2018-07-26 00:52:46 -07:00
greg 887ba46b0b Fix this thing 2018-07-26 00:52:46 -07:00
greg a80db9e4c2 Debug types
WIP
2018-07-26 00:52:46 -07:00
greg c986233a95 Adding bindings seems to work?
I'm playing real fast and loose though
2018-07-26 00:52:46 -07:00
greg bb29df4a73 Variable binding insertion infrastructure 2018-07-26 00:52:46 -07:00
greg 4db3595d7c More work on variables 2018-07-26 00:52:46 -07:00
greg 217ee73fc9 Literals 2018-07-26 00:52:46 -07:00
greg 93309c025e Some work 2018-07-26 00:52:46 -07:00
greg b67512a9e1 Add Infer struct 2018-07-26 00:52:46 -07:00
greg 8e6f605fab Type alias "TypeName" 2018-07-26 00:52:46 -07:00
greg ba4185b0fb Back to including typechecking code in pipeline 2018-07-26 00:52:46 -07:00
greg 7a2a4df297 Clearing out most of the cruft from typechecking 2018-07-26 00:52:46 -07:00
greg 642e9da8ee Move everything symbol-table-related into a separate module 2018-07-26 00:52:46 -07:00
greg cea7427847 put TypeEnvironment on TypeContext 2018-07-26 00:52:46 -07:00
greg 3156c31dfc Variable lookup 2018-07-26 00:52:46 -07:00
greg 2e457cd5e8 First real inferring 2018-07-26 00:52:46 -07:00
greg 843d895f2b infer infra 2018-07-26 00:52:46 -07:00
greg 734c53ce0d Starting to deal with actual expr inferring 2018-07-26 00:52:46 -07:00
greg 3a3b8dd440 TypeEnvironment lives in Infer 2018-07-26 00:52:46 -07:00
greg c96a56a7ac fresh 2018-07-26 00:52:46 -07:00
greg 4017857a3a Unification 2018-07-26 00:52:46 -07:00
greg 131c83b64d Consult these haskell programs from https://github.com/quchen/articles/tree/master/hindley-milner 2018-07-26 00:52:46 -07:00
greg 9e0f8b8a14 InferError 2018-07-26 00:52:46 -07:00
greg 7121624f77 Type Env 2018-07-26 00:52:46 -07:00
greg 48e795decc apply_substitution for PolyTypes
If I made an error it's likely here...
2018-07-26 00:52:46 -07:00
greg a26da934f4 Substitution monotypes 2018-07-26 00:52:46 -07:00
greg 1de1cd9cfd For H-M, add types and some impls 2018-07-26 00:52:46 -07:00
greg 6f639b9030 Type type structure 2018-07-26 00:52:46 -07:00
greg 8f0104ebc7 Deletion 2018-07-26 00:52:46 -07:00
greg 36cd7e080d Even more deletions 2018-07-26 00:52:46 -07:00
greg f48a25779c Lol just get rid of all the old code, start from scratch again 2018-07-26 00:52:46 -07:00
greg 808a1bfc98 Still more deletions 2018-07-26 00:52:46 -07:00
greg c7e46c1cfa Kill any commented code 2018-07-26 00:52:46 -07:00
greg 98cfcfc18d Eval shouldn't be aware of types 2018-07-26 00:52:46 -07:00
greg b4c7ea3d02 Show bindings too in debug 2018-07-26 00:52:46 -07:00
greg e7c89ed840 Some more refactoring 2018-07-26 00:52:46 -07:00
greg b0e38f7f5b Refactor 2018-07-26 00:52:46 -07:00
greg 276662d98a Some code rearrangements 2018-07-26 00:52:46 -07:00
greg e8e9265b26 Use less verbose match syntax 2018-07-26 00:52:46 -07:00
greg cb316a973e Getting back to hindley-milner
First, clear out some of this cruft in the compiler warnings
2018-07-26 00:52:46 -07:00
greg e64861b602 Some eval tests 2018-07-26 00:52:46 -07:00
greg 1673fd1cf9 Fix test 2018-07-26 00:52:46 -07:00
greg c00effcbdd Add _ 2018-07-26 00:52:46 -07:00
greg 8378170fbd Kill comments 2018-07-26 00:52:46 -07:00
greg 7ab385d398 Bring custom ADTs to the repl 2018-07-26 00:52:46 -07:00
greg 9fb148bb02 Make compile again 2018-07-26 00:52:46 -07:00
greg 97df2fa344 I dunno 2018-07-26 00:52:46 -07:00
greg a08134a747 Delete old code in eval 2018-07-26 00:52:46 -07:00
greg 6707b2bb9c debug outputs in order 2018-07-26 00:52:46 -07:00
greg 3ac50f974d Pass around reference to type context in evaluator 2018-07-26 00:52:46 -07:00
greg 160ce95e5f Fully comment example source file 2018-07-26 00:52:46 -07:00
greg afc4281e7f Evaluate function arguments in context before applying them 2018-07-26 00:52:46 -07:00
greg 8d6fea942f Handle function definition before use
And some other ReducedAST - Evaluation niceties
2018-07-26 00:52:46 -07:00
greg 6d93c758a2 Add function to symbol table 2018-07-26 00:52:46 -07:00
greg aff421cd99 Working with symbol table
Note that symbol table is a different object now than the previous
binding table that was used for type-checking. That binding table is not
currently debugged and should be debugged in a separate debug output with
typechecking proper.
2018-07-26 00:52:46 -07:00
greg 493d76da0b Add symbol table data structure to typechecking 2018-07-26 00:52:46 -07:00
greg eb681fbff9 Make parse error message nicer 2018-07-26 00:52:46 -07:00
greg f3e3843528 Decomplexify delimited! 2018-07-26 00:52:46 -07:00
greg 6b90e19eb1 Simplify expect! macro
Ends up printing a debug print, but whatever, will fix later
2018-07-26 00:52:46 -07:00
greg 210ae47c8b Get rid of lambda 2018-07-26 00:52:46 -07:00
greg 70794d8ff1 For expression parsing 2018-07-26 00:52:46 -07:00
greg 532c8c45b4 Parse while expressions
Decided to add while expressions to the language to make for-parsing
easier. Plus some other random notes
2018-07-26 00:52:46 -07:00
greg 24b532df06 This doesn't need to be a closure 2018-07-26 00:52:46 -07:00
greg ac576be604 Trim newline in getline()
Inefficient but whatever
2018-07-26 00:52:46 -07:00
greg 6bf106a1a3 Equality 2018-07-26 00:52:46 -07:00
greg 161e47fe91 Add schala source file test.schala 2018-07-26 00:52:46 -07:00
greg 1a58f3b7af Add fizzbuzz source
Next goal will be, to implement enough to make this work
2018-07-26 00:52:46 -07:00
greg 44e585fca2 Conditionals 2018-07-26 00:52:46 -07:00
greg 3f836eb74f Kill some warnings 2018-07-26 00:52:46 -07:00
greg abf25d648d Change repl behavior of strings 2018-07-26 00:52:46 -07:00
greg 1f6e6d9b31 Tuples 2018-07-26 00:52:46 -07:00
greg e2703121d8 Kill unneeded import 2018-07-26 00:52:46 -07:00
greg e5b6b41422 Error msg 2018-07-26 00:52:46 -07:00
greg 6c5e3dea5d Assignment 2018-07-26 00:52:46 -07:00
greg bd8bf1945c Super simple janky input 2018-07-26 00:52:46 -07:00
greg 9e393d2753 Kill old type structure 2018-07-26 00:52:46 -07:00
greg 822420a9d5 Add an eval test 2018-07-26 00:52:46 -07:00
greg 6f8dc9bedd rename IntLiteral -> NatLiteral 2018-07-26 00:52:46 -07:00
greg 3b134d7fb6 very simple code 2018-07-26 00:52:46 -07:00
greg e0cec8b8a6 print, println as builtins 2018-07-26 00:52:46 -07:00
greg 1a84f62818 Kill some old code, make very_simple example print 2018-07-26 00:52:46 -07:00
greg b1966d7199 Function calling works kind of 2018-07-26 00:52:46 -07:00
greg fdbb21990d Retrieve function from memory when called 2018-07-26 00:52:46 -07:00
greg 1011ff08f3 Use new rust 1.26 less verbose syntax 2018-07-26 00:52:46 -07:00
greg 6d8d2aecbd Functions 2018-07-26 00:52:46 -07:00
greg 848306ad1a Reduce defined function 2018-07-26 00:52:46 -07:00
greg e6f0710e41 Debug ast rewrite 2018-07-26 00:52:46 -07:00
greg d7e3f695b7 Very simple source file 2018-07-26 00:52:46 -07:00
greg 78ba4e1ed3 Variable lookup 2018-07-26 00:52:46 -07:00
greg 481afb0f87 Fix debugging and debug eval 2018-07-26 00:52:46 -07:00
greg 01986e7474 starting bindings 2018-07-26 00:52:46 -07:00
greg 9cf5260d4b Use impl Trait to simplify type signatures 2018-07-26 00:52:46 -07:00
greg b50d87b85b Idea 2018-07-26 00:52:46 -07:00
greg 18c8176134 Get rid of unneeded imports 2018-07-26 00:52:46 -07:00
greg 2cb7d35008 Use EvalResult type 2018-07-26 00:52:46 -07:00
greg bd1eed884f State type manipulations 2018-07-26 00:52:46 -07:00
greg 67917612e6 Swap over eval code to new paradigm
While keeping the old code commented for reference
2018-07-26 00:52:46 -07:00
greg b4a16cdc55 Prefix ops 2018-07-26 00:52:46 -07:00
greg 4d5ab95946 Fix bug with / parsing 2018-07-26 00:52:46 -07:00
greg ce71254b69 Implement a lot more ops 2018-07-26 00:52:46 -07:00
greg 065e58f87e Successfully interpreting addition 2018-07-26 00:52:46 -07:00
greg 29cabb119f Save interpreter directives in history 2018-07-26 00:52:46 -07:00
greg 6768cebc48 Literals 2018-07-26 00:52:46 -07:00
greg ec5580d20b prefix op reduction 2018-07-26 00:52:46 -07:00
greg 9de66a9af3 Unimplemented sigil 2018-07-26 00:52:46 -07:00
greg 633b4fe7a4 Nats, some binop reduction 2018-07-26 00:52:46 -07:00
greg 87c3b8e234 Some work 2018-07-26 00:52:46 -07:00
greg 16a463b1a0 Method-style 2018-07-26 00:52:46 -07:00
greg c3be644133 Optional scope name 2018-07-26 00:52:46 -07:00
greg e7615fda8b Add new_frame method 2018-07-26 00:52:46 -07:00
greg 111657b567 Add generic stack data structure
I'll want to use this soon enough
2018-07-26 00:52:46 -07:00
greg c5e8d3e080 Random notes re: symbol table
I'm probably gonna want to redo the symbol table stuff after reading Language
Implementation Patterns, esp. to accommodate scopes
2018-07-26 00:52:46 -07:00
greg 4f49c183b0 Float + reduce binop/prefixop 2018-07-26 00:52:46 -07:00
greg 81368179bb More boilerplate 2018-07-26 00:52:46 -07:00
greg 30128d7d34 Easy work 2018-07-26 00:52:46 -07:00
greg 6c718e5d4f Start AST-reducing 2018-07-26 00:52:46 -07:00
greg 774ddd665b Infrastructure to evaluate reduced AST 2018-07-26 00:52:46 -07:00
greg 0bb0ecea76 Add new ast reducing pass 2018-07-26 00:52:46 -07:00
greg 59a7c11031 Hook --debug flag to new debug framework 2018-07-26 00:52:46 -07:00
greg b54a9774ed Rename schala source files to be clearer 2018-07-26 00:52:46 -07:00
greg a9c0341d38 Half-assed implementation of tab completion
Bah this is boring. Maybe I want to switch back to linefeed?
2018-07-26 00:52:46 -07:00
greg 2d260c14d7 Add Unified Construction Syntax article 2018-07-26 00:52:46 -07:00
greg 7686707602 Type alias Vec<String> -> Block 2018-07-26 00:52:46 -07:00
greg 670833185b Start adding tab completion API 2018-07-26 00:52:46 -07:00
greg 012c50b7c3 Kill commented lines 2018-07-26 00:52:46 -07:00
greg f1a64adfd9 Kill a few lines of code 2018-07-26 00:52:46 -07:00
greg e46eeb91f3 Highlight enabled debug passes 2018-07-26 00:52:46 -07:00
greg d524389f1d Kill old DebugOptions struct 2018-07-26 00:52:46 -07:00
greg 890e6bd4c5 Minor wording changes 2018-07-26 00:52:46 -07:00
greg 8826d5b0d4 For now, don't error out with typechecking 2018-07-26 00:52:46 -07:00
greg 8ad5dd9056 Rename passes 2018-07-26 00:52:46 -07:00
greg fb168da8bd Kill comments 2018-07-26 00:52:46 -07:00
greg 78fdea180e Rename stages -> passes 2018-07-26 00:52:46 -07:00
greg 00e68d09c7 Kill comment 2018-07-26 00:52:46 -07:00
greg 73c3eeb69d stage -> pass 2018-07-26 00:52:46 -07:00
greg 86e88ee1bf Greatly fleshed out custom derive 2018-07-26 00:52:46 -07:00
greg d1a2473bb2 More derive work 2018-07-26 00:52:46 -07:00
greg 57ccdd5ead Extract out attr parsing code into a separate function 2018-07-26 00:52:46 -07:00
greg c0746028f4 Automate language name 2018-07-26 00:52:46 -07:00
greg c6f038a307 Successfully parse language name 2018-07-26 00:52:46 -07:00
greg e498e19ffc Use extra attribute 2018-07-26 00:52:46 -07:00
greg 51cdedb9cc Actually autogenerate the trait 2018-07-26 00:52:46 -07:00
greg 50236ac942 Kill unused mut 2018-07-26 00:52:46 -07:00
greg 1e4554258f Kill old code 2018-07-26 00:52:46 -07:00
greg 9ade0dd1e2 Kill unnecessary mutable 2018-07-26 00:52:46 -07:00
greg 7ba8c9dab9 Tokenize errors 2018-07-26 00:52:46 -07:00
greg 774ab5f72e Token errors WIP 2018-07-26 00:52:46 -07:00
greg 50499c8a33 Remove some compiler warnings 2018-07-26 00:52:46 -07:00
greg a10df92ab8 Debug work 2018-07-26 00:52:46 -07:00
greg fe64cbcd3a Refactor 2018-07-26 00:52:46 -07:00
greg 061d54702f Implement debug stages as a HashSet of strings
Change to custom enum type later
2018-07-26 00:52:46 -07:00
greg 27885500fd Show debug stages 2018-07-26 00:52:46 -07:00
greg aaf98db2b7 Starting to move debug options around
+ add method to ProgrammingLanguageInterface that is a list of stages
2018-07-26 00:52:46 -07:00
greg fff587cd6a Change "set" to "debug" 2018-07-26 00:52:46 -07:00
greg 83fe71f721 Kill old trait infrastructure 2018-07-26 00:52:46 -07:00
greg 491face68b More autoderive things 2018-07-26 00:52:46 -07:00
greg 258e813a39 Starting to write custom derive for ProgrammingLanguageInterface 2018-07-26 00:52:46 -07:00
greg 5d69b530c5 Remove comments 2018-07-26 00:52:46 -07:00
greg 8a5b8619fa Kill old execute method 2018-07-26 00:52:46 -07:00
greg 832d0d4ee3 Add more debug jank entries 2018-07-26 00:52:46 -07:00
greg 57a18a0768 Make (some) stages configurable
This is janky and needs to be more general
2018-07-26 00:52:46 -07:00
greg 2c5ebd636f Pass EvalOptions to macro 2018-07-26 00:52:46 -07:00
greg 06638dc030 Minor syntax changes 2018-07-26 00:52:46 -07:00
greg 3a181dd0ac Add passing debug into via &mut pointer 2018-07-26 00:52:46 -07:00
greg 1d1a5fb6fc Pass mutable handle to unfinishedcomputation 2018-07-26 00:52:46 -07:00
greg fb4de6f2d6 Making use of UnfinishedComputation 2018-07-26 00:52:46 -07:00
greg 18c86c26f0 Passing comp around 2018-07-26 00:52:46 -07:00
greg ac44df8d1e Semicolon 2018-07-26 00:52:46 -07:00
greg 12c7cebb38 Clarify comment 2018-07-26 00:52:46 -07:00
greg f22f089b9b finish method on UnfinishedComputation 2018-07-26 00:52:46 -07:00
greg 3d960d5697 Implement most of pipeline 2018-07-26 00:52:46 -07:00
greg 1f4228b887 Successfully passing state handle to pass functions 2018-07-26 00:52:46 -07:00
greg 5abaadc0ca Add self 2018-07-26 00:52:46 -07:00
greg fd89de77cc Making pipeline macro nicer 2018-07-26 00:52:46 -07:00
greg a305610a39 Some kind of pipeline working
thanks to the rust syn crate guy for the macro idea
2018-07-26 00:52:46 -07:00
greg 14f31a5186 Adding proc macro for codegen
This should hopefully make the compiler pass thing I want to do possible
2018-07-26 00:52:46 -07:00
greg b936132ca6 Backtick operators supported in tokenizing 2018-07-26 00:52:46 -07:00
greg a1016293ac Show artifacts on failure 2018-07-26 00:52:46 -07:00
greg 8e42f7e0bc TODO note 2018-07-26 00:52:46 -07:00
greg b8a25dbaac Put this stuff back
More complicated to separate out repl than I thought
2018-07-26 00:52:46 -07:00
greg 66b6ddcf93 Start refactoring how interpreter options are split up 2018-07-26 00:52:46 -07:00
greg 1c0365529d Swap sigil from . to : 2018-07-26 00:52:46 -07:00
greg f795612884 Want to change 'trait' to 'interface' 2018-07-26 00:52:46 -07:00
greg c9ea48e9d1 Fix history adding 2018-07-26 00:52:46 -07:00
greg 65f42981ff Trait -> Interface 2018-07-26 00:52:46 -07:00
greg e2970dbc42 Kill old advanced_slice_patterns 2018-07-26 00:52:46 -07:00
greg 7d2bc4188d Debug stages from command line 2018-07-26 00:52:46 -07:00
greg eb987bb5b0 Make REPL interpreter return a value 2018-07-26 00:52:46 -07:00
greg 0de504eb9e Kill unused items in schala-repl 2018-07-26 00:52:45 -07:00
greg 635887f7a5 Start killing old code in language 2018-07-26 00:52:45 -07:00
greg ecebbb2eae Fix interspersing of newlines in tokenizer infra 2018-07-26 00:52:45 -07:00
greg 78f12c8f1d Show err output when evaluating non-interactively 2018-07-26 00:52:45 -07:00
greg ebda79e5fd Colored repl command output 2018-07-26 00:52:45 -07:00
greg 819a06503f Hook schala function up to debug booleans
Not sure if I like this API, but eh, it's what I've got
2018-07-26 00:52:45 -07:00
greg 664003a9d7 Add back color 2018-07-26 00:52:45 -07:00
greg e1398bd063 rename schala_main -> repl_main 2018-07-26 00:52:45 -07:00
greg 898b185509 Add version string 2018-07-26 00:52:45 -07:00
greg 7592209cdb Get rid of all top-level dependencies 2018-07-26 00:52:45 -07:00
greg 6f43c3b81d move schala into separate crate 2018-07-26 00:52:45 -07:00
greg 072eeaa127 Color in terminal error output 2018-07-26 00:52:45 -07:00
greg 6bd3ed7b65 Move robo to separate crate 2018-07-26 00:52:45 -07:00
greg 8f19f2e414 Move rukka to crate 2018-07-26 00:52:45 -07:00
greg 5f279cb400 Move maaru into separate crate 2018-07-26 00:52:45 -07:00
greg 795b4adc6b Rename schala-lib -> schala-repl 2018-07-26 00:52:45 -07:00
greg 9d4082463a Removed (for now) LLVMCodeString 2018-07-26 00:52:45 -07:00
greg 43ade31f3e new thing compiles 2018-07-26 00:52:45 -07:00
greg 9f2fbda31f Switch over schala to new system 2018-07-26 00:52:45 -07:00
greg b31325c315 Update schala example code 2018-07-26 00:52:45 -07:00
greg 95a2620754 Nested comments 2018-07-26 00:52:45 -07:00
greg e67b22d109 Changing comments to use //, /* 2018-07-26 00:52:45 -07:00
greg 61eccba173 Starting to improve infrastructure for lang output
To make repl vs non-repl output better
2018-07-26 00:52:45 -07:00
greg f56d7120c4 Hacky fix for displaying error output non-interactively 2018-07-26 00:52:45 -07:00
greg 6140de9f9c Some changes necessary to handle non-interactive code 2018-07-26 00:52:45 -07:00
greg b54c71633c Eval list literals 2018-07-26 00:52:45 -07:00
greg 1eeafb80dc Parse list literals 2018-07-26 00:52:45 -07:00
greg 59d621ed75 Tighten some code 2018-07-26 00:52:45 -07:00
greg 76fadf0701 Rename ReplOutput -> LanguageOutput 2018-07-26 00:52:45 -07:00
greg 6e6d494d50 Make directory for schala source files 2018-07-26 00:52:45 -07:00
greg a0bb2837c1 Index evaluation 2018-07-26 00:52:45 -07:00
greg a4dd492c26 Proper index exprs 2018-07-26 00:52:45 -07:00
greg d0b6840670 Some macro simplifications 2018-07-26 00:52:45 -07:00
greg b65eb0e459 Trying to make tests less verbose 2018-07-26 00:52:45 -07:00
greg 3f1e83dfda Added test for lambda call 2018-07-26 00:52:45 -07:00
greg 5ddfc132e7 Changed BNF grammar of call statements
To allow calling lambdas
2018-07-26 00:52:45 -07:00
greg f1f7f43e20 lambdas 2018-07-26 00:52:45 -07:00
greg 86d9e90e7c Print output of tuples 2018-07-26 00:52:45 -07:00
greg a7672171a6 Handle tuple literals in type system 2018-07-26 00:52:45 -07:00
greg 08e10739e5 Sum types in type schema 2018-07-26 00:52:45 -07:00
greg a300f78e19 Kill unused import 2018-07-26 00:52:45 -07:00
greg 0423017125 Kill some compiler warnings 2018-07-26 00:52:45 -07:00
greg 8ef5a28aff Evaluator now only prints when a builtin print is called 2018-07-26 00:52:45 -07:00
greg a92a2e4454 Kill comments 2018-07-26 00:52:45 -07:00
greg 8d79074ea9 Fix bug in delimited macro
Had to do with bad strictness testing.
2018-07-26 00:52:45 -07:00
greg 4e7806d053 Improve tokenizer debug output 2018-07-26 00:52:45 -07:00
greg 507e0b7255 Cleanup 2018-07-26 00:52:45 -07:00
greg 9b760244d5 Include line count in token debug 2018-07-26 00:52:45 -07:00
greg 88e027f536 Munged types to make tokenizer compile 2018-07-26 00:52:45 -07:00
greg 2e41f8ffe3 Some work
WIP
2018-07-26 00:52:45 -07:00
greg b18c2eee96 Fixed bug w/ lines in functions
Also improved debugging
2018-07-26 00:52:45 -07:00
greg 0c78f50568 Frame-aware lookups 2018-07-26 00:52:45 -07:00
greg 2dc9b4c09f Kill debug 2018-07-26 00:52:45 -07:00
greg 73206d345e Better debugging for types 2018-07-26 00:52:45 -07:00
greg 1a74e16af5 Use UVars in type signatures of functions 2018-07-26 00:52:45 -07:00
greg ae2182db5d Add history saving 2018-07-26 00:52:45 -07:00
greg ad450469a5 Switch to rustyline library 2018-07-26 00:52:45 -07:00
greg df88e33579 Introduced fresh type variable method 2018-07-26 00:52:45 -07:00
greg 9d72a92f0b Continuing work 2018-07-26 00:52:45 -07:00
greg fa6c2a6f45 Re-added symbol table infra 2018-07-26 00:52:45 -07:00
greg 92e6830979 Some logic for function call inferring 2018-07-26 00:52:45 -07:00
greg ef9cd04605 Starting on function application typechecking 2018-07-26 00:52:45 -07:00
greg 1eaf201145 Move some code around 2018-07-26 00:52:45 -07:00
greg 876373c9fd Function calls work 2018-07-26 00:52:45 -07:00
greg 63f5f155ae Temporarily disable type-erroring
and tighten some code
2018-07-26 00:52:45 -07:00
greg 51cf8a4824 Handle variable lookups 2018-07-26 00:52:45 -07:00
greg e0cc12276c Evaluate binding declarations 2018-07-26 00:52:45 -07:00
greg d69970a806 Separate Value and NamedStruct syntactic categories 2018-07-26 00:52:45 -07:00
greg 522d9fc951 Fixed | 2018-07-26 00:52:45 -07:00
greg 63c3e0a4db More operator stuff 2018-07-26 00:52:45 -07:00
greg 547def990e Operator changes 2018-07-26 00:52:45 -07:00
greg 6e105bac55 Fixed tests w/ respect to binop
There's a few unnecessary conversions of &str 's to Rc<String> and back
2018-07-26 00:52:45 -07:00
greg a396c448ec Centralize data for prefix ops too 2018-07-26 00:52:45 -07:00
greg d3ef980dc5 Added type information to binop definitions
Also started centralizing precedence there too
2018-07-26 00:52:45 -07:00
greg df86e0c16e Make sigil field private 2018-07-26 00:52:45 -07:00
greg 274bf80b5d Function evaluation work 2018-07-26 00:52:45 -07:00
greg f0a39ac88a Give State a pointer to its parent
For function call lookups
2018-07-26 00:52:45 -07:00
greg 85e65273fe Finished initial BinOp/PrefixOp 2018-07-26 00:52:45 -07:00
greg 413c5afe67 Starting to munge BinOp types
Incomplete, doesn't yet compile
2018-07-26 00:52:45 -07:00
greg 36174140bc ReplState -> State
Not everything is a repl
2018-07-26 00:52:45 -07:00
greg 75ecfb4e86 Move bx! macro up to mod.rs
And make use of it in parser
2018-07-26 00:52:45 -07:00
greg e86d401c90 Move anno-to-type to a method on TypeName 2018-07-26 00:52:45 -07:00
greg b2319f0971 Fix tests too 2018-07-26 00:52:45 -07:00
greg d423e88845 Separate tokenizing module
Parsing was getting too long
2018-07-26 00:52:45 -07:00
greg 5cb0e6715d Some work on binops 2018-07-26 00:52:45 -07:00
greg 5bb2c319e8 Some more type-checking work 2018-07-26 00:52:45 -07:00
greg 440783bb64 More work on evaluating applications
for later testing + to kill a compiler warning
2018-07-26 00:52:45 -07:00
greg 9834ee295e Fix traits, silence warnings 2018-07-26 00:52:45 -07:00
greg 9346bb9581 type of a declaration should be Void, not Unit
I think this makes sense

Also kill some compiler warnings
2018-07-26 00:52:45 -07:00
greg f46f593c44 Types in bindings 2018-07-26 00:52:45 -07:00
greg ec7d185ed5 Simplified match 2018-07-26 00:52:45 -07:00
greg 3f1cf1d975 Added trait declaration 2018-07-26 00:52:45 -07:00
greg 39ee550b54 More static type work 2018-07-26 00:52:45 -07:00
greg d5df868f10 Finished basic constant type inference 2018-07-26 00:52:45 -07:00
greg 55629e6d9d More type implementing - WIP
This has a borrowing bug currently
2018-07-26 00:52:45 -07:00
greg 9d99971f49 Fix some integer overflows with binary and hex 2018-07-26 00:52:45 -07:00
greg 76575e9ba3 Starting basic type stuff 2018-07-26 00:52:45 -07:00
greg a50d8d9e3f Starting over with types 2018-07-26 00:52:45 -07:00
greg c2cd419e5a Additional TODO 2018-07-26 00:52:45 -07:00
greg bcec8e27f8 Add todo note 2018-07-26 00:52:45 -07:00
greg e6a015090c More type things 2018-07-26 00:52:45 -07:00
greg c18bf9c29f Type singletons test work 2018-07-26 00:52:45 -07:00
greg cfc507a2df TypeSingletonName broken out 2018-07-26 00:52:45 -07:00
greg f7e88c7cab Fix struct literals in if expressions
With special case-ing, sigh :( Also will need to do this for match
expressions but I'll cross that bridge when I come to it
2018-07-26 00:52:45 -07:00
greg 4d0bfa2a52 Don't need clone() here 2018-07-26 00:52:45 -07:00
greg 99e5d86764 Kill separate is_digit method
I care about 10 vs 16 distinction
2018-07-26 00:52:45 -07:00
greg 17e8ebe789 Hex parsing done 2018-07-26 00:52:45 -07:00
greg 253a85005c Save settings on ctrl-D 2018-07-26 00:52:45 -07:00
greg 967e5cc436 Added a bunch of notes 2018-07-26 00:52:45 -07:00
greg 7a6ace5db1 Fix parse level calculation 2018-07-26 00:52:45 -07:00
greg 129af43e69 Proper indentation of parser debug 2018-07-26 00:52:45 -07:00
greg 17dccf65c8 Move some code around 2018-07-26 00:52:45 -07:00
greg 95c6a23bf1 Better hex literals 2018-07-26 00:52:45 -07:00
greg 2bff53846c Starting hex parsing 2018-07-26 00:52:45 -07:00
greg 514d117c7e Simplify some code 2018-07-26 00:52:45 -07:00
greg ae65687a93 Assign a specific rocket version 2018-07-26 00:52:45 -07:00
greg 9ec983dc20 unified BoolAtom 2018-07-26 00:52:45 -07:00
greg cab0ca6f47 Rukka source file 2018-07-26 00:52:45 -07:00
greg 8f6c80ac8c Print operation 2018-07-26 00:52:45 -07:00
greg 7f546fa879 Refactoring 2018-07-26 00:52:45 -07:00
greg 48a35aa382 Delete some unneeded code 2018-07-26 00:52:45 -07:00
greg 0c64b14be0 Forgot to change name here 2018-07-26 00:52:45 -07:00
greg 5d9fa6679b Name change
builtin -> primitive
2018-07-26 00:52:45 -07:00
greg ea24ae1bb5 Get rid of some printlns 2018-07-26 00:52:45 -07:00
greg 0d2a0e3536 Implement lambda application 2018-07-26 00:52:45 -07:00
greg 339e3464e3 Plus and multiply 2018-07-26 00:52:45 -07:00
greg c35b684bdd Builtins - + 2018-07-26 00:52:45 -07:00
greg d11c518721 Framework for multiple environments 2018-07-26 00:52:45 -07:00
greg 8dde8c7381 Apply work 2018-07-26 00:52:45 -07:00
greg 47cad3712c Fixing quote 2018-07-26 00:52:45 -07:00
greg ffcc0ef379 Starting builtins 2018-07-26 00:52:45 -07:00
greg 6766791627 Lambda abstraction 2018-07-26 00:52:45 -07:00
greg 05de5ebe61 Kill this linker thing 2018-07-26 00:52:45 -07:00
greg 98fa8403b3 Flesh out TODO, README 2018-07-26 00:52:45 -07:00
greg ce83306581 Add Rukka to README 2018-07-26 00:52:45 -07:00
greg 29ebd35165 Kill unused code 2018-07-26 00:52:45 -07:00
greg 622b50a40c Some lambda work 2018-07-26 00:52:45 -07:00
greg 9f916c7c02 Remove a unimplemented 2018-07-26 00:52:45 -07:00
greg 85375bb9df Add fn literal variant 2018-07-26 00:52:45 -07:00
greg d11500c643 Even more concise 2018-07-26 00:52:45 -07:00
greg 8493233b69 Refactoring 2018-07-26 00:52:45 -07:00
greg 60644ba3d7 Starting lambdas 2018-07-26 00:52:45 -07:00
greg 254f2ae4b8 Make var methods better 2018-07-26 00:52:45 -07:00
greg e243b99d3b If expressions 2018-07-26 00:52:45 -07:00
greg 3d023a6704 Rukka - Variables 2018-07-26 00:52:45 -07:00
greg 857b77f2e3 Add schala idea 2018-07-26 00:52:45 -07:00
greg 4d89dcc85e Can specify language name with -l in any case 2018-07-26 00:52:45 -07:00
greg fe0e58efe7 Go directly to language by name 2018-07-26 00:52:45 -07:00
greg 73612d1465 Define half-working 2018-07-26 00:52:45 -07:00
greg afd2b018f4 Language name in prompt 2018-07-26 00:52:45 -07:00
greg d1a15b64ff Get rid of old import 2018-07-26 00:52:45 -07:00
greg 66e8643382 eq? 2018-07-26 00:52:45 -07:00
greg ad58fc1ad1 True and False primitives 2018-07-26 00:52:45 -07:00
greg adc7be30a9 Some primitive implementations 2018-07-26 00:52:45 -07:00
greg 72097fa125 Fix pointer alias problem 2018-07-26 00:52:45 -07:00
greg ae9d93f6dc Still trying to make the pointer-munging work 2018-07-26 00:52:45 -07:00
greg 3d421c7039 This has broken sexp parsing 2018-07-26 00:52:45 -07:00
greg 166bc3b3cb Fix print bug 2018-07-26 00:52:45 -07:00
greg 2f263de8ba Convert to more lispish Cons 2018-07-26 00:52:45 -07:00
greg 46ae176498 Special forms list 2018-07-26 00:52:45 -07:00
greg d84def35e7 Unwraps 2018-07-26 00:52:45 -07:00
greg 07e55ca04e Handle top-level empty list 2018-07-26 00:52:45 -07:00
greg 6dcf5c7945 print list 2018-07-26 00:52:45 -07:00
greg 568ee88f3a Tighten code 2018-07-26 00:52:45 -07:00
greg 8749ed984d Some more code 2018-07-26 00:52:45 -07:00
greg 559eaf54de Type simplification 2018-07-26 00:52:45 -07:00
greg bf42b58ca5 State for eval 2018-07-26 00:52:45 -07:00
greg ecdcb7ff3d Numbers 2018-07-26 00:52:45 -07:00
greg 766209e5b2 Fixed string parsing 2018-07-26 00:52:45 -07:00
greg e9429ed62a Strings partway working 2018-07-26 00:52:45 -07:00
greg 6e188976f9 Quotes 2018-07-26 00:52:45 -07:00
greg d235b47bc5 Change Symbol -> Word for token 2018-07-26 00:52:45 -07:00
greg 3fcb840ce5 Fix bug 2018-07-26 00:52:45 -07:00
greg 523bd179a4 Tighten code 2018-07-26 00:52:45 -07:00
greg 35e715dfd6 Intersperse 2018-07-26 00:52:45 -07:00
greg 6eb0fc8834 Parsing correctly yay 2018-07-26 00:52:45 -07:00
greg c0a5418c27 Tokens 2018-07-26 00:52:45 -07:00
greg 42749c1ff6 Sexp parsing 2018-07-26 00:52:45 -07:00
greg 42b9507af0 Parses ( 2018-07-26 00:52:45 -07:00
greg 38e85e2c78 Some halfwritten stuff 2018-07-26 00:52:45 -07:00
greg 7c5fef49f8 List datatype 2018-07-26 00:52:45 -07:00
greg c1e214c701 Add a new language - Rukka
This is a (simple) lisp, partially for fun, partially for testing the
generic interfaces
2018-07-26 00:52:45 -07:00
greg 66e3de41dd Make schala-lib::language private and reexport 2018-07-26 00:52:45 -07:00
greg 9545130fd3 Take TokenError type out of schala-lib 2018-07-26 00:52:45 -07:00
greg ef7412dcd5 I don't need this syntax 2018-07-26 00:52:45 -07:00
greg dee470cb8b Kill some packages from schala bin 2018-07-26 00:52:45 -07:00
greg c057f068ef Get rid of unused imports 2018-07-26 00:52:45 -07:00
greg c4dbdf1fe7 Refactor into libs part II
woo it compiles
2018-07-26 00:52:45 -07:00
greg 4c7174e4c4 Halfway done to library-ifying schala 2018-07-26 00:52:45 -07:00
greg d0538faef3 PLIGenerators can be authoritative, not the instances themselves 2018-07-26 00:52:45 -07:00
greg b97da01370 Some simplification 2018-07-26 00:52:45 -07:00
greg b09efd3660 Passing things along as generators 2018-07-26 00:52:45 -07:00
greg a42a58b155 Don't need mutex, kill it 2018-07-26 00:52:45 -07:00
greg 708c0ab103 Finally removed schala dependency
Now need to clean up everything
2018-07-26 00:52:45 -07:00
greg 1d9d0c4395 Okay this compiles
The secret (from #rust) appeared to be that Fn() needed to have + Send
explicitly annotated on it
2018-07-26 00:52:45 -07:00
greg ffb87ebb82 Working on solution to Rocket state problem 2018-07-26 00:52:45 -07:00
greg 30c741f459 Some linker bullshit
I don't know why I needed to do this
2018-07-26 00:52:45 -07:00
greg d19541b3e1 Splitting up some code
In preparation for splitting schala into crates
2018-07-26 00:52:45 -07:00
greg 3651461bbc Some more structure in evaluator 2018-07-26 00:52:45 -07:00
greg 7730457878 Revert "Starting to split project into multiple crates"
This reverts commit e3b0f4a51e.
Bah, this was a bad idea, wrong way to do it
2018-07-26 00:52:45 -07:00
greg 46dbac7f69 Starting to split project into multiple crates 2018-07-26 00:52:45 -07:00
greg f68167f3a2 Halfway done with evaluating tuples 2018-07-26 00:52:45 -07:00
greg c9625ffa77 Add module keyword 2018-07-26 00:52:45 -07:00
greg cc3833754d Switch from request to superagent
For doing HTTP requests. Makes the js bundle a lot smaller.

Also I should do something about the fact that I now have to change the
js and also rebuild the rust binary to change code
2018-07-26 00:52:45 -07:00
greg 9afbd2305f Literal non-primitive values 2018-07-26 00:52:45 -07:00
greg d7564f81c9 Starting work on literal non-primitive values 2018-07-26 00:52:45 -07:00
greg 2fbb8f2b2f Can eval custom data constructors now 2018-07-26 00:52:45 -07:00
greg 1884eae191 Float literals, kill old code 2018-07-26 00:52:45 -07:00
greg bb880d44fa Some more primitive types + binop-checking 2018-07-26 00:52:45 -07:00
greg 22b4738726 Add required imports 2018-07-26 00:52:45 -07:00
greg 0202aab181 Some partial work on refactoring type infer fn 2018-07-26 00:52:45 -07:00
greg f9c9ed6b29 Add colored output to non-interactive 2018-07-26 00:52:45 -07:00
greg 04cb1616f7 Convert webapp to using included files 2018-07-26 00:52:45 -07:00
greg 5f1c46cb87 Fix type check macro to add symbol table 2018-07-26 00:52:45 -07:00
greg 0ea9bd3d95 More work with unification 2018-07-26 00:52:45 -07:00
greg 0cf56eea4f the evar table
TODO find a better way to represent this
2018-07-26 00:52:45 -07:00
greg ab53c5394e Unify work 2018-07-26 00:52:45 -07:00
greg f6c85951fe Move type-level func up 2018-07-26 00:52:45 -07:00
greg c530715671 Okay I am figuring things out about hindley-milner again 2018-07-26 00:52:45 -07:00
greg 617a30b967 rename type_var to ty 2018-07-26 00:52:45 -07:00
greg cd11d18385 String and () types 2018-07-26 00:52:45 -07:00
greg f82c6199c0 Change around some stuff 2018-07-26 00:52:45 -07:00
greg f75cd763f8 Change Variable to Value 2018-07-26 00:52:45 -07:00
greg 54c16f0190 Partial handling of user defined types 2018-07-26 00:52:45 -07:00
greg 8d8e3cd565 Starting to make unify actually work 2018-07-26 00:52:45 -07:00
greg 47975cf8f6 Convert unify to owned types
b/c Type implements Clone
Maybe wanna kill this later for efficiency
2018-07-26 00:52:45 -07:00
greg ddd861fbea Have + do something different with strings
Needed to introduce polymorphism soon
2018-07-26 00:52:45 -07:00
greg 200d0f9867 Operator typing a little bit 2018-07-26 00:52:45 -07:00
greg 3e44bd3a18 Slight refactoring 2018-07-26 00:52:45 -07:00
greg e2a94280c2 Renamed all the type-related types 2018-07-26 00:52:45 -07:00
greg c5b3bafe43 Move some type checking code around 2018-07-26 00:52:45 -07:00
greg b417451536 Basic typing test 2018-07-26 00:52:45 -07:00
greg a0faed3603 String types 2018-07-26 00:52:45 -07:00
greg 83752a1c74 Some more type work 2017-10-10 01:04:19 -07:00
greg 66c7bbeb07 Floats, pathspec changes 2017-10-09 04:02:50 -07:00
greg ed8359bcd7 Store constant state, func/binding as value
on symbol table, instead of key
2017-10-09 02:38:33 -07:00
greg 996f75e15c A lot more type work 2017-10-09 02:26:59 -07:00
greg 30a54d997c Simplify symbol table code 2017-10-09 00:59:52 -07:00
greg 4bcbf1854a Use universal/existential type data structures 2017-10-09 00:36:54 -07:00
greg f2c6556c2a Use name TypeVariable 2017-10-09 00:22:42 -07:00
greg 9161e2751f (Janky) type inference for explicitly-type-annotated values 2017-10-08 23:45:38 -07:00
greg 60fc9fd7e1 Super-basic type inference working
with a bunch of assumptions and hard-coded values
2017-10-08 23:33:53 -07:00
greg 3b249045aa Call needs to accept a general argument 2017-10-08 23:02:03 -07:00
greg ff0e14d9a9 Rename params -> args in Call Expr 2017-10-08 22:52:05 -07:00
greg 8fe535597e Starting to actually do Hindley-Milner!! 2017-10-08 22:48:10 -07:00
greg 4bb8f82579 Make AST output red 2017-10-08 22:17:29 -07:00
greg 5cb8423ecc Beginning for expressions 2017-10-08 22:07:18 -07:00
greg 4032707dc9 Kill some comments 2017-10-08 21:26:47 -07:00
greg 1a8423535a Add test for function decl 2017-10-08 21:25:51 -07:00
greg 338981febe Changed function signatures around slightly 2017-10-08 21:21:02 -07:00
greg 6dff8b029e Function definitions expanded 2017-10-08 20:55:05 -07:00
greg df877830d3 Fixed tests 2017-10-08 19:39:41 -07:00
greg 40696b3cbd Rename TypeAnno to TypeName everywhere 2017-10-08 19:30:52 -07:00
greg 40a82d7e25 Tests for new type stuff
+ some renaming
2017-10-08 19:15:08 -07:00
greg c605f76059 More type work II 2017-10-08 19:03:02 -07:00
greg a6d71821b9 More type work I 2017-10-08 18:47:57 -07:00
greg c4f0331d1a Symbol table addition should be separate stage 2017-10-08 16:24:44 -07:00
greg b4054d7581 Impl blocks 2017-10-08 14:24:02 -07:00
greg 74d3828c71 Symbol table debug needs to happen before type check 2017-10-08 13:59:55 -07:00
greg bb57da564d Infrastructure to debug symbol table 2017-10-08 13:57:43 -07:00
greg 3f9ae5fac3 Symbol table accepts variables 2017-10-08 13:51:56 -07:00
greg 62edc7c996 type checking / symbol table stuff 2017-10-08 12:22:04 -07:00
greg e412fb9a89 Convert type-checking function type 2017-10-07 22:08:48 -07:00
greg 3a97401f61 Add symbol table insertion method skeleton 2017-10-07 21:57:51 -07:00
greg 87cfe854ac Tuple literals 2017-10-06 20:28:07 -07:00
greg 184a2ae03a Change syntax for type alias 2017-10-04 22:02:31 -07:00
greg 50ceb92d6b Move type-checking into a module 2017-10-04 02:07:30 -07:00
greg dd7736e32d Add some resources 2017-10-03 21:28:17 -07:00
greg 3025af3ded Starting on impls 2017-10-03 03:49:07 -07:00
greg 24608362a5 Added a non-interactive schala source file 2017-10-02 23:42:34 -07:00
greg c83df6fd84 refactor main code 2017-10-02 23:33:07 -07:00
greg 65f64ebcc2 Add source file suffix to trait 2017-10-02 23:07:05 -07:00
greg 00ee802fbd Clear up clutter in code from using std::process:: 2017-10-02 23:00:11 -07:00
greg c88d59401c Making main.rs more concise 2017-10-02 22:58:03 -07:00
greg 1aa4e3b942 Get rid of virtual machine code
Gonna implement this differently
2017-10-02 20:34:51 -07:00
greg abbbb34901 Some very basic evaluation stuff 2017-10-02 20:11:27 -07:00
greg 3ff4a34aeb kill some non-used variable warnings 2017-10-02 01:52:46 -07:00
greg 7430aebe63 Webapp actually does something 2017-10-02 00:15:39 -07:00
greg 9071846df3 Flex on web input 2017-10-02 00:04:33 -07:00
greg 29d307ff53 Made web app a bit more useful 2017-10-01 23:25:36 -07:00
greg 89482e5b5a Use a Result 2017-10-01 19:31:43 -07:00
greg 6435d5e958 Make eval output nicer 2017-10-01 19:29:05 -07:00
greg f6536e7ebd Evaluation stuff 2017-10-01 19:09:55 -07:00
greg c5feea4597 Add basics of compiler design to readme 2017-10-01 18:37:10 -07:00
greg c5cb223168 Super-minimal type-checking with just ints 2017-10-01 17:50:26 -07:00
greg d16a0c9380 Evaluation of literals 2017-10-01 12:55:28 -07:00
greg daf9878020 Kill some unused code 2017-10-01 00:50:13 -07:00
greg f825c87397 Type checking beginnings 2017-10-01 00:48:08 -07:00
greg 8d2a65b44e Starting eval framework 2017-09-30 23:30:02 -07:00
greg 6b9fee1aed Made handling parse errors nicer 2017-09-30 14:41:37 -07:00
greg d05f173dd3 Using delimited! macro for more things 2017-09-30 14:11:38 -07:00
greg e88a0f59b5 Made macro less complicated 2017-09-30 13:46:50 -07:00
greg 90cf7db609 Use the delimiter macro in a few places
Made it capable of handling the strict <item> <delim> behavior necessary
for non-blocks, as well as the loose behavior necessary for blocks,
added a test for a parse error.
2017-09-30 13:04:56 -07:00
greg 1ae9dbcba7 Tests for tuple type annotations 2017-09-30 01:14:49 -07:00
greg 9214f36c04 Tests for type annotations 2017-09-29 19:10:37 -07:00
greg 98169bd352 update test for new type anno format 2017-09-29 14:53:16 -07:00
greg d60cf99ab5 Parsing sorta works
Need to handle double >> bug
2017-09-29 14:10:49 -07:00
greg bb93d29beb Some type anno parsing work 2017-09-28 23:55:10 -07:00
greg c20f93e18c A test for type annotation 2017-09-28 00:45:36 -07:00
greg f48adbd9bf Made the tests work again 2017-09-28 00:06:08 -07:00
greg 9ad506fc78 Handling type annotations in the AST 2017-09-27 22:27:50 -07:00
greg 4c81c36d67 Parse type annotations
Not using them yet
2017-09-26 22:10:13 -07:00
greg 230f2dd7ff More match expr work 2017-09-21 16:00:14 -07:00
greg 1615269a7b Browser stuff 2017-09-21 09:25:39 -07:00
greg 1a8ba8d8a2 Requests! 2017-09-21 00:01:54 -07:00
greg d4aec19c71 web: CodeArea 2017-09-20 23:46:46 -07:00
greg 10b1864680 Version-control yarn.lock 2017-09-20 23:23:17 -07:00
greg 4831a24853 yarn build script + rocket passthrough route 2017-09-20 23:21:45 -07:00
greg 67ff21d408 Basic yarn + browserify + babeljs infrastructure
For webapp
2017-09-20 23:15:29 -07:00
greg 8b83d982c0 More work on match expression 2017-09-20 21:05:08 -07:00
greg 6bff7aac0d Match expressions
not done yet
2017-09-20 20:30:30 -07:00
greg 18fa160fed Move source files into directory 2017-09-19 22:36:48 -07:00
greg 7ac5846282 A tiny bit more work on the webapp 2017-09-19 22:11:05 -07:00
greg 88209d370d Add .schala_repl to gitignore 2017-09-19 22:11:05 -07:00
greg 0f9d2d76c4 Serve an actual file
Convert this to stdweb soon?
2017-09-19 22:11:05 -07:00
greg 006fd7d411 Trying webapp
Note this doesn't work yet
2017-09-19 22:11:05 -07:00
greg e3b236a15d If expressions 2017-09-19 22:11:04 -07:00
greg 68bbd62ab6 Make token debug nicer 2017-09-19 22:11:04 -07:00
greg e47a2c7241 Save REPL config to file 2017-09-19 22:11:04 -07:00
greg a8b77848b4 kill old EBNF in comments 2017-09-19 22:11:04 -07:00
greg 839731f2d1 Make op! macro part of binexp!/prefexp! macros
For tests
2017-09-19 22:11:04 -07:00
greg f51e1a3c47 make Operation a tuple-style struct 2017-09-19 22:11:04 -07:00
greg fc350cd03e Prefix operators 2017-09-19 22:11:04 -07:00
greg 8fe7c85b00 return keyword + idea for how to use it in for 2017-09-19 22:11:04 -07:00
greg b920fae93b variable bindings 2017-09-19 22:11:04 -07:00
greg 1981b74d89 Nicen up precedence-testing 2017-09-19 22:11:04 -07:00
greg 077ab8ddb8 Add trace to binexp parser 2017-09-19 22:11:04 -07:00
greg 9775bfc342 bool literals 2017-09-19 22:11:04 -07:00
greg 505d23a327 Parse . operator 2017-09-19 22:11:04 -07:00
greg 81c4566c2b Improved operator parsing 2017-09-19 22:11:04 -07:00
greg 8be757beca Some changes to EBNF grammar 2017-09-19 22:11:04 -07:00
greg 20c74953b5 Get rid of unimplemented! and panic! 2017-09-19 22:11:04 -07:00
greg d5c3227966 Parse string literals 2017-09-19 22:11:04 -07:00
greg fbeb101e7f make parse trace have newlines 2017-09-19 22:11:04 -07:00
greg 18c761a5b5 Wrap all parse methods in record-printing macro 2017-09-19 22:11:04 -07:00
greg 89cf101362 Refactoring parse trace infra 2017-09-19 22:11:04 -07:00
greg 66d10604ba make parse_method! macro more naturalistic 2017-09-19 22:11:04 -07:00
greg 565461e1db Show more useful information in parse tracing 2017-09-19 22:11:04 -07:00
greg 5ecd28d057 Print parse record in REPL as TraceArtifact 2017-09-19 22:11:04 -07:00
greg 6c5dbac406 Starting to add logic to track recursive descent calls 2017-09-19 22:11:04 -07:00
greg 5dd1cd79ff Parsing call expressions 2017-09-19 22:11:03 -07:00
greg dfc89e5060 Index expressions
This only partially works
2017-09-19 22:11:03 -07:00
greg 5871bf68de test for function parsing 2017-09-19 22:11:03 -07:00
greg 34b569eb5f Starting to parse formal params 2017-09-19 22:11:03 -07:00
greg 3220f0aec7 A random idea 2017-09-19 22:11:03 -07:00
greg 5810fb7961 type alias test 2017-09-19 22:11:03 -07:00
greg 1a3076a949 Change syntax of rc macro 2017-09-19 22:11:03 -07:00
greg 1d9e5edfba Rudimentary type stuff 2017-09-19 22:11:03 -07:00
greg 555d2a7ba5 Identifier tests 2017-09-19 22:11:03 -07:00
greg 291fb61c8d Parse identifiers
Some more complicted types of expression
2017-09-19 22:11:03 -07:00
greg 685b579fdd paren exprs 2017-09-19 22:11:03 -07:00
greg f72e77cbb6 Remove printlns 2017-09-19 22:11:03 -07:00
greg 4b5afef17e Added one more test 2017-09-19 22:11:03 -07:00
greg 5889998126 Precedence parsing
Using the Pratt parser algorithm (probably with some bugs as of yet).
Need to clean up the code some as well
2017-09-19 22:11:03 -07:00
greg c19946bb6d Additional test 2017-09-19 22:11:03 -07:00
greg d1301b30e6 Added infra for operators 2017-09-19 22:11:03 -07:00
greg f8287e42ce Binary literal test case 2017-09-19 22:11:03 -07:00
greg bd6bf2f4bb Parse binary literal 2017-09-19 22:11:03 -07:00
greg a6b336d84c type anno EBNF 2017-09-19 22:11:03 -07:00
greg deab74b992 Kill extraneous lines 2017-09-19 22:11:03 -07:00
greg 0755d42112 More parsing work 2017-09-19 22:11:03 -07:00
greg 117e0e38a8 Starting types 2017-09-19 22:11:03 -07:00
greg 3f1de5f60d Kill unused struct 2017-09-19 22:11:03 -07:00
greg c52fd4c73d Parse test 2017-09-19 22:11:03 -07:00
greg cac3ea86cf Import TokenType and Kw everywhere 2017-09-19 22:11:03 -07:00
greg 92ece39d5e Only IntLiteral
Signed/unsigned is via - operator
2017-09-19 22:11:03 -07:00
greg 14c09bb40c Float literals too 2017-09-19 22:11:03 -07:00
greg 0dabbc700b Concise-ify code 2017-09-19 22:11:03 -07:00
greg 741e5f7f9b Parsing basic numbers! 2017-09-19 22:11:03 -07:00
greg cfefceabf9 More infra
Don't want EOF after all
2017-09-19 22:11:02 -07:00
greg ea08f8cab8 More parse infra 2017-09-19 22:11:02 -07:00
greg 7d1c07c481 Parsing infrastructure 2017-09-19 22:11:02 -07:00
greg 7831cb8d8a Start parsing 2017-09-19 22:11:02 -07:00
greg 16d9e3eb60 Colored text for artifacts 2017-09-19 22:11:02 -07:00
greg 737dad6438 Added some tests
And commented out old tests for Maaru that don't compile
2017-09-19 22:11:02 -07:00
greg 74f8c16599 Fix bug with _ 2017-09-19 22:11:02 -07:00
greg a82f24a158 Kill import 2017-09-19 22:11:02 -07:00
greg 6459ad28e8 A few more keywords 2017-09-19 22:11:02 -07:00
greg 5e0c7e5a95 Add some things to test.schala
Still playing with syntax
2017-09-19 22:11:02 -07:00
greg 57d4222746 Operators, keywords largely working 2017-09-19 22:11:02 -07:00
greg 88d1896281 Identifiers and keywords 2017-09-19 22:11:02 -07:00
greg 7fe0a6589e Unclosed string 2017-09-19 22:11:02 -07:00
greg ac5bdd7bcb Change some func signatures around tokenizing and errors 2017-09-19 22:11:02 -07:00
greg 8bf5f40a2a Some string tokenizing - not done 2017-09-19 22:11:02 -07:00
greg 7e505dd88e Stuff 2017-09-19 22:11:02 -07:00
greg f15427e5d9 A bunch of token stuff 2017-09-19 22:11:02 -07:00
greg 8230b115de Add test.schala file
With some syntax ideas
2017-09-19 22:11:02 -07:00
greg a53135a897 More elaborate tokens 2017-09-19 22:11:02 -07:00
greg f3c8474c93 Add help entry
Would like to make this generalizeable
2017-09-19 22:11:02 -07:00
greg 8dc8d15437 Cleaned up Repl struct 2017-09-19 22:11:02 -07:00
greg b5a6c5903e Switch to contentful output types 2017-09-19 22:11:02 -07:00
greg c97e58c2aa Cleared out all remaining linter warnings 2017-09-19 22:11:02 -07:00
greg cb9b56f000 Added back compilation 2017-09-19 22:11:02 -07:00
greg 55e1600b97 Kill old trait 2017-09-19 22:11:02 -07:00
greg fb009497a4 Still more cleanup 2017-09-19 22:11:02 -07:00
greg 4b13fef734 More cleanup 2017-09-19 22:11:02 -07:00
greg 14ccf9f1be Converted Robo to new style trait 2017-09-19 22:11:02 -07:00
greg 7a6dfbbd0e Deleting old code 2017-09-19 22:11:02 -07:00
greg bb3f85dd16 Getting rid of old code for maaru 2017-09-19 22:11:02 -07:00
greg 3e66568ddd Converted over Maaru to new schema
-partially...
2017-09-19 22:11:02 -07:00
greg 3abe299361 More work on new trait structure 2017-09-19 22:11:02 -07:00
greg 626b17cbd2 Idea for trait redesign 2017-09-19 22:11:02 -07:00
greg 192a7e611f Parsing BNF 2017-09-19 22:11:01 -07:00
greg d3febb201b More parsing 2017-09-19 22:11:01 -07:00
greg f9fe81993f Beginning parsing code 2017-09-19 22:11:01 -07:00
greg ff01d4b798 Initial Schala (for real) commit 2017-09-19 22:11:01 -07:00
greg dd22ca0291 Grand renaming of things 2017-09-19 22:11:01 -07:00
greg 801896bcc6 Starting to add code for vm 2017-09-19 22:11:01 -07:00
greg 6f8c89af37 Change wording 2017-09-19 22:11:01 -07:00
greg f82e1e699c Rust, not Haskell 2017-09-19 22:11:01 -07:00
greg a0f3583ab1 Forgot newline 2017-09-19 22:11:01 -07:00
greg e81d5e108b Improved README 2017-09-19 22:11:01 -07:00
greg 5d15d60ab6 Structs implemented
albeit very inefficiently
2017-09-19 22:11:01 -07:00
greg f0de3c3d12 Fix this source file 2017-09-19 22:11:01 -07:00
greg 9dd8f90e3c Only print last evaluated result 2017-09-19 22:11:01 -07:00
greg e0f5f01e69 Kill error messges for not using Result 2017-09-19 22:11:01 -07:00
greg 424998c128 Lists work! 2017-09-19 22:11:01 -07:00
greg b93625819c Update ReducedValue to handle lists 2017-09-19 22:11:01 -07:00
greg f90bfb88ca Fix display of list 2017-09-19 22:11:01 -07:00
greg 850b77541b Display of lists sorta works 2017-09-19 22:11:01 -07:00
greg dbf5886aad List evaluation technically working 2017-09-19 22:11:01 -07:00
greg dd93adf5b7 try!() -> ? 2017-09-19 22:11:01 -07:00
greg d8df98ba01 Beginnings of list literals 2017-09-19 22:11:01 -07:00
greg 4da771036a Part of evaluation path for indexing done 2017-09-19 22:11:01 -07:00
greg 3911c45dde Introduced index notation 2017-09-19 22:11:01 -07:00
greg f3c3d4595e Immediate lambda call 2017-09-19 22:11:01 -07:00
greg e4a42e7691 Add back eval printing 2017-09-19 22:11:01 -07:00
greg cc537f292d Starting Maaru AST 2017-09-19 22:11:01 -07:00
greg 840e093bc4 Maaru - token work 2017-09-19 22:11:01 -07:00
greg 815f2b8242 Starting work on Maaru tokens 2017-09-19 22:11:01 -07:00
greg 34dba9cc4d Schala - fix bug with comments 2017-09-19 22:11:01 -07:00
greg 6e28ae68a0 Add options 2017-09-19 22:11:01 -07:00
greg 48b0b8d053 Add logic for picking language with command line flags 2017-09-19 22:11:01 -07:00
greg e0c49abe56 Change show-llvm opt to -v 2017-09-19 22:11:01 -07:00
greg 65dc362a1d Killed some warnings, cleaned up some code 2017-09-19 22:11:01 -07:00
greg 8ff1c632c2 Make REPL friendlier 2017-09-19 22:11:01 -07:00
greg 039022bfc5 Get rid of println 2017-09-19 22:11:01 -07:00
greg 387ec25cda Fix bugs in interpreter argument parsing 2017-09-19 22:11:01 -07:00
greg ecf60198fa Can now switch between languages in the interpreter 2017-09-19 22:11:01 -07:00
greg f83cece2b4 Import Maaru into main 2017-09-19 22:11:00 -07:00
greg 8fd5fb5a0b Added language name functionality 2017-09-19 22:11:00 -07:00
greg 455fe2abe2 Get rid of stand alone evaluator 2017-09-19 22:11:00 -07:00
greg 902c85ccd7 Fully implemented state
If I make the LanguageInterface trait over a pair of language and
evaluator, then it works :)
2017-09-19 22:11:00 -07:00
greg 4ea600d55c Abstracted most work into LanguageInterface trait
Still need to handle state
2017-09-19 22:11:00 -07:00
greg 6dec35d460 Think I've nearly gotten it traitified correctly... 2017-09-19 22:11:00 -07:00
greg cc855affbf Make Maaru structs public 2017-09-19 22:11:00 -07:00
greg a303aa2a5b Add first new language - Maaru
Maaru is intended to be a haskell-ish functional language.
Here's enough of a skeleton to print a thing
2017-09-19 22:11:00 -07:00
greg 421a9a7e9b Abstract evaluation into EvaluationMachine trait 2017-09-19 22:11:00 -07:00
greg f37ab80163 Fix tests 2017-09-19 22:11:00 -07:00
greg 178434171e Cleaning up some types 2017-09-19 22:11:00 -07:00
greg fd4610e175 Make newtype for LLVM code strings 2017-09-19 22:11:00 -07:00
greg 5103f03fa5 Forgot to add mod.rs file 2017-09-19 22:11:00 -07:00
greg 1a4bf24ab1 Move schala-specific stuff into its own module 2017-09-19 22:11:00 -07:00
greg 9d6bdf22da More conversions to trait version 2017-09-19 22:11:00 -07:00
greg 8326a12c9c (Largely) trait-ify Schala
The idea is to provide a trait `ProgrammingLanguage` that actually does
all the work, and then main.rs will just be the infra for storing its
own state
2017-09-19 22:11:00 -07:00
greg 5e474231da ProgrammingLanguage types need Debug 2017-09-19 22:11:00 -07:00
greg 1ac440c8df Implement trait parse 2017-09-19 22:11:00 -07:00
greg f5022a771c Starting work to trait-ify language 2017-09-19 22:11:00 -07:00
greg eaf86ea908 Add support for +, - in num literals 2017-09-19 22:11:00 -07:00
greg eb6354e55a Only print errors if the programs failed 2017-09-19 22:11:00 -07:00
greg 751c6f65bd Deleted some code in compilation 2017-09-19 22:11:00 -07:00
greg 3e231b4715 Use native rust to write source file 2017-09-19 22:11:00 -07:00
greg e103ba221c Conditionals work! 2017-09-19 22:11:00 -07:00
greg d5f01a7b1f Continuing work on phi nodes 2017-09-19 22:11:00 -07:00
greg 1702163478 Add flag for llvm 2017-09-19 22:11:00 -07:00
greg bdd6f75cf6 Show/hide LLVM IR in REPL 2017-09-19 22:11:00 -07:00
greg 2681dbc4f2 Add test for "a+4" being counterintuitive
Also fix Op -> BinOp in tests
2017-09-19 22:11:00 -07:00
greg 9454fc6194 Add return statements to generated functions 2017-09-19 22:11:00 -07:00
greg c8feaa9b57 Add logging of supplemental commands 2017-09-19 22:11:00 -07:00
greg 518414ffd5 I was doing a wrong thing with creating vecs
The old vector was getting dropped and thus free-ing the old
underlying slice. I want to use set_len() on Vec to do
this
2017-09-19 22:11:00 -07:00
greg 06a5de6e32 Trying to debug this segfault 2017-09-19 22:11:00 -07:00
greg dd4816624c Change name of project to Schala 2017-09-19 22:11:00 -07:00
greg 748a85db02 Compiling functions half-works 2017-09-19 22:11:00 -07:00
greg 8f2d9b900b Function codegen sorta works 2017-09-19 22:11:00 -07:00
greg b9d1140264 Refactored op compilation code
+ moved to separate function
2017-09-19 22:11:00 -07:00
greg 7188a7d33e Clarified that we hardcode a "main" function
in compiler data structure
2017-09-19 22:10:59 -07:00
greg 0c7099771f Comparison operator working 2017-09-19 22:10:59 -07:00
greg afec7e829c There's some segfault happening in LLVMBuildUIToFP 2017-09-19 22:10:59 -07:00
greg a6773d59bd Refactoring in compiling binops 2017-09-19 22:10:59 -07:00
greg d804efdc5e Use BinOp type instead of strings 2017-09-19 22:10:59 -07:00
greg 0ace370fc2 Tightened up REPL loop 2017-09-19 22:10:59 -07:00
greg 1f50fcc620 Improvements to interpreter directives parsing 2017-09-19 22:10:59 -07:00
greg d7181afa91 Few more linefeed-related changes 2017-09-19 22:10:59 -07:00
greg 4eb7683f47 Move linefeed reader to struct 2017-09-19 22:10:59 -07:00
greg b04a8f0092 Add back interpreter directives 2017-09-19 22:10:59 -07:00
greg c50be58cd2 Moved from simplerepl to lineread crate 2017-09-19 22:10:59 -07:00
greg 5911a07f4f Inline parsing of lambdas like half-works 2017-09-19 22:10:59 -07:00
greg 26bc6e90f3 Lambda calls partially work 2017-09-19 22:10:59 -07:00
greg b0655d7cab need to flush stdout for printing 2017-09-19 22:10:59 -07:00
greg a46ede9395 Made evaluation-printing more sophisticated 2017-09-19 22:10:59 -07:00
greg d9ab5a58cf Add some methods to llvm_wrap 2017-09-19 22:10:59 -07:00
greg 77297c7e06 Add lambdas 2017-09-19 22:10:59 -07:00
greg d93b5c0a2e Still cranking away at conditional compilation 2017-09-19 22:10:59 -07:00
greg 0b9dc113d1 Closer to working now 2017-09-19 22:10:59 -07:00
greg d6fc13f08d Fix a couple of problems 2017-09-19 22:10:59 -07:00
greg 825c271b17 More work on codegen for conditionals
Still doesn't compile
2017-09-19 22:10:59 -07:00
greg 8c4f7e141a Compiling if statements like half done 2017-09-19 22:10:59 -07:00
greg 12fbc51da1 Compile multi-expression source programs 2017-09-19 22:10:59 -07:00
greg db108ee434 Unicode should work 2017-09-19 22:10:59 -07:00
greg 7ddb421ced Exit cleanly on opt parse fail 2017-09-19 22:10:59 -07:00
greg 1631bb0a04 Fix tests for conditionals 2017-09-19 22:10:59 -07:00
greg 5923cc2317 Kill then, else keywords 2017-09-19 22:10:59 -07:00
greg 1fa56800c5 Convert parsing while, if, fn exprs to use { } 2017-09-19 22:10:59 -07:00
greg 2b4d3e8516 Add support for curly braces and brackets
Gonna make this a curly-brace language, I like those better. Shoulda
done that to begin with.
2017-09-19 22:10:59 -07:00
greg 9b74527618 Control printing eval steps with flags 2017-09-19 22:10:59 -07:00
greg d23e5bff35 Add an Op type for binop operators
Soon this will get swapped in as the way that BinOps are evaluated
2017-09-19 22:10:59 -07:00
greg 3a4f5ae840 Change name Op -> OpTok
So that I can make an Op type for the ASTNode
2017-09-19 22:10:59 -07:00
greg 298194c42d Finish support for assignment operators 2017-09-19 22:10:59 -07:00
greg 23d2209d8b Implementing a few more operators
WIP - not done
2017-09-19 22:10:59 -07:00
greg 4cf165b408 Use buffered reader for stdout
Not sure if this is actually helping
2017-09-19 22:10:59 -07:00
greg 154839979b Add nicer handle_builtin method 2017-09-19 22:10:59 -07:00
greg 6741787852 Update references in README 2017-09-19 22:10:59 -07:00
greg 538f0b18f4 Evaluate while loop 2017-09-19 22:10:59 -07:00
greg dc81d237c5 Reduce re-allocations in eval 2017-09-19 22:10:58 -07:00
greg 8651839a66 Getting rid of some newlines - concision 2017-09-19 22:10:58 -07:00
greg f6e5ea250d Convert while_expr to delimiter_block! too 2017-09-19 22:10:58 -07:00
greg 9801f53a17 Moved conditionals to delimiter_block! syntax 2017-09-19 22:10:58 -07:00
greg db92292569 Fixed all tests 2017-09-19 22:10:58 -07:00
greg e1ce54aece Add delimiter_block macro 2017-09-19 22:10:58 -07:00
greg c227ad656f Parser simplifications, renames, etc. 2017-09-19 22:10:58 -07:00
greg b45d09e81a Don't need this reference 2017-09-19 22:10:58 -07:00
greg 761500b9d6 Some cleanups in Parser
-get rid of some use statements
-make error messages better
2017-09-19 22:10:58 -07:00
greg e888e82404 Remove some unnecessary destructurings of Rc<String> 2017-09-19 22:10:58 -07:00
greg 328ec4ba87 Converted like half the Strings to Rc
-still need to eliminate some clones in eval, parse
+ fix all the tests
2017-09-19 22:10:58 -07:00
greg 4a7b570603 Parser changes - add precedences, move definitions
Move impls of Display for AST subtypes closer to where they are defined
2017-09-19 22:10:58 -07:00
greg 7eb48fb4ef Working on compilation again 2017-09-19 22:10:58 -07:00
greg 8ebf1b3056 Add parser support for while statements 2017-09-19 22:10:58 -07:00
greg 905431b33c Change name: ASTNode -> Statement 2017-09-19 22:10:58 -07:00
greg 2996198eff lookup_binding only needs &str 2017-09-19 22:10:58 -07:00
greg 06771979df Function bodies can contain statements now 2017-09-19 22:10:58 -07:00
greg f158b6c712 Converted to multiple-evaluator logic
Now I have (basically) full single-step evaluation and it works fine
2017-09-19 22:10:58 -07:00
greg ba8f67441f Conditionals - handle delimiters correctly 2017-09-19 22:10:58 -07:00
greg 872e9ce7ee Make function binding a SideEffect 2017-09-19 22:10:58 -07:00
greg 2722533efd Prove recursion works 2017-09-19 22:10:58 -07:00
greg edf342e65a Add == operator 2017-09-19 22:10:58 -07:00
greg 27d4c2ccbd No references in pattern-matching 2017-09-19 22:10:58 -07:00
greg 6794d22f1d Run rustfmt on eval.rs, parser.rs 2017-09-19 22:10:58 -07:00
greg 1858d26638 Add comparison operators
+  make operator evaluation more concise
2017-09-19 22:10:58 -07:00
greg 84fbe73cf6 Add Lambda type
And change name FuncNode -> FuncDefNode

Now function definition nodes reduce to a Lambda, which is not
reducible.
2017-09-19 22:10:58 -07:00
greg ad994c38ae Test simplification 2017-09-19 22:10:58 -07:00
greg 48343d3fad Tightened tokenization tests 2017-09-19 22:10:58 -07:00
greg 4f8ff35d0f fixed bug with ends_identifier 2017-09-19 22:10:58 -07:00
greg b210ad5e19 Added links to README 2017-09-19 22:10:58 -07:00
greg 7311d0311f Simplify pattern a little bit 2017-09-19 22:10:58 -07:00
greg 1b59c264b4 Use itertools peeking_take_while
Cuts down on lines in the tokenizer
2017-09-19 22:10:58 -07:00
greg b2e453a9de Rewrite of tokenizer 2017-09-19 22:10:58 -07:00
greg 59226eb731 Ran rustfmt on parser.rs 2017-09-19 22:10:58 -07:00
greg 297003c0b0 Operator only needs to be a tuple struct 2017-09-19 22:10:58 -07:00
greg e5ee072b00 Convert tokenizer to large match statement
In the hopes of making it shorter
2017-09-19 22:10:58 -07:00
greg 9b62efc830 Fix conditional parsing
Needed to account for semicolons/newlines. Maybe need to generalize
delimiter-separated list of things
2017-09-19 22:10:58 -07:00
greg f626ca1427 Add test for conditional parsing 2017-09-19 22:10:57 -07:00
greg 82c52ede48 Finish evaluating conditionals 2017-09-19 22:10:57 -07:00
greg 7f52b20d97 Conditional eval partway implemented
Need to change the AST representation to support compound statements I
think
2017-09-19 22:10:57 -07:00
greg 12fee6158c Vector'd expressions don't need to be boxed 2017-09-19 22:10:57 -07:00
greg 2d21de7cc3 Added support for conditionals to parser
Not to eval yet
2017-09-19 22:10:57 -07:00
greg 9cc9c5977d Fixed evaluation of function calls
This bit still isn't quite small-step but maybe that's okay for
functions
2017-09-19 22:10:57 -07:00
greg ed9d1312d1 Change representation of variables 2017-09-19 22:10:57 -07:00
greg 29d9e50311 All environment changes represented explicitly
Start the work of rewriting the evaluator to be a true small-step
evaluator - that is, all state changes are represented explicitly as
SideEffect types, and not as methods called on the evaluator, except at
the very top of the evaluation loop
2017-09-19 22:10:57 -07:00
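The commit above describes returning state changes as explicit `SideEffect` values that only the top-level loop applies. A toy sketch of that shape, with illustrative types (not the repo's actual definitions):

```rust
use std::collections::HashMap;

#[derive(Debug, Clone)]
enum Value {
    Num(f64),
}

// Evaluation steps return these instead of mutating the evaluator directly;
// only the top of the evaluation loop applies them.
enum SideEffect {
    Bind(String, Value),
    Print(String),
}

fn apply(env: &mut HashMap<String, Value>, effect: SideEffect) {
    match effect {
        SideEffect::Bind(name, v) => {
            env.insert(name, v);
        }
        SideEffect::Print(s) => println!("{}", s),
    }
}

fn main() {
    let mut env = HashMap::new();
    // Effects produced by some hypothetical reduction step.
    let effects = vec![
        SideEffect::Bind("a".into(), Value::Num(1.0)),
        SideEffect::Print("bound a".into()),
    ];
    for e in effects {
        apply(&mut env, e);
    }
    println!("env has {} binding(s)", env.len());
}
```

Keeping mutation at one point makes each reduction step a pure value-to-value function, which is what makes true small-step evaluation (and later, tracing it) tractable.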
greg e84550f3ec Kill two compilation warnings 2017-09-19 22:10:57 -07:00
greg 3063de1242 Run rustfmt on the rest of them 2017-09-19 22:10:57 -07:00
greg e1d07b4e66 Rustfmt on llvm_wrap 2017-09-19 22:10:57 -07:00
greg af45004afa Run rustfmt on some files 2017-09-19 22:10:57 -07:00
greg d4d61ce5ad Use process::exit in main 2017-09-19 22:10:57 -07:00
greg 743311d18a Convert Tokenize to Result 2017-09-19 22:10:57 -07:00
greg 17f9846bb9 Make tests pass w/ new match syntax 2017-09-19 22:10:57 -07:00
greg fe8418edbe Kill some extraneous spaces 2017-09-19 22:10:57 -07:00
greg 1d8102b9fa Make compilation controllable 2017-09-19 22:10:57 -07:00
greg aac3ca40fe Add getopts 2017-09-19 22:10:57 -07:00
greg c389b44bf8 Compile statements with variables 2017-09-19 22:10:57 -07:00
greg db52f9b716 Package compilation context into one struct 2017-09-19 22:10:57 -07:00
greg d3a743442b = should have high precedence 2017-09-19 22:10:57 -07:00
greg e46d840d96 Move more code to llvm_wrap
And silence a few compiler warnings
2017-09-19 22:10:57 -07:00
greg 3d406f1dd2 Move llvm wrapper into separate file 2017-09-19 22:10:57 -07:00
greg fca307d3ab Package more functions in LLVMWrapper 2017-09-19 22:10:57 -07:00
greg a65544356c Fix bug in delete routine 2017-09-19 22:10:57 -07:00
greg 10d51cc29c Compile binexps 2017-09-19 22:10:57 -07:00
greg 95b773de7f Compilation sequence
-move all steps of the llvm IR compilation process into the binary
2017-09-19 22:10:57 -07:00
greg c032da712f Successfully compiling a main function 2017-09-19 22:10:57 -07:00
greg cfb93d9dea Compilation that returns an output code works 2017-09-19 22:10:57 -07:00
greg 08798fb690 Move all LLVM functions into LLVMWrap
so I can kill the global unsafe
2017-09-19 22:10:57 -07:00
greg e0eef5e58f More moving functions around 2017-09-19 22:10:57 -07:00
greg 2107e6344e More function wrapping 2017-09-19 22:10:57 -07:00
greg 2ad65a1c5d Some in-progress work 2017-09-19 22:10:57 -07:00
greg a01b6c874e Small-step function arg evaluation 2017-09-19 22:10:57 -07:00
greg 907af38f44 Evaluate printing 2017-09-19 22:10:57 -07:00
greg 923566c4e9 Get rid of println's for token/ast debugging
Instead just explicitly stick them into the returned string.

This is necessary 'cause I'm gonna convert simplerepl to use ncurses
soon, so I can't have any side effects
2017-09-19 22:10:57 -07:00
greg 96c51a9b88 Move some more unsafe code into block 2017-09-19 22:10:57 -07:00
greg 8528c912bd Added side effect framework 2017-09-19 22:10:57 -07:00
greg 47f4c25020 Doing some wrapping 2017-09-19 22:10:56 -07:00
greg 6ed67cd325 Add test .schala file 2017-09-19 22:10:56 -07:00
greg c18c1a639a Compilation does something useful
Following along with:
http://blog.ulysse.io/2016/07/03/llvm-getting-started.html
2017-09-19 22:10:56 -07:00
greg d3f6bbabaa Starting to add actual LLVM generating code 2017-09-19 22:10:56 -07:00
greg 5d0648b9c0 Ignore Cargo.lock 2017-09-19 22:10:56 -07:00
greg ba7a2e658f Don't track Cargo.lock 2017-09-19 22:10:56 -07:00
greg ee12e10ac6 Fix slice syntax 2017-09-19 22:10:56 -07:00
greg 6a0b50f278 Kill hashmap import for now 2017-09-19 22:10:56 -07:00
greg 6c44d295db Framework for compilation 2017-09-19 22:10:56 -07:00
greg cd69ebaa9d Split main() into subfunctions 2017-09-19 22:10:56 -07:00
greg 3fe9ec95d5 Fix repl error message 2017-09-19 22:10:56 -07:00
greg 8b5d1ecd15 Kill compilation for now
Don't wanna deal with this right this second
2017-09-19 22:10:56 -07:00
greg 272fa92052 More fleshing out 2017-09-19 22:10:56 -07:00
greg 0c717c721e Import Function type 2017-09-19 22:10:56 -07:00
greg 64d560a1fc Still more work 2017-09-19 22:10:56 -07:00
greg 8a92d5ffa8 Partial LLVM work 2017-09-19 22:10:56 -07:00
greg 5ed0ed727b Starting to add LLVM code 2017-09-19 22:10:56 -07:00
greg 5428810d2c Add llvm dependencies 2017-09-19 22:10:56 -07:00
greg b48c007bca Get rid of old evaluator file
No longer need it
2017-09-19 22:10:56 -07:00
greg 5aa4c404a5 Added conditionals to grammar 2017-09-19 22:10:56 -07:00
greg 0e3aaa8b08 Add conditional expression support 2017-09-19 22:10:56 -07:00
greg f33cfdadfe Fix variable lookup order
Evaluator uses frames as a stack, so to find the most-recent variable
binding we have to iterate through it the reverse way. This fixes a bug
manifested by:

fn a(x)
 x + 20
end

fn b(x)
 a(x) + 20
end
2017-09-19 22:10:56 -07:00
greg 231de69084 Added infrastructure for reading files 2017-09-19 22:10:56 -07:00
greg f014c1a9d9 Kill compiler warnings 2017-09-19 22:10:56 -07:00
greg c36bc3377e Fix lookup_binding in function call case 2017-09-19 22:10:56 -07:00
greg 5eba222679 Variable binding in function calls works 2017-09-19 22:10:56 -07:00
greg dcf89aa429 Move lookup_function back onto Evaluator
The problem was that we were borrowing the output of the inner HashMap;
if we clone it we don't borrow anything
2017-09-19 22:10:56 -07:00
greg 1ffbeb6472 Return last value out of function 2017-09-19 22:10:56 -07:00
greg f8a521fc9b Start making function calls work
This commit isn't fully done yet
2017-09-19 22:10:56 -07:00
greg 77f72806be Add support for multiple frames 2017-09-19 22:10:56 -07:00
greg dcde5d6018 Start function call reduction
Move the varmap and funcmap functions to the Evaluator, to make it
easier to facilitate separate frames
2017-09-19 22:10:56 -07:00
greg 6bb227d052 Rename methods
make reduce() the entry point to evaluation
2017-09-19 22:10:56 -07:00
greg 29d4cb53a4 Add infrastructure to do function evaluation
Right now you can successfully evaluate a function definition (returning
Null), but cannot call a function
2017-09-19 22:10:56 -07:00
greg 3915c1f035 Remove println from reduction 2017-09-19 22:10:56 -07:00
greg b400796e4d Get rid of test variable input 2017-09-19 22:10:56 -07:00
greg 96e6a87f64 Add string concat 2017-09-19 22:10:56 -07:00
greg b1f9e5cefc Fix variable lookup 2017-09-19 22:10:56 -07:00
greg be36d4697d Pretty-print evaluated AST nodes 2017-09-19 22:10:56 -07:00
greg ce8c511929 Evaluate additional operators 2017-09-19 22:10:55 -07:00
greg a8cafa8c64 Move Evaluator state into interpreter state 2017-09-19 22:10:55 -07:00
greg 229e6ae733 More expression parsing work 2017-09-19 22:10:55 -07:00
greg e9dd0d9ae8 Add concept of Null expression
Finally, the null-only behavior is starting to manifest itself!
2017-09-19 22:10:55 -07:00
greg 15d4317191 Add null expression 2017-09-19 22:10:55 -07:00
greg 7114e446a4 Adding, subtracting works 2017-09-19 22:10:55 -07:00
greg 044f534ac5 Start implementing variable lookup 2017-09-19 22:10:55 -07:00
greg d3207ad890 Move evaluation logic back into methods
They need to be able to access the environment which is stored in the
Evalator struct
2017-09-19 22:10:55 -07:00
greg 19fffd5063 Variable binding infrastructure 2017-09-19 22:10:55 -07:00
greg 785c916ece Start reducing ASTs
Start writing code to reduce AST nodes
2017-09-19 22:10:55 -07:00
greg 5a9ebb188d Make Evaluable trait 2017-09-19 22:10:55 -07:00
greg 16e8d969be Add basic evaluation 2017-09-19 22:10:55 -07:00
greg 70bf68d9bd More concision in parser 2017-09-19 22:10:55 -07:00
greg f53c14535b Made error! macro more programmatic
TODO implement Display on Token so we're not just displaying the debug
name of the token enum variants
2017-09-19 22:10:55 -07:00
greg 4f96abd7d9 Changes to make the code more concise 2017-09-19 22:10:55 -07:00
greg fdaf4c302c Fix all compiler warnings 2017-09-19 22:10:55 -07:00
greg 8ce53d7c72 Fix bind error 2017-09-19 22:10:55 -07:00
greg 428d560e2a Add tests for call expr parsing 2017-09-19 22:10:55 -07:00
greg 80bc7ec089 Proper call expression parsing 2017-09-19 22:10:55 -07:00
greg e6591b80d9 Add paren test 2017-09-19 22:10:55 -07:00
greg e099f713ad Add binop parsing test 2017-09-19 22:10:55 -07:00
greg 9b54256521 Import types for brevity
and rename function to be explicit
2017-09-19 22:10:55 -07:00
greg 087402ece6 Add more tests
Need to use box patterns
2017-09-19 22:10:55 -07:00
greg 252b6e8bd9 Okay, this strategy makes the test work 2017-09-19 22:10:55 -07:00
greg b1163e2ae4 Operator-precedence parsing + tests
The tests are crippled now, because it's hard to write a test macro that
can also match on Strings
2017-09-19 22:10:55 -07:00
greg 032d01c9f5 Fix tokenization bug 2017-09-19 22:10:55 -07:00
greg c4ab1ed105 Fix tokenizer tests 2017-09-19 22:10:55 -07:00
greg 9a257f08d7 Introduce Op type
For operator parsing
2017-09-19 22:10:55 -07:00
greg 47d56a7b44 fix operator parsing 2017-09-19 22:10:55 -07:00
greg 1f7ae2e30f Paren expression 2017-09-19 22:10:55 -07:00
greg e3c8753a4d Expression parsing 2017-09-19 22:10:55 -07:00
greg e1aa7ecb17 Finish tokenizing Op separately 2017-09-19 22:10:55 -07:00
greg f09a6e14ba Tokenizer work to support operators
work in progress but committing to transfer
2017-09-19 22:10:55 -07:00
greg 31da25a66e Expression parsing work 2017-09-19 22:10:55 -07:00
greg fff9cb7d25 Fix function parsing 2017-09-19 22:10:55 -07:00
greg db1e188fdb Move grammar to top of file 2017-09-19 22:10:55 -07:00
greg 935185ed92 more parsing 2017-09-19 22:10:55 -07:00
greg 0999cbe28e More parsing work 2017-09-19 22:10:55 -07:00
greg 674f70a428 Convert parsing to method-based 2017-09-19 22:10:54 -07:00
greg b1b6672399 Implement function parsing
With a lot of dummy code, especially around expression parsing
2017-09-19 22:10:54 -07:00
greg bd1c455dc8 Basic infrastructure parses
Also got rid of EOF, don't need it
2017-09-19 22:10:54 -07:00
greg b62ef43f07 Add basic BNF grammar 2017-09-19 22:10:54 -07:00
greg 09b67dc3f7 Change error message 2017-09-19 22:10:54 -07:00
greg a613fa73e5 Basic parsing framework 2017-09-19 22:10:54 -07:00
greg 8c473c554e Fix bug 2017-09-19 22:10:54 -07:00
greg 3e04cbfa29 Add comma tokenization 2017-09-19 22:10:54 -07:00
greg 570650cbfa Finish keyword tokenization 2017-09-19 22:10:54 -07:00
greg 49be163181 Add test to ignore
For better handling of user-defined operators, which I will do in the
future
2017-09-19 22:10:54 -07:00
greg b4f93acbd8 Couple more tests 2017-09-19 22:10:54 -07:00
greg 8c65ae3214 Macro-ize token tests 2017-09-19 22:10:54 -07:00
greg e436533638 Passing test 2017-09-19 22:10:54 -07:00
greg 4601a56867 Start working on tokenization tests 2017-09-19 22:10:54 -07:00
greg 2f7a1850db Finish tokenizing 2017-09-19 22:10:54 -07:00
greg 71aef379d3 Tokenize number literals
TODO: expand this bit of code to handle 0x12, etc. syntax
2017-09-19 22:10:54 -07:00
greg 8662a3ba0e Make tokenize error-able 2017-09-19 22:10:54 -07:00
greg 5ca98c7d77 Print tokeniziation 2017-09-19 22:10:54 -07:00
greg 13cde3106c Start making tokenizer changes
Hopefully this time iron out all the bugs from the last implementation
2017-09-19 22:10:54 -07:00
greg 09d524c74a Changing how parsing works again
Someone wrote a port of the LLVM kaleidoscope tutorial to rust, namely
https://github.com/jauhien/iron-kaleidoscope

I'm just gonna follow this along
2017-09-19 22:10:54 -07:00
greg 61c36c4def Fix assign parsing
= is a keyword not an identifier
2017-09-19 22:10:54 -07:00
greg bc4fbe4276 Start implementing definition
WIP, doesn't work
2017-09-19 22:10:54 -07:00
greg fc11ee753d Add block parsing
Right now evaluating a block reduces it to just the last AST in it, will
fix later with environments
2017-09-19 22:10:54 -07:00
greg dd2b4893a4 Get rid of Separator token
Have separate newline and semicolon tokens
2017-09-19 22:10:54 -07:00
greg 51745fd800 Make convenience macro for parse errors 2017-09-19 22:10:54 -07:00
greg 8c0ac19fa8 Add full test
Test evaluation, tokenization, and parsing all at once
2017-09-19 22:10:54 -07:00
greg 971ab9ba21 Implement paren expressions 2017-09-19 22:10:54 -07:00
greg 819fb3f58f Basic evaluator functionality
Interpreter now works for simple arithmetic expressions
2017-09-19 22:10:54 -07:00
greg 1c23329656 Add boilerplate for evaluation
Just wires everything up, doesn't actually evaluate yet
2017-09-19 22:10:54 -07:00
greg 6da20cbfaf Address compiler warnings 2017-09-19 22:10:54 -07:00
greg f48451125e Parsing arithmetic expressions works
At the expense of an unnecessary move in lookahead()
2017-09-19 22:10:54 -07:00
greg 06c3999430 Add expect_identifier function
For utility
2017-09-19 22:10:54 -07:00
greg 1af1589550 Make Parser class internally use IntoIterator
And wrap the next() and peek() methods
2017-09-19 22:10:54 -07:00
greg 32a90b8103 write expect
and also make the ParseResult type more general so it can be used for
more things.
2017-09-19 22:10:54 -07:00
greg 186c900920 Implement expect macro
Seems like there's no way around passing in self manually
2017-09-19 22:10:54 -07:00
greg 509ab80b9c I can now parse one thing 2017-09-19 22:10:54 -07:00
greg 247638c4db Get parsing REPL working 2017-09-19 22:10:54 -07:00
greg be98f8387e Add another test 2017-09-19 22:10:54 -07:00
greg 841b38d5b1 Add tokenization test 2017-09-19 22:10:53 -07:00
greg 16dfbb27d5 Use state features from simplerepl 2017-09-19 22:10:53 -07:00
greg 3af7e6a409 Refactoring Schala
Gonna work on Schala in earnest now! Using the simplerepl crate instead
of a built-in REPL, temporarily dropping parsing and evaluation code.
2017-09-19 22:10:53 -07:00
greg 123f388711 Rename language to "Schala" 2017-09-19 22:10:53 -07:00
greg bb349cda5f Fix newline tokenizing bug
Still need to fix <statements> parsing because of the final newline
2017-09-19 22:10:53 -07:00
greg caa331ecdc Read from file as well as repl 2017-09-19 22:10:53 -07:00
greg ae3a030ad8 Use iterative semantics for reduction
Because if you try to make reduce() recursive you blow out the stack.

Incidentally, "let a = 0; while a < 999999; let a = a + 1 end" is a neat
thing to try at this stage of the game
2017-09-19 22:10:53 -07:00
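The stack blowout described above is why reduce() is iterative: a recursive reducer re-enters the host stack once per step, so a long object-language loop like "while a < 999999" overflows it, while a host-level loop keeps stack depth constant. A toy sketch of the shape (types and names here are made up, not the repo's):

```rust
// One small reduction step at a time, driven by a flat loop instead of
// recursion. Counting to 999_999 mirrors the "let a = a + 1" example.
enum State {
    Counting(u64),
    Done(u64),
}

// A single small step: either increment the counter or finish.
fn step(s: State) -> State {
    match s {
        State::Counting(n) if n < 999_999 => State::Counting(n + 1),
        State::Counting(n) => State::Done(n),
        done => done,
    }
}

// Iterative driver: constant stack depth regardless of step count.
fn reduce(mut s: State) -> u64 {
    loop {
        match s {
            State::Done(n) => return n,
            other => s = step(other),
        }
    }
}

fn main() {
    // A million steps without growing the call stack.
    println!("{}", reduce(State::Counting(0)));
}
```

Had `reduce` instead called itself on `step`'s result, each of those ~10^6 steps would have added a stack frame.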
greg ddb09b453d Implement booleans 2017-09-19 22:10:53 -07:00
greg 9a4760d44f Implement while-reduction 2017-09-19 22:10:53 -07:00
greg eb5ce2ef9e Don't print newline for empty REPL result 2017-09-19 22:10:53 -07:00
greg b080ea7c81 Add if-statement evaluation 2017-09-19 22:10:53 -07:00
greg dffab8ae94 Fixed arg list parsing 2017-09-19 22:10:53 -07:00
greg 48ee6c9a75 Add function name to parse 2017-09-19 22:10:53 -07:00
greg 58d399dace More function parsing 2017-09-19 22:10:53 -07:00
greg 57ea1bae30 Preliminary addition of function blocks 2017-09-19 22:10:53 -07:00
greg e870d8172a Make = a keyword 2017-09-19 22:10:53 -07:00
greg 72b26755a7 Make ParseResult just a normal Result type
No reason for it to be different
2017-09-19 22:10:53 -07:00
greg 1416c9d444 Parse null as simple_expression 2017-09-19 22:10:53 -07:00
greg 5c79563e53 Reduce binop args before reducing full expr 2017-09-19 22:10:53 -07:00
greg c27c900e7f Add operators % and **
semantics for these are hard with floats; handle later
2017-09-19 22:10:53 -07:00
greg 50c5dbe96d Evaluate simple binops
Woo this is kind of a neat milestone! I can now compute with this
language!
2017-09-19 22:10:53 -07:00
greg be8c8b3343 Add paren test 2017-09-19 22:10:53 -07:00
greg b856023072 Fixed paren parsing
also made error reporting a bit nicer
2017-09-19 22:10:53 -07:00
greg 868373f409 use match syntax in simple_parse 2017-09-19 22:10:53 -07:00
greg 582a7fd6dc Make parse and tokens optional 2017-09-19 22:10:53 -07:00
greg a947ec3cb2 Add simple parsing test 2017-09-19 22:10:53 -07:00
greg 79619025ea Add directive to print precedence chart 2017-09-19 22:10:53 -07:00
greg 56b338a6a8 Move to global precedence table 2017-09-19 22:10:53 -07:00
greg 7d6f946e22 Fill out a few more precedences 2017-09-19 22:10:53 -07:00
greg 0da7f7e3a1 Fully fixed binop parsing 2017-09-19 22:10:53 -07:00
greg 19a344fa77 Fixed precedence-less binop parsing 2017-09-19 22:10:53 -07:00
greg 626a7f3861 Working on binop parsing 2017-09-19 22:10:53 -07:00
greg 8e3a571d67 .env directive to display environment 2017-09-19 22:10:53 -07:00
greg f88f115567 Environment persistent across repl loop 2017-09-19 22:10:53 -07:00
greg 08f1092b69 Variable lookup works
Note this introduces a panic - if the AST node inserted into the
environment is not reduced, it throws an error when trying to look it up
2017-09-19 22:10:53 -07:00
greg 6ddea790c0 Beginning of variable lookup
everything is null
2017-09-19 22:10:53 -07:00
greg 6897eb1283 Implemented variable binding 2017-09-19 22:10:53 -07:00
greg 34fdf2be00 Add machinery for evaluation environments 2017-09-19 22:10:53 -07:00
greg 4ef93fafb5 Cause tokenize error for unclosed strings 2017-09-19 22:10:53 -07:00
greg d2108f0f97 First pass at evaluation
Very incomplete
2017-09-19 22:10:52 -07:00
greg 2989ac338c Implemented binop parsing
Uses Operator-precedence parsing algorithm, which I don't fully
understand.
2017-09-19 22:10:52 -07:00
greg 8b6d54aec2 Fix let clause parsing
In "let a = x", x should be an expression, not just a simple expression
2017-09-19 22:10:52 -07:00
greg cdb47bb3b9 Add paren parsing 2017-09-19 22:10:52 -07:00
greg 9d6dc5a5f2 Tokenize periods separately 2017-09-19 22:10:52 -07:00
greg 71d2428e57 Update Grammar 2017-09-19 22:10:52 -07:00
greg 8f9bfbc5bd Rename rhs to simple_expression 2017-09-19 22:10:52 -07:00
greg c9fdd5e83c Simplified statements-parsing
Still a little wonky wrt extraneous Separators, need to adjust grammar
to fix I think
2017-09-19 22:10:52 -07:00
greg 30eddf7737 while statements 2017-09-19 22:10:52 -07:00
greg 42719dc2f2 Change 'input' to 'tokens'
just to be consistent
2017-09-19 22:10:52 -07:00
greg 8fcc850d77 Added else clause to if parsing 2017-09-19 22:10:52 -07:00
greg f421918945 basic if expression 2017-09-19 22:10:52 -07:00
greg 0e4469fa58 Filled out keyword tokenizing 2017-09-19 22:10:52 -07:00
greg edf100b583 Starting to do if statement parsing 2017-09-19 22:10:52 -07:00
greg 169e662049 Collapse Separator tokens
only ever gonna be one in a row
2017-09-19 22:10:52 -07:00
greg 46999beabf Added skeleton of expression() parser 2017-09-19 22:10:52 -07:00
greg 1342a76786 Added support for interpreter directives 2017-09-19 22:10:52 -07:00
greg 05238bced3 rhs production 2017-09-19 22:10:52 -07:00
greg a97cce184c Empty program is valid too 2017-09-19 22:10:52 -07:00
greg 1ae61287c1 Editing Grammar
to indicate what portion of it is now parsed
2017-09-19 22:10:52 -07:00
greg 329c521964 Parsing statement blocks works 2017-09-19 22:10:52 -07:00
greg bfa16fd6fb Added Keyword lexical class 2017-09-19 22:10:52 -07:00
greg 25f5188d8c Move definition around 2017-09-19 22:10:52 -07:00
greg 5213dd327f Change type to peekable 2017-09-19 22:10:52 -07:00
greg cea29094cd type alias for Tokens 2017-09-19 22:10:52 -07:00
greg 67eafba97a Put expect into early return macro 2017-09-19 22:10:52 -07:00
greg 1059a88ee6 Separate parsing into module 2017-09-19 22:10:52 -07:00
greg 429ace73bd Move tokenizing into separate module 2017-09-19 22:10:52 -07:00
greg 044e7a6a26 Rename ASTNode -> AST
saves typing
2017-09-19 22:10:52 -07:00
greg dbdae42c1b Add string to AST 2017-09-19 22:10:52 -07:00
greg fc3dcf792d Start writing recursive descent parser
I think I get the idea now
2017-09-19 22:10:52 -07:00
greg 02b34ca105 Wrote expect()
Hopefully correctly?
2017-09-19 22:10:52 -07:00
greg 1e9cd551a6 Add grammar of language to repo
Preliminary, work in progress
2017-09-19 22:10:52 -07:00
greg 9f4330889a Starting parsing work 2017-09-19 22:10:52 -07:00
greg 3058af4f05 Break on ctrl-D 2017-09-19 22:10:52 -07:00
greg 4f17d5a0dc Add number tokenizing 2017-09-19 22:10:52 -07:00
greg 8e3774ffca Comma as separate token 2017-09-19 22:10:52 -07:00
greg c6059ada7d Separators and parens
Separator = ; or \n, they are equivalent
2017-09-19 22:10:52 -07:00
greg b5ee45f639 null is also a keyword 2017-09-19 22:10:51 -07:00
greg 04f53b6beb Added some grammar sketch ideas to the readme 2017-09-19 22:10:51 -07:00
greg 2aaa600d53 More tokenizer stuff 2017-09-19 22:10:51 -07:00
greg c6a92728ee Scaffolding for evaluation function 2017-09-19 22:10:51 -07:00
greg b2e23bed86 Print tokens and parse 2017-09-19 22:10:51 -07:00
greg 3fdacf018e Basic repl 2017-09-19 22:10:51 -07:00
75 changed files with 14569 additions and 10 deletions

.gitignore (vendored): 3 additions

@@ -1 +1,4 @@
 target
+.schala_repl
+.schala_history
+rusty-tags.vi

Cargo.lock (generated, new file): 1124 additions

File diff suppressed because it is too large

Cargo.toml

@@ -1,12 +1,17 @@
[package]
name = "null_only_language"
name = "schala"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
simplerepl = { path = "../simplerepl" }
llvm-sys = "*"
schala-repl = { path = "schala-repl" }
schala-lang = { path = "schala-lang/language" }
# maaru-lang = { path = "maaru" }
# rukka-lang = { path = "rukka" }
# robo-lang = { path = "robo" }
[dependencies.iron_llvm]
git = "https://github.com/jauhien/iron-llvm.git"
[build-dependencies]
includedir_codegen = "0.2.0"
[workspace]

HindleyMilner.hs (new file): 920 additions

@@ -0,0 +1,920 @@
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedLists #-}
{-# LANGUAGE OverloadedStrings #-}
-- | This module is an extensively documented walkthrough for typechecking a
-- basic functional language using the Hindley-Damas-Milner algorithm.
--
-- In the end, we'll be able to infer the type of expressions like
--
-- @
-- find (λx. (>) x 0)
-- :: [Integer] -> Either () Integer
-- @
--
-- It can be used in multiple different forms:
--
-- * The source is written in literate programming style, so you can almost
-- read it from top to bottom, minus some few references to later topics.
-- * /Loads/ of doctests (runnable and verified code examples) are included
-- * The code is runnable in GHCi, all definitions are exposed.
-- * A small main module that gives many examples of what you might try out in
-- GHCi is also included.
-- * The Haddock output yields a nice overview over the definitions given, with
-- a nice rendering of a truckload of Haddock comments.
module HindleyMilner where
import Control.Monad.Trans
import Control.Monad.Trans.Except
import Control.Monad.Trans.State
import Data.Map (Map)
import qualified Data.Map as M
import Data.Monoid
import Data.Set (Set)
import qualified Data.Set as S
import Data.String
import Data.Text (Text)
import qualified Data.Text as T
-- $setup
--
-- For running doctests:
--
-- >>> :set -XOverloadedStrings
-- >>> :set -XOverloadedLists
-- >>> :set -XLambdaCase
-- >>> import qualified Data.Text.IO as T
-- >>> let putPprLn = T.putStrLn . ppr
-- #############################################################################
-- #############################################################################
-- * Preliminaries
-- #############################################################################
-- #############################################################################
-- #############################################################################
-- ** Prettyprinting
-- #############################################################################
-- | A prettyprinter class. Similar to 'Show', but with a focus on having
-- human-readable output as opposed to being valid Haskell.
class Pretty a where
ppr :: a -> Text
-- #############################################################################
-- ** Names
-- #############################################################################
-- | A 'name' is an identifier in the language we're going to typecheck.
-- Variables on both the term and type level have 'Name's, for example.
newtype Name = Name Text
deriving (Eq, Ord, Show)
-- | >>> "lorem" :: Name
-- Name "lorem"
instance IsString Name where
fromString = Name . T.pack
-- | >>> putPprLn (Name "var")
-- var
instance Pretty Name where
ppr (Name n) = n
-- #############################################################################
-- ** Monotypes
-- #############################################################################
-- | A monotype is an unquantified/unparametric type, in other words it contains
-- no @forall@s. Monotypes are the inner building blocks of all types. Examples
-- of monotypes are @Int@, @a@, @a -> b@.
--
-- In formal notation, 'MType's are often called τ (tau) types.
data MType = TVar Name -- ^ @a@
| TFun MType MType -- ^ @a -> b@
| TConst Name -- ^ @Int@, @()@, …
-- Since we can't declare our own types in our simple type system
-- here, we'll hard-code certain basic ones so we can typecheck some
-- familiar functions that use them later.
| TList MType -- ^ @[a]@
| TEither MType MType -- ^ @Either a b@
| TTuple MType MType -- ^ @(a,b)@
deriving Show
-- | >>> putPprLn (TFun (TEither (TVar "a") (TVar "b")) (TFun (TVar "c") (TVar "d")))
-- Either a b → c → d
--
-- Using the 'IsString' instance:
--
-- >>> putPprLn (TFun (TEither "a" "b") (TFun "c" "d"))
-- Either a b → c → d
instance Pretty MType where
ppr = go False
where
go _ (TVar name) = ppr name
go _ (TList a) = "[" <> ppr a <> "]"
go _ (TEither l r) = "Either " <> ppr l <> " " <> ppr r
go _ (TTuple a b) = "(" <> ppr a <> ", " <> ppr b <> ")"
go _ (TConst name) = ppr name
go parenthesize (TFun a b)
| parenthesize = "(" <> lhs <> " → " <> rhs <> ")"
| otherwise = lhs <> " → " <> rhs
where lhs = go True a
rhs = go False b
-- | >>> "var" :: MType
-- TVar (Name "var")
instance IsString MType where
fromString = TVar . fromString
-- | The free variables of an 'MType'. This is simply the collection of all the
-- individual type variables occurring inside of it.
--
-- __Example:__ The free variables of @a -> b@ are @a@ and @b@.
freeMType :: MType -> Set Name
freeMType = \case
TVar a -> [a]
TFun a b -> freeMType a <> freeMType b
TList a -> freeMType a
TEither l r -> freeMType l <> freeMType r
TTuple a b -> freeMType a <> freeMType b
TConst _ -> []
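The recursion in freeMType can be exercised on its own. The following is a minimal, self-contained sketch: the constructors V, F and C are stand-ins of ours for TVar, TFun and TConst, not names from this module.

```haskell
import Data.Set (Set)
import qualified Data.Set as S

-- Stand-ins for TVar / TFun / TConst, just to watch the recursion work.
data Ty = V String | F Ty Ty | C String

-- Collect every type variable occurring anywhere in the type.
free :: Ty -> Set String
free (V a)   = S.singleton a
free (F a b) = free a `S.union` free b
free (C _)   = S.empty

main :: IO ()
main = print (S.toList (free (F (V "a") (F (C "Int") (V "b")))))
```

As in the freeMType example above, the constant contributes nothing and each variable appears once in the resulting set.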
-- | Substitute all the contained type variables mentioned in the substitution,
-- and leave everything else alone.
instance Substitutable MType where
applySubst s = \case
TVar a -> let Subst s' = s
in M.findWithDefault (TVar a) a s'
TFun f x -> TFun (applySubst s f) (applySubst s x)
TList a -> TList (applySubst s a)
TEither l r -> TEither (applySubst s l) (applySubst s r)
TTuple a b -> TTuple (applySubst s a) (applySubst s b)
c@TConst {} -> c
-- #############################################################################
-- ** Polytypes
-- #############################################################################
-- | A polytype is a monotype universally quantified over a number of type
-- variables. In Haskell, all definitions have polytypes, but since the @forall@
-- is implicit they look a bit like monotypes, maybe confusingly so. For
-- example, the type of @1 :: Int@ is actually @forall <nothing>. Int@, and the
-- type of @id@ is @forall a. a -> a@, although GHC displays it as @a -> a@.
--
-- A polytype claims to work "for all imaginable type parameters", very similar
-- to how a lambda claims to work "for all imaginable value parameters". We can
-- insert a value into a lambda's parameter to evaluate it to a new value, and
-- similarly we'll later insert types into a polytype's quantified variables to
-- gain new types.
--
-- __Example:__ in a definition @id :: forall a. a -> a@, the @a@ after the
-- ∀ ("forall") is the collection of type variables, and @a -> a@ is the 'MType'
-- quantified over. When we have such an @id@, we also have its specialized
-- version @Int -> Int@ available. This process will be the topic of the type
-- inference/unification algorithms.
--
-- In formal notation, 'PType's are often called σ (sigma) types.
--
-- The purpose of having monotypes and polytypes is that we'd like to only have
-- universal quantification at the top level, restricting our language to rank-1
-- polymorphism, where type inference is total (all types can be inferred) and
-- simple (only a handful of typing rules). Weakening this constraint would be
-- easy: if we allowed universal quantification within function types we would
-- get rank-N polymorphism. Taking it even further to allow it anywhere,
-- effectively replacing all occurrences of 'MType' with 'PType', yields
-- impredicative types. Both these extensions make the type system
-- *significantly* more complex though.
data PType = Forall (Set Name) MType -- ^ ∀{α}. τ
-- | >>> putPprLn (Forall ["a"] (TFun "a" "a"))
-- ∀a. a → a
instance Pretty PType where
ppr (Forall qs mType) = "∀" <> pprUniversals <> ". " <> ppr mType
where
pprUniversals
| S.null qs = ""
| otherwise = (T.intercalate " " . map ppr . S.toList) qs
-- | The free variables of a 'PType' are the free variables of the contained
-- 'MType', except those universally quantified.
--
-- >>> let sigma = Forall ["a"] (TFun "a" (TFun (TTuple "b" "a") "c"))
-- >>> putPprLn sigma
-- ∀a. a → (b, a) → c
-- >>> let display = T.putStrLn . T.intercalate ", " . foldMap (\x -> [ppr x])
-- >>> display (freePType sigma)
-- b, c
freePType :: PType -> Set Name
freePType (Forall qs mType) = freeMType mType `S.difference` qs
-- | Substitute all the free type variables.
instance Substitutable PType where
applySubst (Subst subst) (Forall qs mType) =
let qs' = M.fromSet (const ()) qs
subst' = Subst (subst `M.difference` qs')
in Forall qs (applySubst subst' mType)
-- #############################################################################
-- ** The environment
-- #############################################################################
-- | The environment consists of all the values available in scope, and their
-- associated polytypes. Other common names for it include "(typing) context",
-- and because of the commonly used symbol for it sometimes directly
-- \"Gamma"/@"Γ"@.
--
-- There are two kinds of membership in an environment,
--
-- - @∈@: an environment @Γ@ can be viewed as a set of @(value, type)@ pairs,
-- and we can test whether something is /literally contained/ by it via
-- x:σ ∈ Γ
-- - @⊢@, pronounced /entails/, describes all the things that are well-typed,
-- given an environment @Γ@. @Γ ⊢ x:τ@ can thus be seen as a judgement that
-- @x:τ@ is /figuratively contained/ in @Γ@.
--
-- For example, the environment @{x:Int}@ literally contains @x@, but given
-- this, it also entails @λy. x@, @λy z. x@, @let id = λy. y in id x@ and so on.
--
-- In Haskell terms, the environment consists of all the things you currently
-- have available, or that can be built by combining them. If you import the
-- Prelude, your environment entails
--
-- @
-- id → ∀a. a→a
-- map → ∀a b. (a→b) → [a] → [b]
-- putStrLn → ∀∅. String → IO ()
-- …
-- id map → ∀a b. (a→b) → [a] → [b]
-- map putStrLn → ∀∅. [String] -> [IO ()]
-- …
-- @
newtype Env = Env (Map Name PType)
-- | >>> :{
-- putPprLn (Env
-- [ ("id", Forall ["a"] (TFun "a" "a"))
-- , ("const", Forall ["a", "b"] (TFun "a" (TFun "b" "a"))) ])
-- :}
-- Γ = { const : ∀a b. a → b → a
-- , id : ∀a. a → a }
instance Pretty Env where
ppr (Env env) = "Γ = { " <> T.intercalate "\n , " pprBindings <> " }"
where
bindings = M.assocs env
pprBinding (name, pType) = ppr name <> " : " <> ppr pType
pprBindings = map pprBinding bindings
-- | The free variables of an 'Env'ironment are all the free variables of the
-- 'PType's it contains.
freeEnv :: Env -> Set Name
freeEnv (Env env) = let allPTypes = M.elems env
in S.unions (map freePType allPTypes)
-- | Performing a 'Subst'itution in an 'Env'ironment means performing that
-- substitution on all the contained 'PType's.
instance Substitutable Env where
applySubst s (Env env) = Env (M.map (applySubst s) env)
-- #############################################################################
-- ** Substitutions
-- #############################################################################
-- | A substitution is a mapping from type variables to 'MType's. Applying a
-- substitution means applying those replacements. For example, the substitution
-- @a -> Int@ applied to @a -> a@ yields the result @Int -> Int@.
--
-- A key concept behind Hindley-Milner is that once we dive deeper into an
-- expression, we learn more about our type variables. We might learn that @a@
-- has to be specialized to @b -> b@, and then later on that @b@ is actually
-- @Int@. Substitutions are an organized way of carrying this information along.
newtype Subst = Subst (Map Name MType)
-- | We're going to apply substitutions to a variety of other values that
-- somehow contain type variables, so we overload this application operation in
-- a class here.
--
-- Laws:
--
-- @
-- 'applySubst' 'mempty' ≡ 'id'
-- 'applySubst' (s1 '<>' s2) ≡ 'applySubst' s1 . 'applySubst' s2
-- @
class Substitutable a where
applySubst :: Subst -> a -> a
instance (Substitutable a, Substitutable b) => Substitutable (a,b) where
applySubst s (x,y) = (applySubst s x, applySubst s y)
-- | @'applySubst' s1 s2@ applies one substitution to another, replacing all the
-- bindings in the second argument @s2@ with their values mentioned in the first
-- one (@s1@).
instance Substitutable Subst where
applySubst s (Subst target) = Subst (fmap (applySubst s) target)
-- | >>> :{
-- putPprLn (Subst
-- [ ("a", TFun "b" "b")
-- , ("b", TEither "c" "d") ])
-- :}
-- { a > b → b
-- , b > Either c d }
instance Pretty Subst where
ppr (Subst s) = "{ " <> T.intercalate "\n, " [ ppr k <> " > " <> ppr v | (k,v) <- M.toList s ] <> " }"
-- | Combine two substitutions by applying all substitutions mentioned in the
-- first argument to the type variables contained in the second.
instance Monoid Subst where
-- Considering that all we can really do with a substitution is apply it, we
-- can use one of 'Substitutable's laws to show that substitutions
-- combine associatively,
--
-- @
-- applySubst (compose s1 (compose s2 s3))
-- = applySubst s1 . applySubst (compose s2 s3)
-- = applySubst s1 . applySubst s2 . applySubst s3
-- = applySubst (compose s1 s2) . applySubst s3
-- = applySubst (compose (compose s1 s2) s3)
-- @
mappend subst1 subst2 = Subst (s1 `M.union` s2)
where
Subst s1 = subst1
Subst s2 = applySubst subst1 subst2
mempty = Subst M.empty
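The composition law can be checked with a throwaway model in which a substitution maps variable names to plain names; this is a sketch of ours, not the Subst type above, but the union follows the same shape as mappend (apply s1 to s2's range, then take a left-biased union with s1).

```haskell
import qualified Data.Map as M

-- A toy substitution: variable name -> replacement name.
type S = M.Map String String

apply :: S -> String -> String
apply s v = M.findWithDefault v v s

-- Mirror of mappend above: s1 `union` (s1 applied to s2's range).
compose :: S -> S -> S
compose s1 s2 = s1 `M.union` M.map (apply s1) s2

main :: IO ()
main = do
  let s1 = M.fromList [("b", "Int")]
      s2 = M.fromList [("a", "b")]
  -- applySubst (s1 <> s2) should agree with applySubst s1 . applySubst s2:
  putStrLn (apply (compose s1 s2) "a")
  putStrLn (apply s1 (apply s2 "a"))
```

Both lines print the same result, illustrating the homomorphism law for this pair of substitutions.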
-- #############################################################################
-- #############################################################################
-- * Typechecking
-- #############################################################################
-- #############################################################################
-- $ Typechecking does two things:
--
-- 1. If two types are not immediately identical, attempt to 'unify' them
-- to get a type compatible with both of them
-- 2. 'infer' the most general type of a value by comparing the values in its
-- definition with the 'Env'ironment
-- #############################################################################
-- ** Inference context
-- #############################################################################
-- | The inference type holds a supply of unique names, and can fail with a
-- descriptive error if something goes wrong.
--
-- /Invariant:/ the supply must be infinite, or we might run out of names to
-- give to things.
newtype Infer a = Infer (ExceptT InferError (State [Name]) a)
deriving (Functor, Applicative, Monad)
-- | Errors that can happen during the type inference process.
data InferError =
-- | An attempt was made to unify two types that don't match.
--
-- For example, @a -> a@ and @Int@ do not unify.
--
-- >>> putPprLn (CannotUnify (TFun "a" "a") (TConst "Int"))
-- Cannot unify a → a with Int
CannotUnify MType MType
-- | A 'TVar' is bound to an 'MType' that already contains it.
--
-- The canonical example of this is @λx. x x@, where the first @x@
-- in the body has to have type @a -> b@, and the second one @a@. Since
-- they're both the same @x@, this requires unification of @a@ with
-- @a -> b@, which only works if @a = a -> b = (a -> b) -> b = …@, yielding
-- an infinite type.
--
-- >>> putPprLn (OccursCheckFailed "a" (TFun "a" "a"))
-- Occurs check failed: a already appears in a → a
| OccursCheckFailed Name MType
-- | The value of an unknown identifier was read.
--
-- >>> putPprLn (UnknownIdentifier "a")
-- Unknown identifier: a
| UnknownIdentifier Name
deriving Show
-- | >>> putPprLn (CannotUnify (TEither "a" "b") (TTuple "a" "b"))
-- Cannot unify Either a b with (a, b)
instance Pretty InferError where
ppr = \case
CannotUnify t1 t2 ->
"Cannot unify " <> ppr t1 <> " with " <> ppr t2
OccursCheckFailed name ty ->
"Occurs check failed: " <> ppr name <> " already appears in " <> ppr ty
UnknownIdentifier name ->
"Unknown identifier: " <> ppr name
-- | Evaluate a value in an 'Infer'ence context.
--
-- >>> let expr = EAbs "f" (EAbs "g" (EAbs "x" (EApp (EApp "f" "x") (EApp "g" "x"))))
-- >>> putPprLn expr
-- λf g x. f x (g x)
-- >>> let inferred = runInfer (infer (Env []) expr)
-- >>> let demonstrate = \case Right (_, ty) -> T.putStrLn (":: " <> ppr ty)
-- >>> demonstrate inferred
-- :: (c → e → f) → (c → e) → c → f
runInfer :: Infer a -- ^ Inference data
-> Either InferError a
runInfer (Infer inf) =
evalState (runExceptT inf) (map Name (infiniteSupply alphabet))
where
alphabet = map T.singleton ['a'..'z']
-- [a, b, c] ==> [a,b,c, a1,b1,c1, a2,b2,c2, …]
infiniteSupply supply = supply <> addSuffixes supply (1 :: Integer)
where
addSuffixes xs n = map (\x -> addSuffix x n) xs <> addSuffixes xs (n+1)
addSuffix x n = x <> T.pack (show n)
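The name-supply construction inside runInfer can be run standalone; here is the same idea over plain Strings instead of Text (a sketch, with the Text plumbing dropped).

```haskell
-- The same supply scheme as in runInfer: the alphabet, then numbered
-- copies of it (a1, b1, ..., a2, b2, ...), lazily and endlessly.
alphabet :: [String]
alphabet = map (: []) ['a' .. 'z']

infiniteSupply :: [String] -> [String]
infiniteSupply supply = supply ++ addSuffixes supply (1 :: Integer)
  where
    addSuffixes xs n = map (++ show n) xs ++ addSuffixes xs (n + 1)

main :: IO ()
main = print (take 28 (infiniteSupply alphabet))
```

Taking 28 names shows the seam: the 26 letters followed by the first two suffixed names.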
-- | Throw an 'InferError' in an 'Infer'ence context.
--
-- >>> case runInfer (throw (UnknownIdentifier "var")) of Left err -> putPprLn err
-- Unknown identifier: var
throw :: InferError -> Infer a
throw = Infer . throwE
-- #############################################################################
-- ** Unification
-- #############################################################################
-- $ Unification describes the process of making two different types compatible
-- by specializing them where needed. A desirable property to have here is being
-- able to find the most general unifier. Luckily, we'll be able to do that in
-- our type system.
-- | The unification of two 'MType's is the most general substitution that can be
-- applied to both of them in order to yield the same result.
--
-- >>> let m1 = TFun "a" "b"
-- >>> putPprLn m1
-- a → b
-- >>> let m2 = TFun "c" (TEither "d" "e")
-- >>> putPprLn m2
-- c → Either d e
-- >>> let inferSubst = unify (m1, m2)
-- >>> case runInfer inferSubst of Right subst -> putPprLn subst
-- { a > c
-- , b > Either d e }
unify :: (MType, MType) -> Infer Subst
unify = \case
(TFun a b, TFun x y) -> unifyBinary (a,b) (x,y)
(TVar v, x) -> v `bindVariableTo` x
(x, TVar v) -> v `bindVariableTo` x
(TConst a, TConst b) | a == b -> pure mempty
(TList a, TList b) -> unify (a,b)
(TEither a b, TEither x y) -> unifyBinary (a,b) (x,y)
(TTuple a b, TTuple x y) -> unifyBinary (a,b) (x,y)
(a, b) -> throw (CannotUnify a b)
where
-- Unification of binary type constructors, such as functions and Either.
-- Unification is first done for the first operand; then, with the
-- resulting substitution applied, for the second one.
unifyBinary :: (MType, MType) -> (MType, MType) -> Infer Subst
unifyBinary (a,b) (x,y) = do
s1 <- unify (a, x)
s2 <- unify (applySubst s1 (b, y))
pure (s1 <> s2)
-- | Build a 'Subst'itution that binds a 'Name' of a 'TVar' to an 'MType'. The
-- resulting substitution should be idempotent, i.e. applying it more than once
-- to something should not be any different from applying it only once.
--
-- - In the simplest case, this just means building a substitution that just
-- does that.
-- - Substituting a 'Name' with a 'TVar' with the same name unifies a type
-- variable with itself, and the resulting substitution does nothing new.
-- - If the 'Name' we're trying to bind to an 'MType' already occurs in that
-- 'MType', the resulting substitution would not be idempotent: the 'MType'
-- would be replaced again, yielding a different result. This is known as the
-- Occurs Check.
bindVariableTo :: Name -> MType -> Infer Subst
bindVariableTo name (TVar v) | boundToSelf = pure mempty
where
boundToSelf = name == v
bindVariableTo name mType | name `occursIn` mType = throw (OccursCheckFailed name mType)
where
n `occursIn` ty = n `S.member` freeMType ty
bindVariableTo name mType = pure (Subst (M.singleton name mType))
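The three cases of bindVariableTo, including the Occurs Check, can be seen in a self-contained miniature. The names here (Ty, V, F, bindVar) are stand-ins of ours; Left models a thrown OccursCheckFailed, Right the (possibly empty) binding produced.

```haskell
import qualified Data.Set as S

-- Stand-in type with variables and functions only.
data Ty = V String | F Ty Ty deriving Show

free :: Ty -> S.Set String
free (V a)   = S.singleton a
free (F a b) = free a `S.union` free b

-- bindVariableTo in miniature: Left is an occurs-check failure,
-- Right is the list of bindings to add (empty for self-binding).
bindVar :: String -> Ty -> Either String [(String, Ty)]
bindVar n (V v) | n == v          = Right []
bindVar n t | n `S.member` free t = Left ("occurs check failed: " ++ n)
bindVar n t                       = Right [(n, t)]

main :: IO ()
main = do
  print (bindVar "a" (F (V "a") (V "a")))  -- the λx. x x situation
  print (bindVar "a" (F (V "b") (V "b")))  -- a normal binding
</main>
```

The first call fails the Occurs Check (binding @a@ to a type containing @a@); the second succeeds and produces one binding.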
-- #############################################################################
-- ** Type inference
-- #############################################################################
-- $ Type inference is the act of finding out a value's type by looking at the
-- environment it is in, in order to make it compatible with it.
--
-- In literature, the Hindley-Damas-Milner inference algorithm ("Algorithm W")
-- is often presented in the style of logical formulas, and below you'll find
-- that version along with code that actually does what they say.
--
-- These formulas look a bit like fractions, where the "numerator" is a
-- collection of premises, and the denominator is the consequence if all of them
-- hold.
--
-- __Example:__
--
-- @
-- Γ ⊢ even : Int → Bool Γ ⊢ 1 : Int
--
-- Γ ⊢ even 1 : Bool
-- @
--
-- means that if we have a value of type @Int -> Bool@ called "even" and a value
-- of type @Int@ called @1@, then we also have a value of type @Bool@ via
-- @even 1@ available to us.
--
-- The actual inference rules are polymorphic versions of this example, and
-- the code comments will explain each step in detail.
-- -----------------------------------------------------------------------------
-- *** The language: typed lambda calculus
-- -----------------------------------------------------------------------------
-- | The syntax tree of the language we'd like to typecheck. You can view it as
-- a close relative to simply typed lambda calculus, having only the most
-- necessary syntax elements.
--
-- Since 'ELet' is non-recursive, the usual fixed-point function
-- @fix : (a → a) → a@ can be introduced to allow recursive definitions.
data Exp = ELit Lit -- ^ True, 1
| EVar Name -- ^ @x@
| EApp Exp Exp -- ^ @f x@
| EAbs Name Exp -- ^ @λx. e@
| ELet Name Exp Exp -- ^ @let x = e in e'@ (non-recursive)
deriving Show
-- | Literals we'd like to support. Since we can't define new data types in our
-- simple type system, we'll have to hard-code the possible ones here.
data Lit = LBool Bool
| LInteger Integer
deriving Show
-- | >>> putPprLn (EAbs "f" (EAbs "g" (EAbs "x" (EApp (EApp "f" "x") (EApp "g" "x")))))
-- λf g x. f x (g x)
instance Pretty Exp where
ppr (ELit lit) = ppr lit
ppr (EVar name) = ppr name
ppr (EApp f x) = pprApp1 f <> " " <> pprApp2 x
where
pprApp1 = \case
eLet@ELet{} -> "(" <> ppr eLet <> ")"
eLet@EAbs{} -> "(" <> ppr eLet <> ")"
e -> ppr e
pprApp2 = \case
eApp@EApp{} -> "(" <> ppr eApp <> ")"
e -> pprApp1 e
ppr x@EAbs{} = pprAbs True x
where
pprAbs True (EAbs name expr) = "λ" <> ppr name <> pprAbs False expr
pprAbs False (EAbs name expr) = " " <> ppr name <> pprAbs False expr
pprAbs _ expr = ". " <> ppr expr
ppr (ELet name value body) =
"let " <> ppr name <> " = " <> ppr value <> " in " <> ppr body
-- | >>> putPprLn (LBool True)
-- True
--
-- >>> putPprLn (LInteger 127)
-- 127
instance Pretty Lit where
ppr = \case
LBool b -> showT b
LInteger i -> showT i
where
showT :: Show a => a -> Text
showT = T.pack . show
-- | >>> "var" :: Exp
-- EVar (Name "var")
instance IsString Exp where
fromString = EVar . fromString
-- -----------------------------------------------------------------------------
-- *** Some useful definitions
-- -----------------------------------------------------------------------------
-- | Generate a fresh 'Name' in a type 'Infer'ence context. An example use case
-- of this is η expansion, which transforms @f@ into @λx. f x@, where "x" is a
-- new name, i.e. unbound in the current context.
fresh :: Infer MType
fresh = drawFromSupply >>= \case
Right name -> pure (TVar name)
Left err -> throw err
where
drawFromSupply :: Infer (Either InferError Name)
drawFromSupply = Infer (do
s:upply <- lift get
lift (put upply)
pure (Right s) )
-- | Add a new binding to the environment.
--
-- The Haskell equivalent would be defining a new value, for example in module
-- scope or in a @let@ block. This corresponds to the "comma" operation used in
-- formal notation,
--
-- @
-- Γ, x:σ ≡ extendEnv Γ (x,σ)
-- @
extendEnv :: Env -> (Name, PType) -> Env
extendEnv (Env env) (name, pType) = Env (M.insert name pType env)
-- -----------------------------------------------------------------------------
-- *** Inferring the types of all language constructs
-- -----------------------------------------------------------------------------
-- | Infer the type of an 'Exp'ression in an 'Env'ironment, resulting in the
-- 'Exp's 'MType' along with a substitution that has to be done in order to reach
-- this goal.
--
-- This is widely known as /Algorithm W/.
infer :: Env -> Exp -> Infer (Subst, MType)
infer env = \case
ELit lit -> inferLit lit
EVar name -> inferVar env name
EApp f x -> inferApp env f x
EAbs x e -> inferAbs env x e
ELet x e e' -> inferLet env x e e'
-- | Literals such as 'True' and '1' have their types hard-coded.
inferLit :: Lit -> Infer (Subst, MType)
inferLit lit = pure (mempty, TConst litTy)
where
litTy = case lit of
LBool {} -> "Bool"
LInteger {} -> "Integer"
-- | Inferring the type of a variable is done via
--
-- @
-- x:σ ∈ Γ τ = instantiate(σ)
-- [Var]
-- Γ ⊢ x:τ
-- @
--
-- This means that if @Γ@ /literally contains/ (@∈@) a value, then it also
-- /entails it/ (@⊢@) in all its instantiations.
inferVar :: Env -> Name -> Infer (Subst, MType)
inferVar env name = do
sigma <- lookupEnv env name -- x:σ ∈ Γ
tau <- instantiate sigma -- τ = instantiate(σ)
-- ------------------
pure (mempty, tau) -- Γ ⊢ x:τ
-- | Look up the 'PType' of a 'Name' in the 'Env'ironment.
--
-- This checks whether @x:σ@ is /literally contained/ in @Γ@. For more details
-- about this, see the documentation of 'Env'.
--
-- To give a Haskell analogue: looking up @id@ when the @Prelude@ is loaded, the
-- resulting 'PType' would be @id@'s type, namely @forall a. a -> a@.
lookupEnv :: Env -> Name -> Infer PType
lookupEnv (Env env) name = case M.lookup name env of
Just x -> pure x
Nothing -> throw (UnknownIdentifier name)
-- | Bind all quantified variables of a 'PType' to 'fresh' type variables.
--
-- __Example:__ instantiating @forall a. a -> b -> a@ results in the 'MType'
-- @c -> b -> c@, where @c@ is a fresh name (to avoid shadowing issues).
--
-- You can picture the 'PType' to be the prototype converted to an instantiated
-- 'MType', which can now be used in the unification process.
--
-- Another way of looking at it is by simply forgetting which variables were
-- quantified, carefully avoiding name clashes when doing so.
--
-- 'instantiate' can also be seen as the opposite of 'generalize', which we'll
-- need later to convert an 'MType' to a 'PType'.
instantiate :: PType -> Infer MType
instantiate (Forall qs t) = do
subst <- substituteAllWithFresh qs
pure (applySubst subst t)
where
-- For each given name, add a substitution from that name to a fresh type
-- variable to the result.
substituteAllWithFresh :: Set Name -> Infer Subst
substituteAllWithFresh xs = do
let freshSubstActions = M.fromSet (const fresh) xs
freshSubsts <- sequenceA freshSubstActions
pure (Subst freshSubsts)
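The core of instantiate, drawing a fresh name per quantified variable from a supply, can be sketched without the Infer machinery, using only State from transformers (the names fresh/freshen here are ours):

```haskell
import Control.Monad.Trans.State
import qualified Data.Map as M
import qualified Data.Set as S

-- Draw one name from the supply. The tutorial's invariant is that the
-- supply is infinite, so the [] case is only a defensive fallback here.
fresh :: State [String] String
fresh = do
  supply <- get
  case supply of
    (s : rest) -> put rest >> pure s
    []         -> pure "<out of names>"

-- instantiate in miniature: map each quantified name to a fresh one.
freshen :: S.Set String -> State [String] (M.Map String String)
freshen qs = traverse (const fresh) (M.fromSet id qs)

main :: IO ()
main = print (evalState (freshen (S.fromList ["a", "b"])) ["c", "d", "e"])
```

Each quantified variable gets its own fresh name, mirroring how substituteAllWithFresh sequences one fresh action per element of the set.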
-- | Function application captures the fact that if we have a function and an
-- argument we can give to that function, we also have the result value of the
-- result type available to us.
--
-- @
-- Γ ⊢ f : fτ Γ ⊢ x : xτ fxτ = fresh unify(fτ, xτ → fxτ)
-- [App]
-- Γ ⊢ f x : fxτ
-- @
--
-- This rule says that given a function and a value with a type, the function
-- type has to unify with a function type that allows the value type to be its
-- argument.
inferApp
:: Env
-> Exp -- ^ __f__ x
-> Exp -- ^ f __x__
-> Infer (Subst, MType)
inferApp env f x = do
    (s1, fTau) <- infer env f                         -- f : fτ
    (s2, xTau) <- infer (applySubst s1 env) x         -- x : xτ
    fxTau <- fresh                                    -- fxτ = fresh
    s3 <- unify (applySubst s2 fTau, TFun xTau fxTau) -- unify (fτ, xτ → fxτ)
    let s = s3 <> s2 <> s1                            -- --------------------
    pure (s, applySubst s3 fxTau)                     -- f x : fxτ
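The `unify` call above is where the function type is forced to match @xτ → fxτ@. As a rough, self-contained Rust sketch of first-order unification over a similar toy type language (bindings collected in a mutable map; the occurs check is omitted for brevity, so this is illustrative only, not production-ready):

```rust
use std::collections::HashMap;

#[derive(Clone, Debug, PartialEq)]
enum MType {
    Var(String),
    Const(String),
    Fun(Box<MType>, Box<MType>),
}

type Subst = HashMap<String, MType>;

// Walk a type through the substitution until it is no longer a bound variable.
fn resolve(t: &MType, s: &Subst) -> MType {
    match t {
        MType::Var(v) => match s.get(v) {
            Some(bound) => resolve(bound, s),
            None => t.clone(),
        },
        _ => t.clone(),
    }
}

// Try to make the two types equal, extending `s` with variable bindings.
fn unify(a: &MType, b: &MType, s: &mut Subst) -> Result<(), String> {
    let (a, b) = (resolve(a, s), resolve(b, s));
    match (a, b) {
        // A variable unifies with anything by binding it (skipping the
        // trivial x ~ x case to avoid a self-referential binding).
        (MType::Var(v), t) | (t, MType::Var(v)) => {
            if t != MType::Var(v.clone()) {
                s.insert(v, t);
            }
            Ok(())
        }
        (MType::Const(x), MType::Const(y)) if x == y => Ok(()),
        // Functions unify componentwise: argument with argument, result with result.
        (MType::Fun(a1, r1), MType::Fun(a2, r2)) => {
            unify(&a1, &a2, s)?;
            unify(&r1, &r2, s)
        }
        (x, y) => Err(format!("cannot unify {:?} with {:?}", x, y)),
    }
}
```

Unifying `a → a` with `Int → b` first binds `a := Int`, then propagates that binding so `b := Int` as well, which mirrors how the App rule's constraint flows into the result type.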
-- | Lambda abstraction is based on the fact that when we introduce a new
-- variable, the resulting lambda maps from that variable's type to the type of
-- the body.
--
-- @
-- τ = fresh σ = ∀∅. τ Γ, x:σ ⊢ e:τ'
-- [Abs]
-- Γ ⊢ λx.e : τ→τ'
-- @
--
-- Here, @Γ, x:σ@ is @Γ@ extended by one additional mapping, namely @x:σ@.
--
-- Abstraction is typed by extending the environment with a fresh type for the
-- bound variable; if under this assumption the body can be given a type, then
-- the lambda maps a value of the fresh type to a value of the body's type.
inferAbs
:: Env
-> Name -- ^ λ__x__. e
-> Exp -- ^ λx. __e__
-> Infer (Subst, MType)
inferAbs env x e = do
    tau <- fresh                               -- τ = fresh
    let sigma = Forall [] tau                  -- σ = ∀∅. τ
        env' = extendEnv env (x, sigma)        -- Γ, x:σ
    (s, tau') <- infer env' e                  -- … ⊢ e:τ'
                                               -- ---------------
    pure (s, TFun (applySubst s tau) tau')     -- λx.e : τ→τ'
-- | A let binding allows extending the environment with new bindings in a
-- principled manner. To do this, we first have to typecheck the expression to
-- be introduced. The result of this is then generalized to a 'PType', since let
-- bindings introduce new polymorphic values, which are then added to the
-- environment. Now we can finally typecheck the body of the "in" part of the
-- let binding.
--
-- Note that in our simple language, let is non-recursive, but recursion can be
-- introduced as usual by adding a primitive @fix : (a → a) → a@ if desired.
--
-- @
-- Γ ⊢ e:τ σ = gen(Γ,τ) Γ, x:σ ⊢ e':τ'
-- [Let]
-- Γ ⊢ let x = e in e' : τ'
-- @
inferLet
:: Env
-> Name -- ^ let __x__ = e in e'
-> Exp -- ^ let x = __e__ in e'
-> Exp -- ^ let x = e in __e'__
-> Infer (Subst, MType)
inferLet env x e e' = do
    (s1, tau) <- infer env e                  -- Γ ⊢ e:τ
    let env' = applySubst s1 env
    let sigma = generalize env' tau           -- σ = gen(Γ,τ)
    let env'' = extendEnv env' (x, sigma)     -- Γ, x:σ
    (s2, tau') <- infer env'' e'              -- Γ ⊢ …
                                              -- --------------------------
    pure (s2 <> s1, tau')                     -- … let x = e in e' : τ'
-- | Generalize an 'MType' to a 'PType' by universally quantifying over all the
-- type variables contained in it, except those already free in the environment.
--
-- >>> let tau = TFun "a" (TFun "b" "a")
-- >>> putPprLn tau
-- a → b → a
-- >>> putPprLn (generalize (Env [("x", Forall [] "b")]) tau)
-- ∀a. a → b → a
--
-- In more formal notation,
--
-- @
-- gen(Γ,τ) = ∀{α}. τ
--   where {α} = free(τ) ∖ free(Γ)
-- @
--
-- 'generalize' can also be seen as the opposite of 'instantiate', which
-- converts a 'PType' to an 'MType'.
generalize :: Env -> MType -> PType
generalize env mType = Forall qs mType
  where
    qs = freeMType mType `S.difference` freeEnv env
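A rough Rust rendering of the same definition, with the free-variable sets made explicit (the `MType` shape here is a simplification invented for this sketch):

```rust
use std::collections::HashSet;

#[derive(Clone, Debug, PartialEq)]
enum MType {
    Var(String),
    Fun(Box<MType>, Box<MType>),
}

// Free type variables of a monotype (every variable in an MType is free).
fn free_mtype(t: &MType) -> HashSet<String> {
    match t {
        MType::Var(v) => std::iter::once(v.clone()).collect(),
        MType::Fun(a, b) => free_mtype(a).union(&free_mtype(b)).cloned().collect(),
    }
}

// gen(Γ, τ) = ∀{α}. τ  where {α} = free(τ) ∖ free(Γ).
// The environment is represented only by its set of free variables here.
fn generalize(env_free: &HashSet<String>, t: &MType) -> (Vec<String>, MType) {
    let mut qs: Vec<String> = free_mtype(t).difference(env_free).cloned().collect();
    qs.sort(); // deterministic order for display/testing
    (qs, t.clone())
}
```

Matching the doctest above: for τ = `a → b → a` with `b` free in the environment, only `a` gets quantified.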

Main.hs Normal file
@ -0,0 +1,185 @@
{-# LANGUAGE OverloadedLists #-}
{-# LANGUAGE OverloadedStrings #-}
module Main where
import qualified Data.Map as M
import Data.Monoid
import Data.Text (Text)
import qualified Data.Text.IO as T
import HindleyMilner
-- #############################################################################
-- #############################################################################
-- * Testing
-- #############################################################################
-- #############################################################################
-- #############################################################################
-- ** A small custom Prelude
-- #############################################################################
prelude :: Env
prelude = Env (M.fromList
[ ("(*)", Forall [] (tInteger ~> tInteger ~> tInteger))
, ("(+)", Forall [] (tInteger ~> tInteger ~> tInteger))
, ("(,)", Forall ["a","b"] ("a" ~> "b" ~> TTuple "a" "b"))
, ("(-)", Forall [] (tInteger ~> tInteger ~> tInteger))
, ("(.)", Forall ["a", "b", "c"] (("b" ~> "c") ~> ("a" ~> "b") ~> "a" ~> "c"))
, ("(<)", Forall [] (tInteger ~> tInteger ~> tBool))
, ("(<=)", Forall [] (tInteger ~> tInteger ~> tBool))
, ("(>)", Forall [] (tInteger ~> tInteger ~> tBool))
, ("(>=)", Forall [] (tInteger ~> tInteger ~> tBool))
, ("const", Forall ["a","b"] ("a" ~> "b" ~> "a"))
, ("Cont/>>=", Forall ["a"] ((("a" ~> "r") ~> "r") ~> ("a" ~> (("b" ~> "r") ~> "r")) ~> (("b" ~> "r") ~> "r")))
, ("find", Forall ["a","b"] (("a" ~> tBool) ~> TList "a" ~> tMaybe "a"))
, ("fix", Forall ["a"] (("a" ~> "a") ~> "a"))
, ("foldr", Forall ["a","b"] (("a" ~> "b" ~> "b") ~> "b" ~> TList "a" ~> "b"))
, ("id", Forall ["a"] ("a" ~> "a"))
, ("ifThenElse", Forall ["a"] (tBool ~> "a" ~> "a" ~> "a"))
, ("Left", Forall ["a","b"] ("a" ~> TEither "a" "b"))
, ("length", Forall ["a"] (TList "a" ~> tInteger))
, ("map", Forall ["a","b"] (("a" ~> "b") ~> TList "a" ~> TList "b"))
, ("reverse", Forall ["a"] (TList "a" ~> TList "a"))
, ("Right", Forall ["a","b"] ("b" ~> TEither "a" "b"))
, ("[]", Forall ["a"] (TList "a"))
, ("(:)", Forall ["a"] ("a" ~> TList "a" ~> TList "a"))
])
where
tBool = TConst "Bool"
tInteger = TConst "Integer"
tMaybe = TEither (TConst "()")
-- | Synonym for 'TFun' to make writing type signatures easier.
--
-- Instead of
--
-- @
-- Forall ["a","b"] (TFun "a" (TFun "b" "a"))
-- @
--
-- we can write
--
-- @
-- Forall ["a","b"] ("a" ~> "b" ~> "a")
-- @
(~>) :: MType -> MType -> MType
(~>) = TFun
infixr 9 ~>
-- #############################################################################
-- ** Run it!
-- #############################################################################
-- | Run type inference on a couple of values.
main :: IO ()
main = do
    let inferAndPrint = T.putStrLn . (" " <>) . showType prelude
    T.putStrLn "Well-typed:"
    do
        inferAndPrint (lambda ["x"] "x")
        inferAndPrint (lambda ["f","g","x"] (apply "f" ["x", apply "g" ["x"]]))
        inferAndPrint (lambda ["f","g","x"] (apply "f" [apply "g" ["x"]]))
        inferAndPrint (lambda ["m", "k", "c"] (apply "m" [lambda ["x"] (apply "k" ["x", "c"])])) -- >>= for Cont
        inferAndPrint (lambda ["f"] (apply "(.)" ["reverse", apply "map" ["f"]]))
        inferAndPrint (apply "find" [lambda ["x"] (apply "(>)" ["x", int 0])])
        inferAndPrint (apply "map" [apply "map" ["map"]])
        inferAndPrint (apply "(*)" [int 1, int 2])
        inferAndPrint (apply "foldr" ["(+)", int 0])
        inferAndPrint (apply "map" ["length"])
        inferAndPrint (apply "map" ["map"])
        inferAndPrint (lambda ["x"] (apply "ifThenElse" [apply "(<)" ["x", int 0], int 0, "x"]))
        inferAndPrint (lambda ["x"] (apply "fix" [lambda ["xs"] (apply "(:)" ["x", "xs"])]))
    T.putStrLn "Ill-typed:"
    do
        inferAndPrint (apply "(*)" [int 1, bool True])
        inferAndPrint (apply "foldr" [int 1])
        inferAndPrint (lambda ["x"] (apply "x" ["x"]))
        inferAndPrint (lambda ["x"] (ELet "xs" (apply "(:)" ["x", "xs"]) "xs"))
-- | Build multiple lambda bindings.
--
-- Instead of
--
-- @
-- EAbs "f" (EAbs "x" (EApp "f" "x"))
-- @
--
-- we can write
--
-- @
-- lambda ["f", "x"] (EApp "f" "x")
-- @
--
-- for
--
-- @
-- λf x. f x
-- @
lambda :: [Name] -> Exp -> Exp
lambda names expr = foldr EAbs expr names
-- | Apply a function to multiple arguments.
--
-- Instead of
--
-- @
-- EApp (EApp (EApp "f" "x") "y") "z")
-- @
--
-- we can write
--
-- @
-- apply "f" ["x", "y", "z"]
-- @
--
-- for
--
-- @
-- f x y z
-- @
apply :: Exp -> [Exp] -> Exp
apply = foldl EApp
-- | Construct an integer literal.
int :: Integer -> Exp
int = ELit . LInteger
-- | Construct a boolean literal.
bool :: Bool -> Exp
bool = ELit . LBool
-- | Convenience function to run the type inference algorithm.
showType :: Env  -- ^ Starting environment, e.g. 'prelude'.
         -> Exp  -- ^ Expression to typecheck
         -> Text -- ^ Text representation of the result. Contains an error
                 --   message on failure.
showType env expr =
    case (runInfer . fmap (generalize (Env mempty) . uncurry applySubst) . infer env) expr of
        Left err -> "Error inferring type of " <> ppr expr <> ": " <> ppr err
        Right ty -> ppr expr <> " :: " <> ppr ty

README
@ -1,4 +0,0 @@
No-runtime-value-error-language
A language wth a largely-python-like where there are no value errors. Can call null like a function

README.md Normal file
@ -0,0 +1,95 @@
# Schala - a programming language meta-interpreter
Schala is a Rust framework written to make it easy to create and experiment
with multiple toy programming languages. It provides a cross-language REPL and
provisions for tokenizing text, parsing tokens, evaluating an abstract syntax
tree, and other tasks common to all programming languages, as well as for
sharing state between multiple languages.
Schala is implemented as a Rust library `schala-repl`, which provides a
function `start_repl`, meant to be used as entry point into a common REPL or
non-interactive environment. Clients are expected to invoke `start_repl` with a
vector of programming languages. Individual programming language
implementations are Rust types that implement the
`ProgrammingLanguageInterface` trait and store whatever persistent state is
relevant to that language.
Run schala with: `cargo run`. This will drop you into a REPL environment. Type
`:help` for more information, or type in text in any supported programming
language (currently only schala-lang) to evaluate it in the REPL.
## History
Schala started out life as an experiment in writing a Javascript-like
programming language that would never encounter any kind of runtime value
error, but rather always return `null` under any kind of error condition. I had
seen one too many Javascript `Uncaught TypeError: Cannot read property ___ of
undefined` messages, and I was a bit frustrated. Plus I had always wanted to
write a programming language from scratch, and Rust is a fun language to
program in. Over time I became interested in playing around with other sorts
of programming languages as well, and wanted to make the process as general as
possible.
The name of the project comes from Schala the Princess of Zeal from the 1995
SNES RPG *Chrono Trigger*. I like classic JRPGs and enjoyed the thought of
creating a language name confusingly close to Scala. The naming scheme for
languages implemented with the Schala meta-interpreter is Chrono Trigger
characters.
Schala and languages implemented with it are incomplete alpha software and are
not ready for public release.
## Languages implemented using the meta-interpreter
* The eponymous *Schala* language is a work-in-progress general-purpose
programming language with static typing and algebraic data types. Its design
goals include having a very straightforward implementation and being
syntactically minimal.
* *Maaru* is a very simple dynamically-typed scripting language, with the semantics
that all runtime errors return a `null` value rather than fail.
* *Robo* is an experiment in creating a lazy, functional, strongly-typed language
much like Haskell.
* *Rukka* is a straightforward LISP implementation
## Reference works
Here's a partial list of resources I've made use of in the process
of learning how to write a programming language.
### General
http://thume.ca/2019/04/18/writing-a-compiler-in-rust/
### Type-checking
https://skillsmatter.com/skillscasts/10868-inside-the-rust-compiler
https://www.youtube.com/watch?v=il3gD7XMdmA
http://dev.stephendiehl.com/fun/006_hindley_milner.html
https://rust-lang-nursery.github.io/rustc-guide/type-inference.html
https://eli.thegreenplace.net/2018/unification/
https://eli.thegreenplace.net/2018/type-inference/
http://smallcultfollowing.com/babysteps/blog/2017/03/25/unification-in-chalk-part-1/
http://reasonableapproximation.net/2019/05/05/hindley-milner.html
https://rickyhan.com/jekyll/update/2018/05/26/hindley-milner-tutorial-rust.html
### Evaluation
*Understanding Computation*, Tom Stuart, O'Reilly 2013
*Basics of Compiler Design*, Torben Mogensen
### Parsing
http://journal.stuffwithstuff.com/2011/03/19/pratt-parsers-expression-parsing-made-easy/
https://soc.github.io/languages/unified-condition-syntax
[Crafting Interpreters](http://www.craftinginterpreters.com/)
### LLVM
http://blog.ulysse.io/2016/07/03/llvm-getting-started.html
### Rust resources
https://thefullsnack.com/en/rust-for-the-web.html
https://rocket.rs/guide/getting-started/

TODO.md Normal file
@ -0,0 +1,178 @@
# Plan of attack
1. modify visitor so it can handle scopes
   - this is needed both to handle import scope correctly
   - and also to support making FQSNs aware of function parameters
2. Once FQSNs are aware of function parameters, most of the `Rc<String>` things in eval.rs can go away
# TODO items
- use `let` sigil in patterns for variables:
  ```
  q is MyStruct(let a, Chrono::Trigga) then {
  }
  ```
- idea: what if there was something like React JSX syntax built in? i.e. a way to automatically transform some kind of markup
  into a function call, cf. `<h1 prop="arg">` -> `h1(prop=arg)`
## General code cleanup
- I think I can restructure the parser to get rid of most instances of expect!, at least at the beginning of a rule
- DONE: experiment with storing metadata via ItemIds on AST nodes (cf. https://rust-lang.github.io/rustc-guide/hir.html, https://github.com/rust-lang/rust/blob/master/src/librustc/hir/mod.rs )
- implement and test open/use statements
- implement field access
- standardize on an error type that isn't String
- implement a visitor pattern for the use of scope_resolver
  - maybe implement this twice: 1) the value-returning, no-default one in the haoyi blogpost,
    2) the non-value-returning, default one like in rustc (cf. https://github.com/rust-unofficial/patterns/blob/master/patterns/visitor.md)
  - look at https://gitlab.haskell.org/ghc/ghc/wikis/pattern-synonyms
- parser error - should report subset of AST parsed *so far*
  - what if you used python 'def' syntax to define a function? what error message makes sense here?
## Reduction
- make a good type for actual language builtins to avoid string comparisons
## Typechecking
- make a type to represent types rather than relying on string comparisons
- look at https://rickyhan.com/jekyll/update/2018/05/26/hindley-milner-tutorial-rust.html
- cf. the notation mentioned in the cardelli paper, the debug information for the `typechecking` pass should
print the generated type variable for every subexpression in an expression
- think about Idris-related ideas of multiple implementations of a type for an interface (+ vs * impl for monoids, or preorder/inorder/postorder for Foldable)
- should have an Idris-like `cast To From` function
## Schala-lang syntax
- idea: the `type` declaration should have some kind of GADT-like syntax
- Idea: if you have a pattern-match where one variant has a variable and the other lacks it,
  instead of treating this as a type error, promote the bound variable to an option type
- Include extensible scala-style html"string ${var}" string interpolations
- A neat idea for pattern matching optimization would be if you could match on one of several things in a list,
  ex:
  ```
  if x {
    is (comp, LHSPat, RHSPat) if comp in ["==", "<"] -> ...
  }
  ```
- Schala should have both currying *and* default arguments!
  ```
  fn a(b: Int, c: Int, d: Int = 1) -> Int
  a(1,2)      : Int
  a(1,2,d=2)  : Int
  a(_,1,3)    : Int -> Int
  a(1,2, c=_) : Int -> Int
  a(_,_,_)    : Int -> Int -> Int -> Int
  ```
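For comparison, the currying half of this wishlist item can already be approximated in Rust (the implementation language) with nested closures, and default arguments with `Option`; this is just an illustrative sketch of the desired semantics, not Schala syntax:

```rust
// Currying: a three-argument function as nested closures; each call fixes
// one argument and returns a function awaiting the rest.
fn a(b: i32) -> Box<dyn Fn(i32) -> Box<dyn Fn(i32) -> i32>> {
    Box::new(move |c| Box::new(move |d| b + c + d))
}

// Default arguments approximated with Option: `None` means "use the default of 1",
// mirroring `d: Int = 1` in the sketch above.
fn a_default(b: i32, c: i32, d: Option<i32>) -> i32 {
    b + c + d.unwrap_or(1)
}
```

Here `a(1)(2)` plays the role of the partially applied `a(1,2) : Int -> Int`, and `a_default(1, 2, None)` the role of `a(1,2)` with the default filled in.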
- scoped types - be able to define a quick enum type scoped to a function or other type,
  something that is only meant to be used as a quick bespoke interface between
  two other things
  ex.
  ```
  type enum {
    type enum MySubVariant {
      SubVariant1, SubVariant2, etc.
    }
    Variant1(MySubVariant),
    Variant2(...),
  }
  ```
- inclusive/exclusive range syntax like .. vs ..=
## Compilation
- look into Inkwell for Rust LLVM bindings
- https://cranelift.readthedocs.io/en/latest/?badge=latest
## Other links of note
- https://nshipster.com/never/
- consult http://gluon-lang.org/book/embedding-api.html
## Trying if-syntax again
```
//simple if expr
if x == 10 then "a" else "z"

//complex if expr
if x == 10 then {
    let a = 1
    let b = 2
    a + b
} else {
    55
}

// different comparison ops
if x {
    == 1 then "a"
    .isPrime() then "b"
    else "c"
}

/* for now disallow `if x == { 1 then ... }`, b/c hard to parse */

//simple pattern-matching
if x is Person("Ivan", age) then age else 0

//match-block equivalent
if x {
    is Person("Ivan", _) then "Ivan"
    is Person(_, age) if age > 13 then "barmitzvah'd"
    else "foo"
}
```
## (OLD) Playing around with conditional syntax ideas
- if/match playground
simple if
`if x == 1.0 { "a" } else { "b" }`
one comparison multiple targets:
`if x == { 1.0 -> "a", 2.0 -> "b", else -> "c" }`
different comparison operators/ method calls:
`if x { == 1.0 -> "a", eq NaN -> "n", .hella() -> "h", else -> "z" }`
pattern matching/introducing bindings:
`if alice { .age < 18 -> "18", is Person("Alice", age) -> "${age}", else -> "none" }`
pattern matching w/ if-let:
`if person is Person("Alice", age) { "${age}" } else { "nope" }`
- https://soc.github.io/languages/unified-condition-syntax syntax:
  `if <cond-expr> then <then-expr> else <else-expr>`
  `if <half-expr> \n <rest-expr1> then <result1-expr> \n <rest-expr2> then <result-expr2> else <result3-expr>`
- and rest-exprs (or "targets") can have 'is' for pattern-matching, actually so can a full cond-expr
UNIFIED IF EXPRESSIONS FINAL WORK:
basic syntax:
`if_expr := if discriminator '{' (guard_expr)* '}'`
`guard_expr := pattern 'then' block_or_expr`
`pattern := rhs | is_pattern`
`is_pattern := 'is' ???`
`rhs := expression | ???`
if the only two guard patterns are true and false, then the abbreviated syntax:
`'if' discriminator 'then' block_or_expr 'else' block_or_expr`
can replace `'if' discriminator '{' 'true' 'then' block_or_expr; 'false' 'then' block_or_expr '}'`

maaru/Cargo.toml Normal file
@ -0,0 +1,11 @@
[package]
name = "maaru-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
itertools = "0.5.8"
take_mut = "0.1.3"
llvm-sys = "*"
schala-repl = { path = "../schala-repl" }

maaru/src/eval.rs Normal file
@ -0,0 +1,481 @@
extern crate take_mut;
use std::collections::HashMap;
use std::collections::VecDeque;
use parser::{AST, Statement, Expression, Function, Callable, BinOp};
use std::rc::Rc;
use std::io::{Write, Stdout, BufWriter};
use std::convert::From;
use parser::Expression::*;
use parser::Statement::*;
type Reduction<T> = (T, Option<SideEffect>);
#[derive(Debug, Clone)]
enum ReducedValue {
    StringLiteral(Rc<String>),
    ListLiteral(VecDeque<Expression>),
    StructLiteral(VecDeque<(Rc<String>, Expression)>),
    Number(f64),
    Lambda(Function),
}

impl From<ReducedValue> for Expression {
    fn from(rv: ReducedValue) -> Expression {
        match rv {
            ReducedValue::Number(n) => Expression::Number(n),
            ReducedValue::StringLiteral(n) => Expression::StringLiteral(n),
            ReducedValue::Lambda(f) => Expression::Lambda(f),
            ReducedValue::ListLiteral(items) => Expression::ListLiteral(items),
            ReducedValue::StructLiteral(items) => Expression::StructLiteral(items),
        }
    }
}

impl From<Expression> for ReducedValue {
    fn from(rv: Expression) -> ReducedValue {
        match rv {
            Expression::Number(n) => ReducedValue::Number(n),
            Expression::StringLiteral(n) => ReducedValue::StringLiteral(n),
            Expression::Lambda(f) => ReducedValue::Lambda(f),
            Expression::ListLiteral(items) => ReducedValue::ListLiteral(items),
            Expression::StructLiteral(items) => ReducedValue::StructLiteral(items),
            _ => panic!("trying to store a non-fully-reduced variable"),
        }
    }
}
fn get_indexer(f: f64) -> Option<usize> {
    if f.fract() == 0.0 {
        if f.trunc() >= 0.0 {
            return Some(f.trunc() as usize);
        }
    }
    None
}
#[derive(Debug)]
enum SideEffect {
    Print(String),
    AddBinding(Rc<String>, ReducedValue),
}

pub struct Evaluator<'a> {
    parent: Option<&'a Evaluator<'a>>,
    variables: HashMap<String, ReducedValue>,
    stdout: BufWriter<Stdout>,
    pub trace_evaluation: bool,
}
impl<'a> Evaluator<'a> {
    pub fn new(parent: Option<&'a Evaluator>) -> Evaluator<'a> {
        Evaluator {
            variables: HashMap::new(),
            parent: parent,
            stdout: BufWriter::new(::std::io::stdout()),
            trace_evaluation: parent.map_or(false, |e| e.trace_evaluation),
        }
    }

    pub fn run(&mut self, ast: AST) -> Vec<String> {
        ast.into_iter()
            .map(|astnode| format!("{}", self.reduction_loop(astnode)))
            .collect()
    }

    fn add_binding(&mut self, var: String, value: ReducedValue) {
        self.variables.insert(var, value);
    }

    fn lookup_binding(&self, var: &str) -> Option<ReducedValue> {
        match self.variables.get(var) {
            Some(expr) => Some(expr.clone()),
            None => match self.parent {
                Some(env) => env.lookup_binding(var),
                None => None,
            },
        }
    }
}
trait Evaluable {
    fn is_reducible(&self) -> bool;
}

impl Evaluable for Statement {
    fn is_reducible(&self) -> bool {
        match self {
            &ExprNode(ref expr) => expr.is_reducible(),
            &FuncDefNode(_) => true,
        }
    }
}

impl Evaluable for Expression {
    fn is_reducible(&self) -> bool {
        match *self {
            Null => false,
            StringLiteral(_) => false,
            Lambda(_) => false,
            Number(_) => false,
            ListLiteral(ref items) => items.iter().any(|x| x.is_reducible()),
            StructLiteral(ref items) => items.iter().any(|pair| pair.1.is_reducible()),
            _ => true,
        }
    }
}
impl Expression {
    fn is_truthy(&self) -> bool {
        match *self {
            Null => false,
            StringLiteral(ref s) if **s == "" => false,
            Number(n) if n == 0.0 => false,
            _ => true,
        }
    }
}

fn is_assignment(op: &BinOp) -> bool {
    use self::BinOp::*;
    match *op {
        Assign | AddAssign | SubAssign |
        MulAssign | DivAssign => true,
        _ => false,
    }
}
impl<'a> Evaluator<'a> {
fn reduction_loop(&mut self, mut node: Statement) -> Statement {
loop {
node = self.step(node);
if !node.is_reducible() {
break;
}
}
node
}
fn step(&mut self, node: Statement) -> Statement {
let mut trace = String::new();
if self.trace_evaluation {
trace.push_str(&format!("Step: {:?}", node));
}
let (new_node, side_effect) = self.reduce_astnode(node);
if self.trace_evaluation {
trace.push_str(&format!("{:?}", new_node));
}
if let Some(s) = side_effect {
if self.trace_evaluation {
trace.push_str(&format!(" | side-effect: {:?}", s));
}
self.perform_side_effect(s);
}
if self.trace_evaluation {
println!("{}", trace);
}
new_node
}
fn perform_side_effect(&mut self, side_effect: SideEffect) {
use self::SideEffect::*;
match side_effect {
Print(s) => {
write!(self.stdout, "{}\n", s).unwrap();
match self.stdout.flush() {
Ok(_) => (),
Err(_) => println!("Could not flush stdout"),
};
}
AddBinding(var, value) => {
self.add_binding((*var).clone(), value);
},
}
}
fn reduce_astnode(&mut self, node: Statement) -> Reduction<Statement> {
match node {
ExprNode(expr) => {
if expr.is_reducible() {
let (new_expr, side_effect) = self.reduce_expr(expr);
(ExprNode(new_expr), side_effect)
} else {
(ExprNode(expr), None)
}
}
FuncDefNode(func) => {
let name = func.prototype.name.clone();
let reduced_value = ReducedValue::Lambda(func.clone());
let binding = Some(SideEffect::AddBinding(name, reduced_value));
(ExprNode(Expression::Lambda(func)), binding)
}
}
}
//TODO I probably want another Expression variant that holds a ReducedValue
fn reduce_expr(&mut self, expression: Expression) -> Reduction<Expression> {
match expression {
Null => (Null, None),
e @ StringLiteral(_) => (e, None),
e @ Number(_) => (e, None),
e @ Lambda(_) => (e, None),
Variable(ref var) => {
match self.lookup_binding(var).map(|x| x.into()) {
None => (Null, None),
Some(expr) => (expr, None),
}
}
BinExp(op, mut left, mut right) => {
if right.is_reducible() {
let mut side_effect = None;
take_mut::take(right.as_mut(), |expr| { let (a, b) = self.reduce_expr(expr); side_effect = b; a});
return (BinExp(op, left, right), side_effect);
}
if let BinOp::Assign = op {
return match *left {
Variable(var) => {
let reduced_value: ReducedValue = ReducedValue::from(*right);
let binding = SideEffect::AddBinding(var, reduced_value);
(Null, Some(binding))
},
_ => (Null, None)
};
}
if is_assignment(&op) {
use self::BinOp::*;
let new_op = match op {
AddAssign => Add,
SubAssign => Sub,
MulAssign => Mul,
DivAssign => Div,
_ => unreachable!(),
};
let reduction =
BinExp(BinOp::Assign,
Box::new(*left.clone()),
Box::new(BinExp(new_op, left, right))
);
return (reduction, None);
}
if left.is_reducible() {
let mut side_effect = None;
take_mut::take(left.as_mut(), |expr| { let (a, b) = self.reduce_expr(expr); side_effect = b; a});
(BinExp(op, left, right), side_effect)
} else {
(self.reduce_binop(op, *left, *right), None) //can assume both arguments are maximally reduced
}
}
Call(callable, mut args) => {
let mut f = true;
for arg in args.iter_mut() {
if arg.is_reducible() {
take_mut::take(arg, |arg| self.reduce_expr(arg).0);
f = false;
break;
}
}
if f {
self.reduce_call(callable, args)
} else {
(Call(callable, args), None)
}
}
While(test, body) => {
let mut block = VecDeque::from(body.clone());
block.push_back(While(test.clone(), body.clone()));
let reduction = Conditional(test, Box::new(Block(block)), None);
(reduction, None)
}
Conditional(box test, then_block, else_block) => {
if test.is_reducible() {
let (new_test, new_effect) = self.reduce_expr(test);
(Conditional(Box::new(new_test), then_block, else_block), new_effect)
} else {
if test.is_truthy() {
(*then_block, None)
} else {
match else_block {
Some(box expr) => (expr, None),
None => (Null, None),
}
}
}
}
Block(mut exprs) => {
let first = exprs.pop_front();
match first {
None => (Null, None),
Some(expr) => {
if exprs.len() == 0 {
(expr, None)
} else {
if expr.is_reducible() {
let (new, side_effect) = self.reduce_expr(expr);
exprs.push_front(new);
(Block(exprs), side_effect)
} else {
(Block(exprs), None)
}
}
}
}
}
Index(mut expr, mut index_expr) => {
if index_expr.is_reducible() {
let mut side_effect = None;
take_mut::take(index_expr.as_mut(), |expr| { let (a, b) = self.reduce_expr(expr); side_effect = b; a});
return (Index(expr, index_expr), side_effect)
}
if expr.is_reducible() {
let mut side_effect = None;
take_mut::take(expr.as_mut(), |expr| { let (a, b) = self.reduce_expr(expr); side_effect = b; a});
return (Index(expr, index_expr), side_effect);
}
match (*expr, *index_expr) {
(ListLiteral(list_items), Number(n)) => {
let indexed_expr = get_indexer(n).and_then(|i| list_items.get(i));
if let Some(e) = indexed_expr {
(e.clone(), None)
} else {
(Null, None)
}
}
(StructLiteral(items), StringLiteral(s)) => {
for item in items {
if s == item.0 {
return (item.1.clone(), None); //TODO this is hella inefficient
}
}
(Null, None)
},
_ => (Null, None)
}
}
ListLiteral(mut exprs) => {
let mut side_effect = None;
for expr in exprs.iter_mut() {
if expr.is_reducible() {
take_mut::take(expr, |expr| {
let (a, b) = self.reduce_expr(expr);
side_effect = b;
a
});
break;
}
}
(ListLiteral(exprs), side_effect)
},
StructLiteral(mut items) => {
let mut side_effect = None;
for pair in items.iter_mut() {
if pair.1.is_reducible() {
take_mut::take(pair, |pair| {
let (name, expr) = pair;
let (a, b) = self.reduce_expr(expr);
side_effect = b;
(name, a)
});
break;
}
}
(StructLiteral(items), side_effect)
}
}
}
fn reduce_binop(&mut self, op: BinOp, left: Expression, right: Expression) -> Expression {
use self::BinOp::*;
let truthy = Number(1.0);
let falsy = Null;
match (op, left, right) {
(Add, Number(l), Number(r)) => Number(l + r),
(Add, StringLiteral(s1), StringLiteral(s2)) => StringLiteral(Rc::new(format!("{}{}", *s1, *s2))),
(Add, StringLiteral(s1), Number(r)) => StringLiteral(Rc::new(format!("{}{}", *s1, r))),
(Add, Number(l), StringLiteral(s1)) => StringLiteral(Rc::new(format!("{}{}", l, *s1))),
(Sub, Number(l), Number(r)) => Number(l - r),
(Mul, Number(l), Number(r)) => Number(l * r),
(Div, Number(l), Number(r)) if r != 0.0 => Number(l / r),
(Mod, Number(l), Number(r)) => Number(l % r),
(Less, Number(l), Number(r)) => if l < r { truthy } else { falsy },
(LessEq, Number(l), Number(r)) => if l <= r { truthy } else { falsy },
(Greater, Number(l), Number(r)) => if l > r { truthy } else { falsy },
(GreaterEq, Number(l), Number(r)) => if l >= r { truthy } else { falsy },
(Equal, Number(l), Number(r)) => if l == r { truthy } else { falsy },
(Equal, Null, Null) => truthy,
(Equal, StringLiteral(s1), StringLiteral(s2)) => if s1 == s2 { truthy } else { falsy },
(Equal, _, _) => falsy,
_ => falsy,
}
}
fn reduce_call(&mut self, callable: Callable, arguments: Vec<Expression>) -> Reduction<Expression> {
if let Some(res) = handle_builtin(&callable, &arguments) {
return res;
}
let function = match callable {
Callable::Lambda(func) => func.clone(),
Callable::NamedFunction(name) => {
match self.lookup_binding(&*name) {
Some(ReducedValue::Lambda(func)) => func,
_ => return (Null, None),
}
}
};
if function.prototype.parameters.len() != arguments.len() {
return (Null, None);
}
let mut evaluator = Evaluator::new(Some(self));
for (binding, expr) in function.prototype.parameters.iter().zip(arguments.iter()) {
evaluator.add_binding((**binding).clone(), expr.clone().into());
}
let nodes = function.body.iter().map(|node| node.clone());
let mut retval = ExprNode(Null);
for n in nodes {
retval = evaluator.reduction_loop(n);
}
match retval {
ExprNode(expr) => (expr, None),
FuncDefNode(_) => panic!("This should never happen! A maximally-reduced node\
should never be a function definition!")
}
}
}
fn handle_builtin(callable: &Callable, arguments: &Vec<Expression>) -> Option<Reduction<Expression>> {
    let name: &str = match *callable {
        Callable::NamedFunction(ref name) => *&name,
        _ => return None,
    };
    match name {
        "print" => {
            let mut s = String::new();
            for arg in arguments {
                s.push_str(&format!("{} ", arg));
            }
            return Some((Null, Some(SideEffect::Print(s))));
        },
        _ => None,
    }
}

maaru/src/lib.rs Normal file
@ -0,0 +1,78 @@
#![feature(box_patterns)]
extern crate schala_repl;
mod tokenizer;
mod parser;
mod eval;
#[derive(Debug)]
pub struct TokenError {
    pub msg: String,
}

impl TokenError {
    pub fn new(msg: &str) -> TokenError {
        TokenError { msg: msg.to_string() }
    }
}

pub use self::eval::Evaluator as MaaruEvaluator;

pub struct Maaru<'a> {
    evaluator: MaaruEvaluator<'a>
}

impl<'a> Maaru<'a> {
    pub fn new() -> Maaru<'a> {
        Maaru {
            evaluator: MaaruEvaluator::new(None),
        }
    }
}
/*
fn execute_pipeline(&mut self, input: &str, options: &EvalOptions) -> Result<String, String> {
let mut output = UnfinishedComputation::default();
let tokens = match tokenizer::tokenize(input) {
Ok(tokens) => {
if let Some(_) = options.debug_passes.get("tokens") {
output.add_artifact(TraceArtifact::new("tokens", format!("{:?}", tokens)));
}
tokens
},
Err(err) => {
return output.finish(Err(format!("Tokenization error: {:?}\n", err.msg)))
}
};
let ast = match parser::parse(&tokens, &[]) {
Ok(ast) => {
if let Some(_) = options.debug_passes.get("ast") {
output.add_artifact(TraceArtifact::new("ast", format!("{:?}", ast)));
}
ast
},
Err(err) => {
return output.finish(Err(format!("Parse error: {:?}\n", err.msg)))
}
};
let mut evaluation_output = String::new();
for s in self.evaluator.run(ast).iter() {
evaluation_output.push_str(s);
}
Ok(evaluation_output)
}
*/
/*
impl<'a> ProgrammingLanguageInterface for Maaru<'a> {
fn get_language_name(&self) -> String {
"Maaru".to_string()
}
fn get_source_file_suffix(&self) -> String {
format!("maaru")
}
}
*/

maaru/src/parser.rs Normal file
@ -0,0 +1,755 @@
use tokenizer::{Token, Kw, OpTok};
use tokenizer::Token::*;
use std::fmt;
use std::collections::VecDeque;
use std::rc::Rc;
use std::convert::From;
// Grammar
// program := (statement delimiter ?)*
// delimiter := Newline | Semicolon
// statement := declaration | expression
// declaration := FN prototype LCurlyBrace (statement)* RCurlyBrace
// prototype := identifier LParen identlist RParen
// identlist := Ident (Comma Ident)* | ε
// exprlist := Expression (Comma Expression)* | ε
// itemlist := Ident COLON Expression (Comma Ident COLON Expression)* | ε
//
// expression := postop_expression (op postop_expression)*
// postop_expression := primary_expression postop
// primary_expression := number_expr | String | identifier_expr | paren_expr | conditional_expr | while_expr | lambda_expr | list_expr | struct_expr
// number_expr := (PLUS | MINUS ) number_expr | Number
// identifier_expr := call_expression | Variable
// list_expr := LSquareBracket exprlist RSquareBracket
// struct_expr := LCurlyBrace itemlist RCurlyBrace
// call_expression := Identifier LParen exprlist RParen
// while_expr := WHILE primary_expression LCurlyBrace (expression delimiter)* RCurlyBrace
// paren_expr := LParen expression RParen
// conditional_expr := IF expression LCurlyBrace (expression delimiter)* RCurlyBrace (ELSE LCurlyBrace (expression delimiter)* RCurlyBrace)?
// lambda_expr := FN LParen identlist RParen LCurlyBrace (expression delimiter)* RCurlyBrace
// lambda_call := | LParen exprlist RParen
// postop := ε | LParen exprlist RParen | LBracket expression RBracket
// op := '+', '-', etc.
//
pub type AST = Vec<Statement>;
#[derive(Debug, Clone)]
pub enum Statement {
ExprNode(Expression),
FuncDefNode(Function),
}
impl fmt::Display for Statement {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::Statement::*;
match *self {
ExprNode(ref expr) => write!(f, "{}", expr),
FuncDefNode(_) => write!(f, "UNIMPLEMENTED"),
}
}
}
#[derive(Debug, Clone)]
pub struct Function {
pub prototype: Prototype,
pub body: Vec<Statement>,
}
#[derive(Debug, Clone, PartialEq)]
pub struct Prototype {
pub name: Rc<String>,
pub parameters: Vec<Rc<String>>,
}
#[derive(Debug, Clone)]
pub enum Expression {
Null,
StringLiteral(Rc<String>),
Number(f64),
Variable(Rc<String>),
BinExp(BinOp, Box<Expression>, Box<Expression>),
Call(Callable, Vec<Expression>),
Conditional(Box<Expression>, Box<Expression>, Option<Box<Expression>>),
Lambda(Function),
Block(VecDeque<Expression>),
While(Box<Expression>, Vec<Expression>),
Index(Box<Expression>, Box<Expression>),
ListLiteral(VecDeque<Expression>),
StructLiteral(VecDeque<(Rc<String>, Expression)>),
}
#[derive(Clone, Debug)]
pub enum Callable {
NamedFunction(Rc<String>),
Lambda(Function),
}
//TODO this ought to be ReducedExpression
impl fmt::Display for Expression {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::Expression::*;
match *self {
Null => write!(f, "null"),
StringLiteral(ref s) => write!(f, "\"{}\"", s),
Number(n) => write!(f, "{}", n),
Lambda(Function { prototype: Prototype { ref name, ref parameters, .. }, .. }) => {
write!(f, "«function: {}, {} arg(s)»", name, parameters.len())
}
ListLiteral(ref items) => {
write!(f, "[ ")?;
let mut iter = items.iter().peekable();
while let Some(item) = iter.next() {
write!(f, "{}", item)?;
if let Some(_) = iter.peek() {
write!(f, ", ")?;
}
}
write!(f, " ]")
}
StructLiteral(ref items) => {
write!(f, "{} ", "{")?;
let mut iter = items.iter().peekable();
while let Some(pair) = iter.next() {
write!(f, "{}: {}", pair.0, pair.1)?;
if let Some(_) = iter.peek() {
write!(f, ", ")?;
}
}
write!(f, "{} ", "}")
}
_ => write!(f, "UNIMPLEMENTED"),
}
}
}
#[derive(Debug, Clone)]
pub enum BinOp {
Add,
AddAssign,
Sub,
SubAssign,
Mul,
MulAssign,
Div,
DivAssign,
Mod,
Less,
LessEq,
Greater,
GreaterEq,
Equal,
Assign,
Custom(String),
}
impl From<OpTok> for BinOp {
fn from(token: OpTok) -> BinOp {
use self::BinOp::*;
match &token.0[..] {
"+" => Add,
"+=" => AddAssign,
"-" => Sub,
"-=" => SubAssign,
"*" => Mul,
"*=" => MulAssign,
"/" => Div,
"/=" => DivAssign,
"%" => Mod,
"<" => Less,
"<=" => LessEq,
">" => Greater,
">=" => GreaterEq,
"==" => Equal,
"=" => Assign,
op => Custom(op.to_string()),
}
}
}
type Precedence = u8;
// TODO make this support incomplete parses
pub type ParseResult<T> = Result<T, ParseError>;
#[derive(Debug)]
pub struct ParseError {
pub msg: String,
pub remaining_tokens: Vec<Token>,
}
impl ParseError {
fn result_from_str<T>(msg: &str) -> ParseResult<T> {
Err(ParseError {
msg: msg.to_string(),
remaining_tokens: vec![],
})
}
}
struct Parser {
tokens: Vec<Token>,
}
impl Parser {
fn initialize(tokens: &[Token]) -> Parser {
let mut tokens = tokens.to_vec();
tokens.reverse();
Parser { tokens: tokens }
}
fn peek(&self) -> Option<Token> {
self.tokens.last().cloned()
}
fn next(&mut self) -> Option<Token> {
self.tokens.pop()
}
fn get_precedence(&self, op: &OpTok) -> Precedence {
match &op.0[..] {
"+" => 10,
"-" => 10,
"*" => 20,
"/" => 20,
"%" => 20,
"==" => 40,
"=" | "+=" | "-=" | "*=" | "/=" => 1,
">" | ">=" | "<" | "<=" => 30,
_ => 255,
}
}
}
macro_rules! expect {
($self_:expr, $token:pat) => {
match $self_.peek() {
Some($token) => {$self_.next();},
Some(x) => {
let err = format!("Expected `{:?}` but got `{:?}`", stringify!($token), x);
return ParseError::result_from_str(&err)
},
None => {
let err = format!("Expected `{:?}` but got end of input", stringify!($token));
return ParseError::result_from_str(&err) //TODO make this not require 2 stringifications
}
}
}
}
macro_rules! expect_identifier {
($self_:expr) => {
match $self_.peek() {
Some(Identifier(s)) => {$self_.next(); s},
Some(x) => return ParseError::result_from_str(&format!("Expected identifier, but got {:?}", x)),
None => return ParseError::result_from_str("Expected identifier, but got end of input"),
}
}
}
macro_rules! skip_whitespace {
($_self: expr) => {
loop {
match $_self.peek() {
Some(ref t) if is_delimiter(t) => {
$_self.next();
continue;
}
_ => break,
}
}
}
}
macro_rules! delimiter_block {
($_self: expr, $try_parse: ident, $($break_pattern: pat)|+) => {
{
let mut acc = Vec::new();
loop {
match $_self.peek() {
None => break,
Some(ref t) if is_delimiter(t) => { $_self.next(); continue; },
$($break_pattern)|+ => break,
_ => {
let a = try!($_self.$try_parse());
acc.push(a);
}
}
}
acc
}
}
}
fn is_delimiter(token: &Token) -> bool {
match *token {
Newline | Semicolon => true,
_ => false,
}
}
impl Parser {
fn program(&mut self) -> ParseResult<AST> {
let mut ast = Vec::new(); //TODO have this come from previously-parsed tree
loop {
let result: ParseResult<Statement> = match self.peek() {
Some(ref t) if is_delimiter(t) => {
self.next();
continue;
}
Some(_) => self.statement(),
None => break,
};
match result {
Ok(node) => ast.push(node),
Err(mut err) => {
err.remaining_tokens = self.tokens.clone();
err.remaining_tokens.reverse();
return Err(err);
}
}
}
Ok(ast)
}
fn statement(&mut self) -> ParseResult<Statement> {
let node: Statement = match self.peek() {
Some(Keyword(Kw::Fn)) => self.declaration()?,
Some(_) => Statement::ExprNode(self.expression()?),
None => panic!("Unexpected end of tokens"),
};
Ok(node)
}
fn declaration(&mut self) -> ParseResult<Statement> {
expect!(self, Keyword(Kw::Fn));
let prototype = self.prototype()?;
expect!(self, LCurlyBrace);
let body = self.body()?;
expect!(self, RCurlyBrace);
Ok(Statement::FuncDefNode(Function {
prototype: prototype,
body: body,
}))
}
fn prototype(&mut self) -> ParseResult<Prototype> {
let name = expect_identifier!(self);
expect!(self, LParen);
let parameters = self.identlist()?;
expect!(self, RParen);
Ok(Prototype {
name: name,
parameters: parameters,
})
}
fn identlist(&mut self) -> ParseResult<Vec<Rc<String>>> {
let mut args = Vec::new();
while let Some(Identifier(name)) = self.peek() {
args.push(name.clone());
self.next();
match self.peek() {
Some(Comma) => {self.next();},
_ => break,
}
}
Ok(args)
}
fn exprlist(&mut self) -> ParseResult<Vec<Expression>> {
let mut exprs = Vec::new();
loop {
if let Some(RParen) = self.peek() {
break;
}
let exp = self.expression()?;
exprs.push(exp);
match self.peek() {
Some(Comma) => {self.next();},
_ => break,
}
}
Ok(exprs)
}
fn itemlist(&mut self) -> ParseResult<VecDeque<(Rc<String>, Expression)>> {
let mut items = VecDeque::new();
loop {
if let Some(RCurlyBrace) = self.peek() {
break;
}
let name = expect_identifier!(self);
expect!(self, Colon);
let expr = self.expression()?;
items.push_back((name, expr));
match self.peek() {
Some(Comma) => {self.next();},
_ => break,
};
}
Ok(items)
}
fn body(&mut self) -> ParseResult<Vec<Statement>> {
let statements = delimiter_block!(
self,
statement,
Some(RCurlyBrace)
);
Ok(statements)
}
fn expression(&mut self) -> ParseResult<Expression> {
let lhs: Expression = self.postop_expression()?;
self.precedence_expr(lhs, 0)
}
fn precedence_expr(&mut self,
mut lhs: Expression,
min_precedence: u8)
-> ParseResult<Expression> {
while let Some(Operator(op)) = self.peek() {
let precedence = self.get_precedence(&op);
if precedence < min_precedence {
break;
}
self.next();
let mut rhs = self.postop_expression()?;
while let Some(Operator(ref op)) = self.peek() {
if self.get_precedence(op) > precedence {
let new_prec = self.get_precedence(op);
rhs = self.precedence_expr(rhs, new_prec)?;
} else {
break;
}
}
lhs = Expression::BinExp(op.into(), Box::new(lhs), Box::new(rhs));
}
Ok(lhs)
}
fn postop_expression(&mut self) -> ParseResult<Expression> {
use self::Expression::*;
let expr = self.primary_expression()?;
let ret = match self.peek() {
Some(LParen) => {
let args = self.call_expression()?;
match expr {
Lambda(f) => Call(Callable::Lambda(f), args),
e => {
let err = format!("Expected lambda expression before a call, got {:?}", e);
return ParseError::result_from_str(&err);
},
}
},
Some(LSquareBracket) => {
expect!(self, LSquareBracket);
let index_expr = self.expression()?;
expect!(self, RSquareBracket);
Index(Box::new(expr), Box::new(index_expr))
},
_ => {
expr
}
};
Ok(ret)
}
fn primary_expression(&mut self) -> ParseResult<Expression> {
Ok(match self.peek() {
Some(Keyword(Kw::Null)) => {
self.next();
Expression::Null
}
Some(NumLiteral(_)) => self.number_expression()?,
Some(Operator(OpTok(ref a))) if **a == "+" || **a == "-" => self.number_expression()?,
Some(StrLiteral(s)) => {
self.next();
Expression::StringLiteral(s)
}
Some(Keyword(Kw::If)) => self.conditional_expr()?,
Some(Keyword(Kw::While)) => self.while_expr()?,
Some(Identifier(_)) => self.identifier_expr()?,
Some(Token::LParen) => self.paren_expr()?,
Some(Keyword(Kw::Fn)) => self.lambda_expr()?,
Some(Token::LSquareBracket) => self.list_expr()?,
Some(Token::LCurlyBrace) => self.struct_expr()?,
Some(e) => {
return ParseError::result_from_str(&format!("Expected primary expression, got {:?}", e));
}
None => return ParseError::result_from_str("Expected primary expression, got end of input"),
})
}
fn list_expr(&mut self) -> ParseResult<Expression> {
expect!(self, LSquareBracket);
let exprlist: Vec<Expression> = self.exprlist()?;
expect!(self, RSquareBracket);
Ok(Expression::ListLiteral(VecDeque::from(exprlist)))
}
fn struct_expr(&mut self) -> ParseResult<Expression> {
expect!(self, LCurlyBrace);
let struct_items = self.itemlist()?;
expect!(self, RCurlyBrace);
Ok(Expression::StructLiteral(struct_items))
}
fn number_expression(&mut self) -> ParseResult<Expression> {
let mut multiplier = 1;
loop {
match self.peek() {
Some(NumLiteral(n)) => {
self.next();
return Ok(Expression::Number(n * multiplier as f64));
}
Some(Operator(OpTok(ref a))) if **a == "+" => {
self.next();
}
Some(Operator(OpTok(ref a))) if **a == "-" => {
multiplier *= -1;
self.next();
}
Some(e) => {
return ParseError::result_from_str(
&format!("Expected +, - or number, got {:?}", e));
}
None => {
return ParseError::result_from_str(
"Expected +, - or number, got end of input");
}
}
}
}
fn lambda_expr(&mut self) -> ParseResult<Expression> {
use self::Expression::*;
expect!(self, Keyword(Kw::Fn));
skip_whitespace!(self);
expect!(self, LParen);
let parameters = self.identlist()?;
expect!(self, RParen);
skip_whitespace!(self);
expect!(self, LCurlyBrace);
let body = self.body()?;
expect!(self, RCurlyBrace);
let prototype = Prototype {
name: Rc::new("a lambda yo!".to_string()),
parameters: parameters,
};
let function = Function {
prototype: prototype,
body: body,
};
Ok(Lambda(function))
}
fn while_expr(&mut self) -> ParseResult<Expression> {
use self::Expression::*;
expect!(self, Keyword(Kw::While));
let test = self.expression()?;
expect!(self, LCurlyBrace);
let body = delimiter_block!(
self,
expression,
Some(RCurlyBrace)
);
expect!(self, RCurlyBrace);
Ok(While(Box::new(test), body))
}
fn conditional_expr(&mut self) -> ParseResult<Expression> {
use self::Expression::*;
expect!(self, Keyword(Kw::If));
let test = self.expression()?;
skip_whitespace!(self);
expect!(self, LCurlyBrace);
skip_whitespace!(self);
let then_block = delimiter_block!(
self,
expression,
Some(RCurlyBrace)
);
expect!(self, RCurlyBrace);
skip_whitespace!(self);
let else_block = if let Some(Keyword(Kw::Else)) = self.peek() {
self.next();
skip_whitespace!(self);
expect!(self, LCurlyBrace);
let else_exprs = delimiter_block!(
self,
expression,
Some(RCurlyBrace)
);
Some(else_exprs)
} else {
None
};
expect!(self, RCurlyBrace);
Ok(Conditional(Box::new(test),
Box::new(Block(VecDeque::from(then_block))),
else_block.map(|list| Box::new(Block(VecDeque::from(list))))))
}
fn identifier_expr(&mut self) -> ParseResult<Expression> {
let name = expect_identifier!(self);
let expr = match self.peek() {
Some(LParen) => {
let args = self.call_expression()?;
Expression::Call(Callable::NamedFunction(name), args)
}
_ => Expression::Variable(name),
};
Ok(expr)
}
fn call_expression(&mut self) -> ParseResult<Vec<Expression>> {
expect!(self, LParen);
let args: Vec<Expression> = self.exprlist()?;
expect!(self, RParen);
Ok(args)
}
fn paren_expr(&mut self) -> ParseResult<Expression> {
expect!(self, Token::LParen);
let expr = self.expression()?;
expect!(self, Token::RParen);
Ok(expr)
}
}
pub fn parse(tokens: &[Token], _parsed_tree: &[Statement]) -> ParseResult<AST> {
let mut parser = Parser::initialize(tokens);
parser.program()
}
/*
#[cfg(test)]
mod tests {
use schala_lang::tokenizer;
use super::*;
use super::Statement::*;
use super::Expression::*;
macro_rules! parsetest {
($input:expr, $output:pat, $ifexpr:expr) => {
{
let tokens = tokenizer::tokenize($input).unwrap();
let ast = parse(&tokens, &[]).unwrap();
match &ast[..] {
$output if $ifexpr => (),
x => panic!("Error in parse test, got {:?} instead", x)
}
}
}
}
#[test]
fn function_parse_test() {
use super::Function;
parsetest!(
"fn a() { 1 + 2 }",
&[FuncDefNode(Function {prototype: Prototype { ref name, ref parameters }, ref body})],
match &body[..] { &[ExprNode(BinExp(_, box Number(1.0), box Number(2.0)))] => true, _ => false }
&& **name == "a" && match &parameters[..] { &[] => true, _ => false }
);
parsetest!(
"fn a(x,y){ 1 + 2 }",
&[FuncDefNode(Function {prototype: Prototype { ref name, ref parameters }, ref body})],
match &body[..] { &[ExprNode(BinExp(_, box Number(1.0), box Number(2.0)))] => true, _ => false }
&& **name == "a" && *parameters[0] == "x" && *parameters[1] == "y" && parameters.len() == 2
);
let t3 = "fn (x) { x + 2 }";
let tokens3 = tokenizer::tokenize(t3).unwrap();
assert!(parse(&tokens3, &[]).is_err());
}
#[test]
fn expression_parse_test() {
parsetest!("a", &[ExprNode(Variable(ref s))], **s == "a");
parsetest!("a + b",
&[ExprNode(BinExp(BinOp::Add, box Variable(ref a), box Variable(ref b)))],
**a == "a" && **b == "b");
parsetest!("a + b * c",
&[ExprNode(BinExp(BinOp::Add, box Variable(ref a), box BinExp(BinOp::Mul, box Variable(ref b), box Variable(ref c))))],
**a == "a" && **b == "b" && **c == "c");
parsetest!("a * b + c",
&[ExprNode(BinExp(BinOp::Add, box BinExp(BinOp::Mul, box Variable(ref a), box Variable(ref b)), box Variable(ref c)))],
**a == "a" && **b == "b" && **c == "c");
parsetest!("(a + b) * c",
&[ExprNode(BinExp(BinOp::Mul, box BinExp(BinOp::Add, box Variable(ref a), box Variable(ref b)), box Variable(ref c)))],
**a == "a" && **b == "b" && **c == "c");
}
#[test]
fn lambda_parse_test() {
use schala_lang::tokenizer;
let t1 = "(fn(x) { x + 2 })";
let tokens1 = tokenizer::tokenize(t1).unwrap();
match parse(&tokens1, &[]).unwrap()[..] {
_ => (),
}
let t2 = "fn(x) { x + 2 }";
let tokens2 = tokenizer::tokenize(t2).unwrap();
assert!(parse(&tokens2, &[]).is_err());
let t3 = "(fn(x) { x + 10 })(20)";
let tokens3 = tokenizer::tokenize(t3).unwrap();
match parse(&tokens3, &[]).unwrap() {
_ => (),
};
}
#[test]
fn conditional_parse_test() {
use schala_lang::tokenizer;
let t1 = "if null { 20 } else { 40 }";
let tokens = tokenizer::tokenize(t1).unwrap();
match parse(&tokens, &[]).unwrap()[..] {
[ExprNode(Conditional(box Null, box Block(_), Some(box Block(_))))] => (),
_ => panic!(),
}
let t2 = r"
if null {
20
} else {
40
}
";
let tokens2 = tokenizer::tokenize(t2).unwrap();
match parse(&tokens2, &[]).unwrap()[..] {
[ExprNode(Conditional(box Null, box Block(_), Some(box Block(_))))] => (),
_ => panic!(),
}
let t2 = r"
if null {
20 } else
{
40
}
";
let tokens3 = tokenizer::tokenize(t2).unwrap();
match parse(&tokens3, &[]).unwrap()[..] {
[ExprNode(Conditional(box Null, box Block(_), Some(box Block(_))))] => (),
_ => panic!(),
}
}
}
*/
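The `precedence_expr` method above is a standard precedence-climbing loop: it consumes operators at or above a minimum precedence, recursing into the right-hand side whenever the lookahead operator binds tighter. A self-contained sketch of the same technique over a flat token slice (the `Tok`, `prec`, `climb`, and `eval` names here are illustrative, not part of this crate):

```rust
#[derive(Clone, Copy)]
enum Tok {
    Num(f64),
    Op(char),
}

// Precedence table, mirroring get_precedence: '*' and '/' bind tighter.
fn prec(op: char) -> u8 {
    match op {
        '+' | '-' => 10,
        '*' | '/' => 20,
        _ => 0,
    }
}

// Consume a single number token (the "primary expression" of this sketch).
fn primary(toks: &[Tok], pos: &mut usize) -> f64 {
    match toks[*pos] {
        Tok::Num(n) => {
            *pos += 1;
            n
        }
        _ => panic!("expected number"),
    }
}

// Evaluate while climbing: take operators whose precedence is at least
// min_prec; if the next operator binds tighter than the current one,
// recurse so it grabs the right-hand side first.
fn climb(toks: &[Tok], pos: &mut usize, mut lhs: f64, min_prec: u8) -> f64 {
    while let Some(&Tok::Op(op)) = toks.get(*pos) {
        let p = prec(op);
        if p < min_prec {
            break;
        }
        *pos += 1;
        let mut rhs = primary(toks, pos);
        while let Some(&Tok::Op(next)) = toks.get(*pos) {
            if prec(next) > p {
                let next_prec = prec(next);
                rhs = climb(toks, pos, rhs, next_prec);
            } else {
                break;
            }
        }
        lhs = match op {
            '+' => lhs + rhs,
            '-' => lhs - rhs,
            '*' => lhs * rhs,
            '/' => lhs / rhs,
            _ => unreachable!(),
        };
    }
    lhs
}

fn eval(toks: &[Tok]) -> f64 {
    let mut pos = 0;
    let lhs = primary(toks, &mut pos);
    climb(toks, &mut pos, lhs, 0)
}

fn main() {
    use Tok::*;
    // 1 + 2 * 3  →  7, because '*' (20) binds tighter than '+' (10)
    let toks = [Num(1.0), Op('+'), Num(2.0), Op('*'), Num(3.0)];
    println!("{}", eval(&toks));
}
```

The same shape gives left associativity for equal precedences (`8 - 2 - 1` is `(8 - 2) - 1`), since the inner loop only recurses on strictly greater precedence.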

208
maaru/src/tokenizer.rs Normal file

@@ -0,0 +1,208 @@
extern crate itertools;
use std::iter::Peekable;
use std::str::Chars;
use self::itertools::Itertools;
use std::rc::Rc;
use TokenError;
#[derive(Debug, Clone, PartialEq)]
pub enum Token {
Newline,
Semicolon,
LParen,
RParen,
LSquareBracket,
RSquareBracket,
LCurlyBrace,
RCurlyBrace,
Comma,
Period,
Colon,
NumLiteral(f64),
StrLiteral(Rc<String>),
Identifier(Rc<String>),
Operator(OpTok),
Keyword(Kw),
}
#[derive(Debug, Clone, PartialEq)]
pub struct OpTok(pub Rc<String>);
#[derive(Debug, Clone, PartialEq)]
pub enum Kw {
If,
Else,
While,
Let,
Fn,
Null,
}
pub type TokenizeResult = Result<Vec<Token>, TokenError>;
fn is_digit(c: &char) -> bool {
c.is_digit(10)
}
pub fn tokenize(input: &str) -> TokenizeResult {
use self::Token::*;
let mut tokens = Vec::new();
let mut iter: Peekable<Chars> = input.chars().peekable();
while let Some(c) = iter.next() {
if c == '#' {
while let Some(c) = iter.next() {
if c == '\n' {
break;
}
}
continue;
}
let cur_tok = match c {
c if char::is_whitespace(c) && c != '\n' => continue,
'\n' => Newline,
';' => Semicolon,
'(' => LParen,
')' => RParen,
':' => Colon,
',' => Comma,
'{' => LCurlyBrace,
'}' => RCurlyBrace,
'[' => LSquareBracket,
']' => RSquareBracket,
'"' => tokenize_str(&mut iter)?,
c if c == '.' || is_digit(&c) => tokenize_number_or_period(c, &mut iter)?,
c if !char::is_alphanumeric(c) => tokenize_operator(c, &mut iter)?,
c => tokenize_identifier(c, &mut iter)?,
};
tokens.push(cur_tok);
}
Ok(tokens)
}
fn tokenize_str(iter: &mut Peekable<Chars>) -> Result<Token, TokenError> {
let mut buffer = String::new();
loop {
// TODO handle string escapes, interpolation
match iter.next() {
Some(x) if x == '"' => break,
Some(x) => buffer.push(x),
None => return Err(TokenError::new("Unclosed quote")),
}
}
Ok(Token::StrLiteral(Rc::new(buffer)))
}
fn tokenize_operator(c: char, iter: &mut Peekable<Chars>) -> Result<Token, TokenError> {
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| !char::is_alphanumeric(*x) && !char::is_whitespace(*x)));
Ok(Token::Operator(OpTok(Rc::new(buffer))))
}
fn tokenize_number_or_period(c: char, iter: &mut Peekable<Chars>) -> Result<Token, TokenError> {
if c == '.' && !iter.peek().map_or(false, is_digit) {
return Ok(Token::Period);
}
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| is_digit(x) || *x == '.'));
match buffer.parse::<f64>() {
Ok(f) => Ok(Token::NumLiteral(f)),
Err(_) => Err(TokenError::new("Failed to parse digit")),
}
}
fn tokenize_identifier(c: char, iter: &mut Peekable<Chars>) -> Result<Token, TokenError> {
fn ends_identifier(c: &char) -> bool {
let c = *c;
char::is_whitespace(c) || is_digit(&c) || c == ';' || c == '(' || c == ')' ||
c == ',' || c == '.' || c == ':' || c == '[' || c == ']'
}
use self::Token::*;
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| !ends_identifier(x)));
Ok(match &buffer[..] {
"if" => Keyword(Kw::If),
"else" => Keyword(Kw::Else),
"while" => Keyword(Kw::While),
"let" => Keyword(Kw::Let),
"fn" => Keyword(Kw::Fn),
"null" => Keyword(Kw::Null),
b => Identifier(Rc::new(b.to_string())),
})
}
/*
#[cfg(test)]
mod tests {
use super::*;
use super::Token::*;
macro_rules! token_test {
($input: expr, $output: pat, $ifexpr: expr) => {
let tokens = tokenize($input).unwrap();
match tokens[..] {
$output if $ifexpr => (),
_ => panic!("Actual output: {:?}", tokens),
}
}
}
#[test]
fn basic_tokeniziation_tests() {
token_test!("let a = 3\n",
[Keyword(Kw::Let), Identifier(ref a), Operator(OpTok(ref b)), NumLiteral(3.0), Newline],
**a == "a" && **b == "=");
token_test!("2+1",
[NumLiteral(2.0), Operator(OpTok(ref a)), NumLiteral(1.0)],
**a == "+");
token_test!("2 + 1",
[NumLiteral(2.0), Operator(OpTok(ref a)), NumLiteral(1.0)],
**a == "+");
token_test!("2.3*49.2",
[NumLiteral(2.3), Operator(OpTok(ref a)), NumLiteral(49.2)],
**a == "*");
token_test!("a+3",
[Identifier(ref a), NumLiteral(3.0)],
**a == "a+");
assert!(tokenize("2.4.5").is_err());
token_test!("fn my_func(a) { a ? 3[1] }",
[Keyword(Kw::Fn), Identifier(ref a), LParen, Identifier(ref b), RParen, LCurlyBrace, Identifier(ref c),
Operator(OpTok(ref d)), NumLiteral(3.0), LSquareBracket, NumLiteral(1.0), RSquareBracket, RCurlyBrace],
**a == "my_func" && **b == "a" && **c == "a" && **d == "?");
}
#[test]
fn string_test() {
token_test!("null + \"a string\"",
[Keyword(Kw::Null), Operator(OpTok(ref a)), StrLiteral(ref b)],
**a == "+" && **b == "a string");
token_test!("\"{?'q@?\"",
[StrLiteral(ref a)],
**a == "{?'q@?");
}
#[test]
fn operator_test() {
token_test!("a *> b",
[Identifier(ref a), Operator(OpTok(ref b)), Identifier(ref c)],
**a == "a" && **b == "*>" && **c == "b");
}
}
*/
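The itertools `peeking_take_while` calls above can be reproduced with a plain `std` `Peekable` loop. Here is a std-only sketch of the `tokenize_number_or_period` logic: a lone `.` becomes a `Period` token, otherwise digits and dots are consumed greedily and handed to `f64::parse` (so `2.4.5` fails, as the tests expect). The `Tok` type and function name are simplified, not the crate's:

```rust
use std::iter::Peekable;
use std::str::Chars;

#[derive(Debug, PartialEq)]
enum Tok {
    Period,
    Num(f64),
}

fn number_or_period(c: char, iter: &mut Peekable<Chars>) -> Result<Tok, String> {
    // A '.' not followed by a digit is the standalone Period token.
    if c == '.' && !iter.peek().map_or(false, |d| d.is_ascii_digit()) {
        return Ok(Tok::Period);
    }
    let mut buf = String::new();
    buf.push(c);
    // Manual equivalent of peeking_take_while: consume only while the
    // lookahead character is still part of the number.
    while let Some(&d) = iter.peek() {
        if d.is_ascii_digit() || d == '.' {
            buf.push(d);
            iter.next();
        } else {
            break;
        }
    }
    buf.parse::<f64>()
        .map(Tok::Num)
        .map_err(|_| format!("Failed to parse number: {}", buf))
}

fn main() {
    let mut it = "14.5+".chars().peekable();
    let first = it.next().unwrap();
    // prints Ok(Num(14.5)); the '+' is left in the iterator for the caller
    println!("{:?}", number_or_period(first, &mut it));
}
```

Because the loop only advances past characters it accepts, the terminating character (here `+`) stays in the iterator for the next tokenizer dispatch, which is the whole point of peeking rather than taking.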

11
robo/Cargo.toml Normal file

@@ -0,0 +1,11 @@
[package]
name = "robo-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
itertools = "0.5.8"
take_mut = "0.1.3"
llvm-sys = "*"
schala-repl = { path = "../schala-repl" }

158
robo/src/lib.rs Normal file

@@ -0,0 +1,158 @@
#![feature(box_patterns)]
extern crate itertools;
extern crate schala_repl;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions};
pub struct Robo {
}
impl Robo {
pub fn new() -> Robo {
Robo { }
}
}
#[derive(Debug)]
pub struct TokenError {
pub msg: String,
}
impl TokenError {
pub fn new(msg: &str) -> TokenError {
TokenError { msg: msg.to_string() }
}
}
#[allow(dead_code)]
#[derive(Debug)]
pub enum Token {
StrLiteral(String),
Backtick,
Newline,
LParen,
RParen,
LBracket,
RBracket,
LBrace,
RBrace,
Period,
Comma,
Colon,
Semicolon,
SingleQuote,
Identifier(String),
Operator(String),
NumLiteral(Number),
}
#[allow(dead_code)]
#[derive(Debug)]
pub enum Number {
IntegerRep(String),
FloatRep(String)
}
#[allow(dead_code)]
pub type AST = Vec<ASTNode>;
#[allow(dead_code)]
#[derive(Debug)]
pub enum ASTNode {
FunctionDefinition(String, Expression),
ImportStatement(String),
}
#[allow(dead_code)]
#[derive(Debug)]
pub enum Expression {
}
fn tokenize(input: &str) -> Result<Vec<Token>, TokenError> {
use self::Token::*;
let mut tokens = Vec::new();
let mut iter = input.chars().peekable();
while let Some(c) = iter.next() {
if c == ';' {
while let Some(c) = iter.next() {
if c == '\n' {
break;
}
}
continue;
}
let cur_tok = match c {
c if char::is_whitespace(c) && c != '\n' => continue,
'\n' => Newline,
'(' => LParen,
')' => RParen,
'[' => LBracket,
']' => RBracket,
'{' => LBrace,
'}' => RBrace,
',' => Comma,
':' => Colon,
';' => Semicolon,
'.' => Period,
'`' => Backtick,
'\'' => SingleQuote,
'"' => {
let mut buffer = String::new();
loop {
match iter.next() {
Some(x) if x == '"' => break,
Some(x) => buffer.push(x),
None => return Err(TokenError::new("Unclosed quote")),
}
}
StrLiteral(buffer)
}
c if c.is_digit(10) => {
let mut integer = true;
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| x.is_digit(10)));
if let Some(&'.') = iter.peek() {
buffer.push(iter.next().unwrap());
integer = false;
}
buffer.extend(iter.peeking_take_while(|x| x.is_digit(10)));
let inner = if integer {
Number::IntegerRep(buffer)
} else {
Number::FloatRep(buffer)
};
NumLiteral(inner)
},
c if char::is_alphanumeric(c) => {
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| char::is_alphanumeric(*x)));
Identifier(buffer)
},
c => {
let mut buffer = String::new();
buffer.push(c);
buffer.extend(iter.peeking_take_while(|x| !char::is_whitespace(*x)));
Operator(buffer)
}
};
tokens.push(cur_tok);
}
Ok(tokens)
}
impl ProgrammingLanguageInterface for Robo {
fn get_language_name(&self) -> String {
"Robo".to_string()
}
fn get_source_file_suffix(&self) -> String {
format!("robo")
}
}

11
rukka/Cargo.toml Normal file

@@ -0,0 +1,11 @@
[package]
name = "rukka-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
itertools = "0.5.8"
take_mut = "0.1.3"
llvm-sys = "*"
schala-repl = { path = "../schala-repl" }

417
rukka/src/lib.rs Normal file

@@ -0,0 +1,417 @@
#![feature(box_patterns)]
extern crate itertools;
extern crate schala_repl;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions};
use std::iter::Peekable;
use std::vec::IntoIter;
use std::str::Chars;
use std::collections::HashMap;
pub struct EvaluatorState {
binding_stack: Vec<HashMap<String, Sexp>>
}
impl EvaluatorState {
fn new() -> EvaluatorState {
use self::Sexp::Primitive;
use self::PrimitiveFn::*;
let mut default_map = HashMap::new();
default_map.insert(format!("+"), Primitive(Plus));
default_map.insert(format!("-"), Primitive(Minus));
default_map.insert(format!("*"), Primitive(Mult));
default_map.insert(format!("/"), Primitive(Div));
default_map.insert(format!("%"), Primitive(Mod));
default_map.insert(format!(">"), Primitive(Greater));
default_map.insert(format!("<"), Primitive(Less));
default_map.insert(format!("<="), Primitive(LessThanOrEqual));
default_map.insert(format!(">="), Primitive(GreaterThanOrEqual));
default_map.insert(format!("display"), Primitive(Display));
EvaluatorState {
binding_stack: vec![default_map],
}
}
fn set_var(&mut self, var: String, value: Sexp) {
let binding = self.binding_stack.last_mut().unwrap();
binding.insert(var, value);
}
fn get_var(&self, var: &str) -> Option<&Sexp> {
for bindings in self.binding_stack.iter().rev() {
match bindings.get(var) {
Some(x) => return Some(x),
None => (),
}
}
None
}
fn push_env(&mut self) {
self.binding_stack.push(HashMap::new());
}
fn pop_env(&mut self) {
self.binding_stack.pop();
}
}
pub struct Rukka {
state: EvaluatorState
}
impl Rukka {
pub fn new() -> Rukka { Rukka { state: EvaluatorState::new() } }
}
impl ProgrammingLanguageInterface for Rukka {
fn get_language_name(&self) -> String {
"Rukka".to_string()
}
fn get_source_file_suffix(&self) -> String {
format!("rukka")
}
}
impl EvaluatorState {
fn eval(&mut self, expr: Sexp) -> Result<Sexp, String> {
use self::Sexp::*;
Ok(match expr {
SymbolAtom(ref sym) => match self.get_var(sym) {
Some(ref sexp) => {
let q: &Sexp = sexp; //WTF? if I delete this line, the copy doesn't work??
q.clone() //TODO make this not involve a clone
},
None => return Err(format!("Variable {} not bound", sym)),
},
expr @ Primitive(_) => expr,
expr @ FnLiteral { .. } => expr,
expr @ StringAtom(_) => expr,
expr @ NumberAtom(_) => expr,
expr @ BoolAtom(_) => expr,
Cons(box operator, box operands) => match operator {
SymbolAtom(ref sym) if match &sym[..] {
"quote" | "eq?" | "cons" | "car" | "cdr" | "atom?" | "define" | "lambda" | "if" | "cond" => true, _ => false
} => self.eval_special_form(sym, operands)?,
_ => {
let evaled = self.eval(operator)?;
self.apply(evaled, operands)?
}
},
Nil => Nil,
})
}
fn eval_special_form(&mut self, form: &str, operands: Sexp) -> Result<Sexp, String> {
use self::Sexp::*;
Ok(match form {
"quote" => match operands {
Cons(box quoted, box Nil) => quoted,
_ => return Err(format!("Bad syntax in quote")),
},
"eq?" => match operands {//TODO make correct
Cons(box lhs, box Cons(box rhs, _)) => BoolAtom(lhs == rhs),
_ => BoolAtom(true),
},
"cons" => match operands {
Cons(box cadr, box Cons(box caddr, box Nil)) => {
let newl = self.eval(cadr)?;
let newr = self.eval(caddr)?;
Cons(Box::new(newl), Box::new(newr))
},
_ => return Err(format!("Bad arguments for cons")),
},
"car" => match operands {
Cons(box car, _) => car,
_ => return Err(format!("called car with a non-pair argument")),
},
"cdr" => match operands {
Cons(_, box cdr) => cdr,
_ => return Err(format!("called cdr with a non-pair argument")),
},
"atom?" => match operands {
Cons(_, _) => BoolAtom(false),
_ => BoolAtom(true),
},
"define" => match operands {
Cons(box SymbolAtom(sym), box Cons(box expr, box Nil)) => {
let evaluated = self.eval(expr)?;
self.set_var(sym, evaluated);
Nil
},
_ => return Err(format!("Bad assignment")),
}
"lambda" => match operands {
Cons(box paramlist, box Cons(box formalexp, box Nil)) => {
let mut formal_params = vec![];
{
let mut ptr = &paramlist;
loop {
match ptr {
&Cons(ref arg, ref rest) => {
if let SymbolAtom(ref sym) = **arg {
formal_params.push(sym.clone());
ptr = rest;
} else {
return Err(format!("Bad lambda format"));
}
},
_ => break,
}
}
}
FnLiteral {
formal_params,
body: Box::new(formalexp)
}
},
_ => return Err(format!("Bad lambda expression")),
},
"if" => match operands {
Cons(box test, box body) => {
let truth_value = test.truthy();
match (truth_value, body) {
(true, Cons(box consequent, _)) => consequent,
(false, Cons(_, box Cons(box alternative, _))) => alternative,
_ => return Err(format!("Bad if expression"))
}
},
_ => return Err(format!("Bad if expression"))
},
s => return Err(format!("Non-existent special form {}; this should never happen", s)),
})
}
fn apply(&mut self, function: Sexp, operands: Sexp) -> Result<Sexp, String> {
use self::Sexp::*;
match function {
FnLiteral { formal_params, body } => {
self.push_env();
let mut cur = operands;
for param in formal_params {
match cur {
Cons(box arg, box rest) => {
cur = rest;
self.set_var(param, arg);
},
_ => return Err(format!("Bad argument for function application")),
}
}
let result = self.eval(*body);
self.pop_env();
result
},
Primitive(prim) => {
let mut evaled_operands = Vec::new();
let mut cur_operand = operands;
loop {
match cur_operand {
Nil => break,
Cons(box l, box rest) => {
evaled_operands.push(self.eval(l)?);
cur_operand = rest;
},
_ => return Err(format!("Bad operands list"))
}
}
prim.apply(evaled_operands)
}
_ => return Err(format!("Bad type to apply")),
}
}
}
fn read(input: &str) -> Result<Vec<Sexp>, String> {
let mut chars: Peekable<Chars> = input.chars().peekable();
let mut tokens = tokenize(&mut chars).into_iter().peekable();
let mut sexps = Vec::new();
while let Some(_) = tokens.peek() {
sexps.push(parse(&mut tokens)?);
}
Ok(sexps)
}
#[derive(Debug)]
enum Token {
LParen,
RParen,
Quote,
Word(String),
StringLiteral(String),
NumLiteral(u64),
}
//TODO make this notion of Eq more sophisticated
#[derive(Debug, PartialEq, Clone)]
enum Sexp {
SymbolAtom(String),
StringAtom(String),
NumberAtom(u64),
BoolAtom(bool),
Cons(Box<Sexp>, Box<Sexp>),
Nil,
FnLiteral {
formal_params: Vec<String>,
body: Box<Sexp>
},
Primitive(PrimitiveFn)
}
#[derive(Debug, PartialEq, Clone)]
enum PrimitiveFn {
Plus, Minus, Mult, Div, Mod, Greater, Less, GreaterThanOrEqual, LessThanOrEqual, Display
}
impl PrimitiveFn {
fn apply(&self, evaled_operands: Vec<Sexp>) -> Result<Sexp, String> {
use self::Sexp::*;
use self::PrimitiveFn::*;
let op = self.clone();
Ok(match op {
Display => {
for arg in evaled_operands {
print!("{}\n", arg.print());
}
Nil
},
Plus | Mult => {
let mut result = match op { Plus => 0, Mult => 1, _ => unreachable!() };
for arg in evaled_operands {
if let NumberAtom(n) = arg {
if let Plus = op {
result += n;
} else if let Mult = op {
result *= n;
}
} else {
return Err(format!("Bad operand: {:?}", arg));
}
}
NumberAtom(result)
},
op => return Err(format!("Primitive op {:?} not implemented", op)),
})
}
}
impl Sexp {
fn print(&self) -> String {
use self::Sexp::*;
match self {
&BoolAtom(true) => format!("#t"),
&BoolAtom(false) => format!("#f"),
&SymbolAtom(ref sym) => format!("{}", sym),
&StringAtom(ref s) => format!("\"{}\"", s),
&NumberAtom(ref n) => format!("{}", n),
&Cons(ref car, ref cdr) => format!("({} . {})", car.print(), cdr.print()),
&Nil => format!("()"),
&FnLiteral { ref formal_params, .. } => format!("<lambda {:?}>", formal_params),
&Primitive(ref sym) => format!("<primitive \"{:?}\">", sym),
}
}
fn truthy(&self) -> bool {
use self::Sexp::*;
match self {
&BoolAtom(false) => false,
_ => true
}
}
}
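The `Sexp::Cons` representation and the dotted-pair printing above can be illustrated with a trimmed-down standalone type (not the crate's own `Sexp`): the list `(1 2)` is `Cons(1, Cons(2, Nil))`, and rendering recurses down the cdr chain:

```rust
#[derive(Debug, PartialEq)]
enum S {
    Num(u64),
    Cons(Box<S>, Box<S>),
    Nil,
}

// Render a cell in dotted-pair notation, as Sexp::print does above.
fn print(s: &S) -> String {
    match s {
        S::Num(n) => n.to_string(),
        S::Cons(car, cdr) => format!("({} . {})", print(car), print(cdr)),
        S::Nil => "()".to_string(),
    }
}

fn main() {
    // The proper list (1 2): each element is a cons cell, terminated by Nil.
    let list = S::Cons(
        Box::new(S::Num(1)),
        Box::new(S::Cons(Box::new(S::Num(2)), Box::new(S::Nil))),
    );
    println!("{}", print(&list)); // (1 . (2 . ()))
}
```

A list-aware printer would collapse `(1 . (2 . ()))` to `(1 2)`; the dotted form shown here makes the underlying cell structure explicit, which is what the evaluator's `car`/`cdr` special forms operate on.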
fn tokenize(input: &mut Peekable<Chars>) -> Vec<Token> {
use self::Token::*;
let mut tokens = Vec::new();
loop {
match input.next() {
None => break,
Some('(') => tokens.push(LParen),
Some(')') => tokens.push(RParen),
Some('\'') => tokens.push(Quote),
Some(c) if c.is_whitespace() => continue,
Some(c) if c.is_numeric() => {
let tok: String = input.peeking_take_while(|next| next.is_numeric()).collect();
let n: u64 = format!("{}{}", c, tok).parse().unwrap();
tokens.push(NumLiteral(n));
},
Some('"') => {
let string: String = input.scan(false, |escape, cur_char| {
let seen_escape = *escape;
*escape = cur_char == '\\' && !seen_escape;
match (cur_char, seen_escape) {
('"', false) => None,
('\\', false) => Some(None),
(c, _) => Some(Some(c))
}
}).filter_map(|x| x).collect();
tokens.push(StringLiteral(string));
}
Some(c) => {
let sym: String = input.peeking_take_while(|next| {
match *next {
'(' | ')' => false,
c if c.is_whitespace() => false,
_ => true
}
}).collect();
tokens.push(Word(format!("{}{}", c, sym)));
}
}
}
tokens
}
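The tokenizer above leans on itertools' `peeking_take_while` to accumulate a multi-character token without consuming the character that terminates it. A minimal std-only sketch of the same peek-then-consume loop (the function name here is illustrative, not from this crate):

```rust
use std::iter::Peekable;
use std::str::Chars;

// Collect characters while `pred` holds, peeking before each consume so
// the terminating character is left in the stream for the caller.
fn take_while_peeking(input: &mut Peekable<Chars>, pred: impl Fn(char) -> bool) -> String {
    let mut out = String::new();
    while let Some(&c) = input.peek() {
        if pred(c) {
            out.push(c);
            input.next();
        } else {
            break;
        }
    }
    out
}

fn main() {
    let mut it = "123)".chars().peekable();
    let digits = take_while_peeking(&mut it, |c| c.is_numeric());
    assert_eq!(digits, "123");
    assert_eq!(it.next(), Some(')')); // the ')' was not consumed
}
```

This is why the `Some(c) if c.is_numeric()` arm can prepend the already-consumed first character and still stop cleanly at the closing paren.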
fn parse(tokens: &mut Peekable<IntoIter<Token>>) -> Result<Sexp, String> {
use self::Token::*;
use self::Sexp::*;
match tokens.next() {
Some(Word(ref s)) if s == "#f" => Ok(BoolAtom(false)),
Some(Word(ref s)) if s == "#t" => Ok(BoolAtom(true)),
Some(Word(s)) => Ok(SymbolAtom(s)),
Some(StringLiteral(s)) => Ok(StringAtom(s)),
Some(LParen) => parse_sexp(tokens),
Some(RParen) => Err(format!("Unexpected ')'")),
Some(Quote) => {
let quoted = parse(tokens)?;
Ok(Cons(Box::new(SymbolAtom(format!("quote"))), Box::new(Cons(Box::new(quoted), Box::new(Nil)))))
},
Some(NumLiteral(n)) => Ok(NumberAtom(n)),
None => Err(format!("Unexpected end of input")),
}
}
fn parse_sexp(tokens: &mut Peekable<IntoIter<Token>>) -> Result<Sexp, String> {
use self::Token::*;
use self::Sexp::*;
let mut cell = Nil;
{
let mut cell_ptr = &mut cell;
loop {
match tokens.peek() {
None => return Err(format!("Unexpected end of input")),
Some(&RParen) => {
tokens.next();
break;
},
_ => {
let current = parse(tokens)?;
let new_cdr = Cons(Box::new(current), Box::new(Nil));
match cell_ptr {
&mut Cons(_, ref mut cdr) => **cdr = new_cdr,
&mut Nil => *cell_ptr = new_cdr,
_ => unreachable!()
};
let old_ptr = cell_ptr;
let new_ptr: &mut Sexp = match old_ptr { &mut Cons(_, ref mut cdr) => cdr, _ => unreachable!() } as &mut Sexp;
cell_ptr = new_ptr;
}
}
}
}
Ok(cell)
}
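`parse_sexp` builds a proper list front-to-back by holding a mutable "tail pointer" into the cons structure and advancing it after each append (the `old_ptr`/`new_ptr` dance and inner scope suggest it predates NLL). A standalone sketch of the same technique with a simplified `Sexp`, written in the direct style the modern borrow checker accepts:

```rust
// Simplified stand-in for the crate's Sexp type.
#[derive(Debug, PartialEq)]
enum Sexp {
    Num(u64),
    Cons(Box<Sexp>, Box<Sexp>),
    Nil,
}

// Build (1 2 ...) as Cons(1, Cons(2, ... Nil)) by appending at a tail pointer.
fn list_from(items: Vec<u64>) -> Sexp {
    let mut cell = Sexp::Nil;
    let mut tail = &mut cell;
    for n in items {
        // `*tail` is always Nil here; replace it with a fresh cons cell…
        *tail = Sexp::Cons(Box::new(Sexp::Num(n)), Box::new(Sexp::Nil));
        // …then advance the tail pointer into that cell's cdr.
        tail = match tail {
            Sexp::Cons(_, cdr) => &mut **cdr,
            _ => unreachable!(),
        };
    }
    cell
}

fn main() {
    use Sexp::*;
    let l = list_from(vec![1, 2]);
    assert_eq!(
        l,
        Cons(Box::new(Num(1)), Box::new(Cons(Box::new(Num(2)), Box::new(Nil))))
    );
}
```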

@@ -0,0 +1,12 @@
[package]
name = "schala-lang-codegen"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[lib]
proc-macro = true
[dependencies]
syn = { version = "0.15.12", features = ["full", "extra-traits", "fold"] }
quote = "0.6.8"

@@ -0,0 +1,54 @@
#![feature(box_patterns)]
#![recursion_limit="128"]
extern crate proc_macro;
#[macro_use]
extern crate quote;
#[macro_use]
extern crate syn;
use self::proc_macro::TokenStream;
use self::syn::fold::Fold;
struct RecursiveDescentFn {
}
impl Fold for RecursiveDescentFn {
fn fold_item_fn(&mut self, mut i: syn::ItemFn) -> syn::ItemFn {
let box block = i.block;
let ref ident = i.ident;
let new_block: syn::Block = parse_quote! {
{
let next_token_before_parse = self.token_handler.peek();
let record = ParseRecord {
production_name: stringify!(#ident).to_string(),
next_token: format!("{}", next_token_before_parse.to_string_with_metadata()),
level: self.parse_level,
};
self.parse_level += 1;
self.parse_record.push(record);
let result = { #block };
if self.parse_level != 0 {
self.parse_level -= 1;
}
result.map_err(|mut parse_error: ParseError| {
parse_error.production_name = Some(stringify!(#ident).to_string());
parse_error
})
}
};
i.block = Box::new(new_block);
i
}
}
#[proc_macro_attribute]
pub fn recursive_descent_method(_attr: TokenStream, item: TokenStream) -> TokenStream {
let input: syn::ItemFn = parse_macro_input!(item as syn::ItemFn);
let mut folder = RecursiveDescentFn {};
let output = folder.fold_item_fn(input);
TokenStream::from(quote!(#output))
}

@@ -0,0 +1,20 @@
[package]
name = "schala-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[dependencies]
itertools = "0.8.0"
take_mut = "0.2.2"
maplit = "1.0.1"
lazy_static = "1.3.0"
failure = "0.1.5"
ena = "0.11.0"
stopwatch = "0.0.7"
derivative = "1.0.3"
colored = "1.8"
radix_trie = "0.1.5"
schala-lang-codegen = { path = "../codegen" }
schala-repl = { path = "../../schala-repl" }

@@ -0,0 +1,307 @@
use std::rc::Rc;
use crate::derivative::Derivative;
mod walker;
mod visitor;
mod visitor_test;
mod operators;
pub use operators::*;
pub use visitor::ASTVisitor;
pub use walker::walk_ast;
/// An abstract identifier for an AST node
#[derive(Debug, PartialEq, Eq, Hash, Clone)]
pub struct ItemId {
idx: u32,
}
impl ItemId {
fn new(n: u32) -> ItemId {
ItemId { idx: n }
}
}
pub struct ItemIdStore {
last_idx: u32
}
impl ItemIdStore {
pub fn new() -> ItemIdStore {
ItemIdStore { last_idx: 0 }
}
/// Always returns an ItemId with internal value zero
#[cfg(test)]
pub fn new_id() -> ItemId {
ItemId { idx: 0 }
}
/// This limits the size of the AST to 2^32 tree elements
pub fn fresh(&mut self) -> ItemId {
let idx = self.last_idx;
self.last_idx += 1;
ItemId::new(idx)
}
}
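`ItemIdStore::fresh` is a plain monotonic counter; each AST node gets the next `u32`, which is where the 2^32-elements limit in the doc comment comes from. A self-contained sketch of the same pattern:

```rust
// Simplified stand-in for ItemIdStore: hands out 0, 1, 2, … in order.
struct IdStore {
    last_idx: u32,
}

impl IdStore {
    fn new() -> IdStore {
        IdStore { last_idx: 0 }
    }

    fn fresh(&mut self) -> u32 {
        let idx = self.last_idx;
        // In debug builds this panics on overflow once 2^32 ids are
        // issued — the limit the doc comment above refers to.
        self.last_idx += 1;
        idx
    }
}

fn main() {
    let mut store = IdStore::new();
    assert_eq!(store.fresh(), 0);
    assert_eq!(store.fresh(), 1);
    assert_eq!(store.fresh(), 2);
}
```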
#[derive(Derivative, Debug)]
#[derivative(PartialEq)]
pub struct AST {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub statements: Vec<Statement>
}
#[derive(Derivative, Debug, Clone)]
#[derivative(PartialEq)]
pub struct Statement {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub kind: StatementKind,
}
#[derive(Debug, PartialEq, Clone)]
pub enum StatementKind {
Expression(Expression),
Declaration(Declaration),
Import(ImportSpecifier),
Module(ModuleSpecifier),
}
pub type Block = Vec<Statement>;
pub type ParamName = Rc<String>;
#[derive(Debug, Derivative, Clone)]
#[derivative(PartialEq)]
pub struct QualifiedName {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub components: Vec<Rc<String>>,
}
#[derive(Debug, PartialEq, Clone)]
pub struct FormalParam {
pub name: ParamName,
pub default: Option<Expression>,
pub anno: Option<TypeIdentifier>
}
#[derive(Debug, PartialEq, Clone)]
pub enum Declaration {
FuncSig(Signature),
FuncDecl(Signature, Block),
TypeDecl {
name: TypeSingletonName,
body: TypeBody,
mutable: bool
},
//TODO this needs to be more sophisticated
TypeAlias {
alias: Rc<String>,
original: Rc<String>,
},
Binding {
name: Rc<String>,
constant: bool,
type_anno: Option<TypeIdentifier>,
expr: Expression,
},
Impl {
type_name: TypeIdentifier,
interface_name: Option<TypeSingletonName>,
block: Vec<Declaration>,
},
Interface {
name: Rc<String>,
signatures: Vec<Signature>
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct Signature {
pub name: Rc<String>,
pub operator: bool,
pub params: Vec<FormalParam>,
pub type_anno: Option<TypeIdentifier>,
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeBody(pub Vec<Variant>);
#[derive(Debug, PartialEq, Clone)]
pub enum Variant {
UnitStruct(Rc<String>),
TupleStruct(Rc<String>, Vec<TypeIdentifier>),
Record {
name: Rc<String>,
members: Vec<(Rc<String>, TypeIdentifier)>,
}
}
#[derive(Debug, Derivative, Clone)]
#[derivative(PartialEq)]
pub struct Expression {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub kind: ExpressionKind,
pub type_anno: Option<TypeIdentifier>
}
impl Expression {
pub fn new(id: ItemId, kind: ExpressionKind) -> Expression {
Expression { id, kind, type_anno: None }
}
pub fn with_anno(id: ItemId, kind: ExpressionKind, type_anno: TypeIdentifier) -> Expression {
Expression { id, kind, type_anno: Some(type_anno) }
}
}
#[derive(Debug, PartialEq, Clone)]
pub enum TypeIdentifier {
Tuple(Vec<TypeIdentifier>),
Singleton(TypeSingletonName)
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeSingletonName {
pub name: Rc<String>,
pub params: Vec<TypeIdentifier>,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ExpressionKind {
NatLiteral(u64),
FloatLiteral(f64),
StringLiteral(Rc<String>),
BoolLiteral(bool),
BinExp(BinOp, Box<Expression>, Box<Expression>),
PrefixExp(PrefixOp, Box<Expression>),
TupleLiteral(Vec<Expression>),
Value(QualifiedName),
NamedStruct {
name: QualifiedName,
fields: Vec<(Rc<String>, Expression)>,
},
Call {
f: Box<Expression>,
arguments: Vec<InvocationArgument>,
},
Index {
indexee: Box<Expression>,
indexers: Vec<Expression>,
},
IfExpression {
discriminator: Option<Box<Expression>>,
body: Box<IfExpressionBody>,
},
WhileExpression {
condition: Option<Box<Expression>>,
body: Block,
},
ForExpression {
enumerators: Vec<Enumerator>,
body: Box<ForBody>,
},
Lambda {
params: Vec<FormalParam>,
type_anno: Option<TypeIdentifier>,
body: Block,
},
ListLiteral(Vec<Expression>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum InvocationArgument {
Positional(Expression),
Keyword {
name: Rc<String>,
expr: Expression,
},
Ignored
}
#[derive(Debug, PartialEq, Clone)]
pub enum IfExpressionBody {
SimpleConditional {
then_case: Block,
else_case: Option<Block>
},
SimplePatternMatch {
pattern: Pattern,
then_case: Block,
else_case: Option<Block>
},
CondList(Vec<ConditionArm>)
}
#[derive(Debug, PartialEq, Clone)]
pub struct ConditionArm {
pub condition: Condition,
pub guard: Option<Expression>,
pub body: Block,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Condition {
Pattern(Pattern),
TruncatedOp(BinOp, Expression),
Expression(Expression),
Else,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Pattern {
Ignored,
TuplePattern(Vec<Pattern>),
Literal(PatternLiteral),
TupleStruct(QualifiedName, Vec<Pattern>),
Record(QualifiedName, Vec<(Rc<String>, Pattern)>),
VarOrName(QualifiedName),
}
#[derive(Debug, PartialEq, Clone)]
pub enum PatternLiteral {
NumPattern {
neg: bool,
num: ExpressionKind,
},
StringPattern(Rc<String>),
BoolPattern(bool),
}
#[derive(Debug, PartialEq, Clone)]
pub struct Enumerator {
pub id: Rc<String>,
pub generator: Expression,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ForBody {
MonadicReturn(Expression),
StatementBlock(Block),
}
#[derive(Debug, Derivative, Clone)]
#[derivative(PartialEq)]
pub struct ImportSpecifier {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub path_components: Vec<Rc<String>>,
pub imported_names: ImportedNames
}
#[derive(Debug, PartialEq, Clone)]
pub enum ImportedNames {
All,
LastOfPath,
List(Vec<Rc<String>>)
}
#[derive(Debug, PartialEq, Clone)]
pub struct ModuleSpecifier {
pub name: Rc<String>,
pub contents: Vec<Statement>,
}

@@ -0,0 +1,108 @@
use std::rc::Rc;
use std::str::FromStr;
use crate::tokenizing::TokenKind;
use crate::builtin::Builtin;
#[derive(Debug, PartialEq, Clone)]
pub struct PrefixOp {
sigil: Rc<String>,
pub builtin: Option<Builtin>,
}
impl PrefixOp {
#[allow(dead_code)]
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn is_prefix(op: &str) -> bool {
match op {
"+" => true,
"-" => true,
"!" => true,
_ => false
}
}
}
impl FromStr for PrefixOp {
type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> {
use Builtin::*;
let builtin = match s {
"+" => Ok(Increment),
"-" => Ok(Negate),
"!" => Ok(BooleanNot),
_ => Err(())
};
builtin.map(|builtin| PrefixOp { sigil: Rc::new(s.to_string()), builtin: Some(builtin) })
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct BinOp {
sigil: Rc<String>,
pub builtin: Option<Builtin>,
}
impl BinOp {
pub fn from_sigil(sigil: &str) -> BinOp {
let builtin = Builtin::from_str(sigil).ok();
BinOp { sigil: Rc::new(sigil.to_string()), builtin }
}
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn from_sigil_token(tok: &TokenKind) -> Option<BinOp> {
let s = token_kind_to_sigil(tok)?;
Some(BinOp::from_sigil(s))
}
pub fn min_precedence() -> i32 {
i32::min_value()
}
pub fn get_precedence_from_token(op_tok: &TokenKind) -> Option<i32> {
let s = token_kind_to_sigil(op_tok)?;
Some(binop_precedences(s))
}
}
fn token_kind_to_sigil<'a>(tok: &'a TokenKind) -> Option<&'a str> {
use self::TokenKind::*;
Some(match tok {
Operator(op) => op.as_str(),
Period => ".",
Pipe => "|",
Slash => "/",
LAngleBracket => "<",
RAngleBracket => ">",
Equals => "=",
_ => return None
})
}
fn binop_precedences(s: &str) -> i32 {
let default = 10_000_000;
match s {
"+" => 10,
"-" => 10,
"*" => 20,
"/" => 20,
"%" => 20,
"++" => 30,
"^" => 30,
"&" => 20,
"|" => 20,
">" => 20,
">=" => 20,
"<" => 20,
"<=" => 20,
"==" => 40,
"=" => 10,
"<=>" => 30,
_ => default,
}
}
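A precedence table like `binop_precedences` (higher number = binds tighter) is typically consumed by a precedence-climbing loop: bind an operator only if its precedence meets the current minimum, and recurse with a higher minimum so equal-precedence operators associate left. A hedged, self-contained sketch over a toy single-digit grammar — not the crate's actual parser:

```rust
// Toy precedence table: higher binds tighter, like binop_precedences.
fn prec(op: char) -> i32 {
    match op {
        '+' | '-' => 10,
        '*' | '/' => 20,
        _ => 0,
    }
}

// Precedence-climbing evaluation of e.g. "1+2*3" over single-digit
// operands and single-char operators.
fn eval(input: &[char], pos: &mut usize, min_prec: i32) -> i64 {
    let mut lhs = (input[*pos] as u8 - b'0') as i64;
    *pos += 1;
    while *pos < input.len() {
        let op = input[*pos];
        let p = prec(op);
        // Only bind operators at least as tight as the current level.
        if p < min_prec {
            break;
        }
        *pos += 1;
        // Recurse with a strictly higher minimum: left associativity.
        let rhs = eval(input, pos, p + 1);
        lhs = match op {
            '+' => lhs + rhs,
            '-' => lhs - rhs,
            '*' => lhs * rhs,
            '/' => lhs / rhs,
            _ => unreachable!(),
        };
    }
    lhs
}

fn main() {
    let chars: Vec<char> = "1+2*3".chars().collect();
    assert_eq!(eval(&chars, &mut 0, 0), 7); // * binds tighter than +
    let chars: Vec<char> = "8-3-2".chars().collect();
    assert_eq!(eval(&chars, &mut 0, 0), 3); // left-associative
}
```

`BinOp::min_precedence()` returning `i32::min_value()` plays the role of the initial `min_prec` here: at the top level, every operator binds.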

@@ -0,0 +1,55 @@
use std::rc::Rc;
use crate::ast::*;
//TODO maybe these functions should take closures that return a KeepRecursing | StopHere type,
//or a tuple of (T, <that type>)
pub trait ASTVisitor: Sized {
type BlockHandler: BlockVisitor = ();
fn ast(&mut self, _ast: &AST) {}
fn block(&mut self) -> Self::BlockHandler { Self::BlockHandler::new() }
fn block_finished(&mut self, _handler: Self::BlockHandler) {}
fn statement(&mut self, _statement: &Statement) {}
fn declaration(&mut self, _declaration: &Declaration) {}
fn signature(&mut self, _signature: &Signature) {}
fn type_declaration(&mut self, _name: &TypeSingletonName, _body: &TypeBody, _mutable: bool) {}
fn type_alias(&mut self, _alias: &Rc<String>, _original: &Rc<String>) {}
fn binding(&mut self, _name: &Rc<String>, _constant: bool, _type_anno: Option<&TypeIdentifier>, _expr: &Expression) {}
fn implemention(&mut self, _type_name: &TypeIdentifier, _interface_name: Option<&TypeSingletonName>, _block: &Vec<Declaration>) {}
fn interface(&mut self, _name: &Rc<String>, _signatures: &Vec<Signature>) {}
fn expression(&mut self, _expression: &Expression) {}
fn expression_kind(&mut self, _kind: &ExpressionKind) {}
fn type_annotation(&mut self, _type_anno: Option<&TypeIdentifier>) {}
fn named_struct(&mut self, _name: &QualifiedName, _fields: &Vec<(Rc<String>, Expression)>) {}
fn call(&mut self, _f: &Expression, _arguments: &Vec<InvocationArgument>) {}
fn index(&mut self, _indexee: &Expression, _indexers: &Vec<Expression>) {}
fn if_expression(&mut self, _discrim: Option<&Expression>, _body: &IfExpressionBody) {}
fn condition_arm(&mut self, _arm: &ConditionArm) {}
fn while_expression(&mut self, _condition: Option<&Expression>, _body: &Block) {}
fn for_expression(&mut self, _enumerators: &Vec<Enumerator>, _body: &ForBody) {}
fn lambda(&mut self, _params: &Vec<FormalParam>, _type_anno: Option<&TypeIdentifier>, _body: &Block) {}
fn invocation_argument(&mut self, _arg: &InvocationArgument) {}
fn formal_param(&mut self, _param: &FormalParam) {}
fn import(&mut self, _import: &ImportSpecifier) {}
fn module(&mut self, _module: &ModuleSpecifier) {}
fn qualified_name(&mut self, _name: &QualifiedName) {}
fn nat_literal(&mut self, _n: u64) {}
fn float_literal(&mut self, _f: f64) {}
fn string_literal(&mut self, _s: &Rc<String>) {}
fn bool_literal(&mut self, _b: bool) {}
fn binexp(&mut self, _op: &BinOp, _lhs: &Expression, _rhs: &Expression) {}
fn prefix_exp(&mut self, _op: &PrefixOp, _arg: &Expression) {}
fn pattern(&mut self, _pat: &Pattern) {}
}
pub trait BlockVisitor {
fn new() -> Self;
fn pre_block(&mut self) {}
fn post_block(&mut self) {}
}
impl BlockVisitor for () {
fn new() -> () { () }
}

@@ -0,0 +1,41 @@
#![cfg(test)]
use crate::ast::visitor::ASTVisitor;
use crate::ast::walker;
use crate::util::quick_ast;
struct Tester {
count: u64,
float_count: u64
}
impl ASTVisitor for Tester {
fn nat_literal(&mut self, _n: u64) {
self.count += 1;
}
fn float_literal(&mut self, _f: f64) {
self.float_count += 1;
}
}
#[test]
fn foo() {
let mut tester = Tester { count: 0, float_count: 0 };
let (ast, _) = quick_ast(r#"
import gragh
let a = 20 + 84
let b = 28 + 1 + 2 + 2.0
fn heh() {
let m = 9
}
"#);
walker::walk_ast(&mut tester, &ast);
assert_eq!(tester.count, 6);
assert_eq!(tester.float_count, 1);
}

@@ -0,0 +1,270 @@
#![allow(dead_code)]
use std::rc::Rc;
use crate::ast::*;
use crate::ast::visitor::{ASTVisitor, BlockVisitor};
use crate::util::deref_optional_box;
pub fn walk_ast<V: ASTVisitor>(v: &mut V, ast: &AST) {
v.ast(ast);
walk_block(v, &ast.statements);
}
fn walk_block<V: ASTVisitor>(v: &mut V, block: &Vec<Statement>) {
let mut block_handler = v.block();
block_handler.pre_block();
for s in block {
v.statement(s);
statement(v, s);
}
block_handler.post_block();
v.block_finished(block_handler);
}
fn statement<V: ASTVisitor>(v: &mut V, statement: &Statement) {
use StatementKind::*;
match statement.kind {
Expression(ref expr) => {
v.expression(expr);
expression(v, expr);
},
Declaration(ref decl) => {
v.declaration(decl);
declaration(v, decl);
},
Import(ref import_spec) => v.import(import_spec),
Module(ref module_spec) => {
v.module(module_spec);
walk_block(v, &module_spec.contents);
}
}
}
fn declaration<V: ASTVisitor>(v: &mut V, decl: &Declaration) {
use Declaration::*;
match decl {
FuncSig(sig) => {
v.signature(&sig);
signature(v, &sig);
},
FuncDecl(sig, block) => {
v.signature(&sig);
walk_block(v, block);
},
TypeDecl { name, body, mutable } => v.type_declaration(name, body, *mutable),
TypeAlias { alias, original} => v.type_alias(alias, original),
Binding { name, constant, type_anno, expr } => {
v.binding(name, *constant, type_anno.as_ref(), expr);
v.type_annotation(type_anno.as_ref());
v.expression(&expr);
expression(v, &expr);
},
Impl { type_name, interface_name, block } => {
v.implemention(type_name, interface_name.as_ref(), block);
}
Interface { name, signatures } => v.interface(name, signatures),
}
}
fn signature<V: ASTVisitor>(v: &mut V, signature: &Signature) {
for p in signature.params.iter() {
v.formal_param(p);
}
v.type_annotation(signature.type_anno.as_ref());
for p in signature.params.iter() {
formal_param(v, p);
}
}
fn expression<V: ASTVisitor>(v: &mut V, expression: &Expression) {
v.expression_kind(&expression.kind);
v.type_annotation(expression.type_anno.as_ref());
expression_kind(v, &expression.kind);
}
fn call<V: ASTVisitor>(v: &mut V, f: &Expression, args: &Vec<InvocationArgument>) {
v.expression(f);
expression(v, f);
for arg in args.iter() {
v.invocation_argument(arg);
invocation_argument(v, arg);
}
}
fn invocation_argument<V: ASTVisitor>(v: &mut V, arg: &InvocationArgument) {
use InvocationArgument::*;
match arg {
Positional(expr) => {
v.expression(expr);
expression(v, expr);
},
Keyword { expr, .. } => {
v.expression(expr);
expression(v, expr);
},
Ignored => (),
}
}
fn index<V: ASTVisitor>(v: &mut V, indexee: &Expression, indexers: &Vec<Expression>) {
v.expression(indexee);
for i in indexers.iter() {
v.expression(i);
}
}
fn named_struct<V: ASTVisitor>(v: &mut V, n: &QualifiedName, fields: &Vec<(Rc<String>, Expression)>) {
v.qualified_name(n);
for (_, expr) in fields.iter() {
v.expression(expr);
}
}
fn lambda<V: ASTVisitor>(v: &mut V, params: &Vec<FormalParam>, type_anno: Option<&TypeIdentifier>, body: &Block) {
for param in params {
v.formal_param(param);
formal_param(v, param);
}
v.type_annotation(type_anno);
walk_block(v, body);
}
fn formal_param<V: ASTVisitor>(v: &mut V, param: &FormalParam) {
param.default.as_ref().map(|p| {
v.expression(p);
expression(v, p);
});
v.type_annotation(param.anno.as_ref());
}
fn expression_kind<V: ASTVisitor>(v: &mut V, expression_kind: &ExpressionKind) {
use ExpressionKind::*;
match expression_kind {
NatLiteral(n) => v.nat_literal(*n),
FloatLiteral(f) => v.float_literal(*f),
StringLiteral(s) => v.string_literal(s),
BoolLiteral(b) => v.bool_literal(*b),
BinExp(op, lhs, rhs) => {
v.binexp(op, lhs, rhs);
expression(v, lhs);
expression(v, rhs);
},
PrefixExp(op, arg) => {
v.prefix_exp(op, arg);
expression(v, arg);
}
TupleLiteral(exprs) => {
for expr in exprs {
v.expression(expr);
expression(v, expr);
}
},
Value(name) => v.qualified_name(name),
NamedStruct { name, fields } => {
v.named_struct(name, fields);
named_struct(v, name, fields);
}
Call { f, arguments } => {
v.call(f, arguments);
call(v, f, arguments);
},
Index { indexee, indexers } => {
v.index(indexee, indexers);
index(v, indexee, indexers);
},
IfExpression { discriminator, body } => {
v.if_expression(deref_optional_box(discriminator), body);
discriminator.as_ref().map(|d| expression(v, d));
if_expression_body(v, body);
},
WhileExpression { condition, body } => v.while_expression(deref_optional_box(condition), body),
ForExpression { enumerators, body } => v.for_expression(enumerators, body),
Lambda { params , type_anno, body } => {
v.lambda(params, type_anno.as_ref(), body);
lambda(v, params, type_anno.as_ref(), body);
},
ListLiteral(exprs) => {
for expr in exprs {
v.expression(expr);
expression(v, expr);
}
},
}
}
fn if_expression_body<V: ASTVisitor>(v: &mut V, body: &IfExpressionBody) {
use IfExpressionBody::*;
match body {
SimpleConditional { then_case, else_case } => {
walk_block(v, then_case);
else_case.as_ref().map(|block| walk_block(v, block));
},
SimplePatternMatch { pattern, then_case, else_case } => {
v.pattern(pattern);
walk_pattern(v, pattern);
walk_block(v, then_case);
else_case.as_ref().map(|block| walk_block(v, block));
},
CondList(arms) => {
for arm in arms {
v.condition_arm(arm);
condition_arm(v, arm);
}
}
}
}
fn condition_arm<V: ASTVisitor>(v: &mut V, arm: &ConditionArm) {
use Condition::*;
v.condition_arm(arm);
match arm.condition {
Pattern(ref pat) => {
v.pattern(pat);
walk_pattern(v, pat);
},
TruncatedOp(ref _binop, ref expr) => {
v.expression(expr);
expression(v, expr);
},
Expression(ref expr) => {
v.expression(expr);
expression(v, expr);
},
_ => ()
}
arm.guard.as_ref().map(|guard| {
v.expression(guard);
expression(v, guard);
});
walk_block(v, &arm.body);
}
fn walk_pattern<V: ASTVisitor>(v: &mut V, pat: &Pattern) {
use Pattern::*;
match pat {
TuplePattern(patterns) => {
for pat in patterns {
v.pattern(pat);
walk_pattern(v, pat);
}
},
TupleStruct(qualified_name, patterns) => {
v.qualified_name(qualified_name);
for pat in patterns {
v.pattern(pat);
walk_pattern(v, pat);
}
},
Record(qualified_name, name_and_patterns) => {
v.qualified_name(qualified_name);
for (_, pat) in name_and_patterns {
v.pattern(pat);
walk_pattern(v, pat);
}
},
VarOrName(qualified_name) => {
v.qualified_name(qualified_name);
},
_ => ()
}
}

@@ -0,0 +1,102 @@
use std::str::FromStr;
use crate::typechecking::{TypeConst, Type};
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Builtin {
Add,
Increment,
Subtract,
Negate,
Multiply,
Divide,
Quotient,
Modulo,
Exponentiation,
BitwiseAnd,
BitwiseOr,
BooleanAnd,
BooleanOr,
BooleanNot,
Equality,
LessThan,
LessThanOrEqual,
GreaterThan,
GreaterThanOrEqual,
Comparison,
FieldAccess,
IOPrint,
IOPrintLn,
IOGetLine,
Assignment,
Concatenate,
}
impl Builtin {
pub fn get_type(&self) -> Type {
use Builtin::*;
match self {
Add => ty!(Nat -> Nat -> Nat),
Subtract => ty!(Nat -> Nat -> Nat),
Multiply => ty!(Nat -> Nat -> Nat),
Divide => ty!(Nat -> Nat -> Float),
Quotient => ty!(Nat -> Nat -> Nat),
Modulo => ty!(Nat -> Nat -> Nat),
Exponentiation => ty!(Nat -> Nat -> Nat),
BitwiseAnd => ty!(Nat -> Nat -> Nat),
BitwiseOr => ty!(Nat -> Nat -> Nat),
BooleanAnd => ty!(Bool -> Bool -> Bool),
BooleanOr => ty!(Bool -> Bool -> Bool),
BooleanNot => ty!(Bool -> Bool),
Equality => ty!(Nat -> Nat -> Bool),
LessThan => ty!(Nat -> Nat -> Bool),
LessThanOrEqual => ty!(Nat -> Nat -> Bool),
GreaterThan => ty!(Nat -> Nat -> Bool),
GreaterThanOrEqual => ty!(Nat -> Nat -> Bool),
Comparison => ty!(Nat -> Nat -> Ordering),
FieldAccess => ty!(Unit),
IOPrint => ty!(Unit),
IOPrintLn => ty!(Unit),
IOGetLine => ty!(StringT),
Assignment => ty!(Unit),
Concatenate => ty!(StringT -> StringT -> StringT),
Increment => ty!(Nat -> Int),
Negate => ty!(Nat -> Int)
}
}
}
impl FromStr for Builtin {
type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> {
use Builtin::*;
Ok(match s {
"+" => Add,
"-" => Subtract,
"*" => Multiply,
"/" => Divide,
"quot" => Quotient,
"%" => Modulo,
"++" => Concatenate,
"^" => Exponentiation,
"&" => BitwiseAnd,
"&&" => BooleanAnd,
"|" => BitwiseOr,
"||" => BooleanOr,
"!" => BooleanNot,
">" => GreaterThan,
">=" => GreaterThanOrEqual,
"<" => LessThan,
"<=" => LessThanOrEqual,
"==" => Equality,
"=" => Assignment,
"<=>" => Comparison,
"." => FieldAccess,
"print" => IOPrint,
"println" => IOPrintLn,
"getline" => IOGetLine,
_ => return Err(())
})
}
}
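Implementing `FromStr` on the `Builtin` enum is what lets callers write `sigil.parse::<Builtin>()` (as `BinOp::from_sigil` does via `Builtin::from_str(sigil).ok()`). The pattern in miniature, with a tiny stand-in enum so the snippet is self-contained:

```rust
use std::str::FromStr;

#[derive(Debug, PartialEq)]
enum Op {
    Add,
    Subtract,
}

impl FromStr for Op {
    type Err = ();
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(match s {
            "+" => Op::Add,
            "-" => Op::Subtract,
            // Unknown sigil: bail out of the Ok(...) wrapper entirely.
            _ => return Err(()),
        })
    }
}

fn main() {
    // `FromStr` is the trait behind `str::parse`:
    assert_eq!("+".parse::<Op>(), Ok(Op::Add));
    assert_eq!("?".parse::<Op>(), Err(()));
}
```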

@@ -0,0 +1,10 @@
use crate::ast::*;
impl AST {
pub fn compact_debug(&self) -> String {
format!("{:?}", self)
}
pub fn expanded_debug(&self) -> String {
format!("{:#?}", self)
}
}

@@ -0,0 +1,455 @@
use std::rc::Rc;
use std::fmt::Write;
use std::io;
use itertools::Itertools;
use crate::schala::SymbolTableHandle;
use crate::util::ScopeStack;
use crate::reduced_ast::{BoundVars, ReducedAST, Stmt, Expr, Lit, Func, Alternative, Subpattern};
use crate::symbol_table::{SymbolSpec, Symbol, SymbolTable, FullyQualifiedSymbolName};
use crate::builtin::Builtin;
mod test;
pub struct State<'a> {
values: ScopeStack<'a, Rc<String>, ValueEntry>,
}
impl<'a> State<'a> {
pub fn new() -> State<'a> {
let values = ScopeStack::new(Some(format!("global")));
State { values }
}
pub fn debug_print(&self) -> String {
format!("Values: {:?}", self.values)
}
fn new_frame(&'a self, items: &'a Vec<Node>, bound_vars: &BoundVars) -> State<'a> {
let mut inner_state = State {
values: self.values.new_scope(None),
};
for (bound_var, val) in bound_vars.iter().zip(items.iter()) {
if let Some(bv) = bound_var.as_ref() {
inner_state.values.insert(bv.clone(), ValueEntry::Binding { constant: true, val: val.clone() });
}
}
inner_state
}
}
#[derive(Debug, Clone)]
enum Node {
Expr(Expr),
PrimObject {
name: Rc<String>,
tag: usize,
items: Vec<Node>,
},
PrimTuple {
items: Vec<Node>
}
}
fn paren_wrapped_vec(terms: impl Iterator<Item=String>) -> String {
let mut buf = String::new();
write!(buf, "(").unwrap();
for term in terms.map(|e| Some(e)).intersperse(None) {
match term {
Some(e) => write!(buf, "{}", e).unwrap(),
None => write!(buf, ", ").unwrap(),
};
}
write!(buf, ")").unwrap();
buf
}
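The `intersperse`-based writer above produces the same output as collecting the terms and comma-joining them inside parens; a std-only equivalent (stand-in name, shown for comparison rather than as a replacement):

```rust
// Same output as paren_wrapped_vec, using Vec::join instead of
// itertools::intersperse over Option<String> markers.
fn paren_wrapped(terms: impl Iterator<Item = String>) -> String {
    format!("({})", terms.collect::<Vec<_>>().join(", "))
}

fn main() {
    let items = vec!["1".to_string(), "2".to_string()];
    assert_eq!(paren_wrapped(items.into_iter()), "(1, 2)");
    // The empty case falls out naturally:
    assert_eq!(paren_wrapped(std::iter::empty()), "()");
}
```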
impl Node {
fn to_repl(&self) -> String {
match self {
Node::Expr(e) => e.to_repl(),
Node::PrimObject { name, items, .. } if items.is_empty() => format!("{}", name),
Node::PrimObject { name, items, .. } => format!("{}{}", name, paren_wrapped_vec(items.iter().map(|x| x.to_repl()))),
Node::PrimTuple { items } => format!("{}", paren_wrapped_vec(items.iter().map(|x| x.to_repl()))),
}
}
fn is_true(&self) -> bool {
match self {
Node::Expr(Expr::Lit(crate::reduced_ast::Lit::Bool(true))) => true,
_ => false,
}
}
}
#[derive(Debug)]
enum ValueEntry {
Binding {
constant: bool,
val: /*FullyEvaluatedExpr*/ Node, //TODO make this use a subtype to represent fully evaluatedness
}
}
type EvalResult<T> = Result<T, String>;
impl Expr {
fn to_node(self) -> Node {
Node::Expr(self)
}
fn to_repl(&self) -> String {
use self::Lit::*;
use self::Func::*;
match self {
Expr::Lit(ref l) => match l {
Nat(n) => format!("{}", n),
Int(i) => format!("{}", i),
Float(f) => format!("{}", f),
Bool(b) => format!("{}", b),
StringLit(s) => format!("\"{}\"", s),
},
Expr::Func(f) => match f {
BuiltIn(builtin) => format!("<built-in function '{:?}'>", builtin),
UserDefined { name: None, .. } => format!("<function>"),
UserDefined { name: Some(name), .. } => format!("<function '{}'>", name),
},
Expr::Constructor { type_name, arity, .. } => {
format!("<constructor for `{}` arity {}>", type_name, arity)
},
Expr::Tuple(exprs) => paren_wrapped_vec(exprs.iter().map(|x| x.to_repl())),
_ => format!("{:?}", self),
}
}
fn replace_conditional_target_sigil(self, replacement: &Expr) -> Expr {
use self::Expr::*;
match self {
ConditionalTargetSigilValue => replacement.clone(),
Unit | Lit(_) | Func(_) | Sym(_) | Constructor { .. } |
CaseMatch { .. } | UnimplementedSigilValue | ReductionError(_) => self,
Tuple(exprs) => Tuple(exprs.into_iter().map(|e| e.replace_conditional_target_sigil(replacement)).collect()),
Call { f, args } => {
let new_args = args.into_iter().map(|e| e.replace_conditional_target_sigil(replacement)).collect();
Call { f, args: new_args }
},
Conditional { .. } => panic!("Dunno if I need this, but if so implement"),
Assign { .. } => panic!("I'm pretty sure I don't need this"),
}
}
}
impl<'a> State<'a> {
pub fn evaluate(&mut self, ast: ReducedAST, repl: bool) -> Vec<Result<String, String>> {
let mut acc = vec![];
// handle prebindings
for statement in ast.0.iter() {
self.prebinding(statement);
}
for statement in ast.0 {
match self.statement(statement) {
Ok(Some(ref output)) if repl => {
acc.push(Ok(output.to_repl()))
},
Ok(_) => (),
Err(error) => {
acc.push(Err(format!("Runtime error: {}", error)));
return acc;
},
}
}
acc
}
fn prebinding(&mut self, stmt: &Stmt) {
match stmt {
Stmt::PreBinding { name, func } => {
let v_entry = ValueEntry::Binding { constant: true, val: Node::Expr(Expr::Func(func.clone())) };
self.values.insert(name.clone(), v_entry);
},
Stmt::Expr(_expr) => {
//TODO have this support things like nested function defs
},
_ => ()
}
}
fn statement(&mut self, stmt: Stmt) -> EvalResult<Option<Node>> {
match stmt {
Stmt::Binding { name, constant, expr } => {
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant, val });
Ok(None)
},
Stmt::Expr(expr) => Ok(Some(self.expression(expr.to_node())?)),
Stmt::PreBinding {..} | Stmt::Noop => Ok(None),
}
}
fn block(&mut self, stmts: Vec<Stmt>) -> EvalResult<Node> {
let mut ret = None;
for stmt in stmts {
ret = self.statement(stmt)?;
}
Ok(ret.unwrap_or(Node::Expr(Expr::Unit)))
}
fn expression(&mut self, node: Node) -> EvalResult<Node> {
use self::Expr::*;
match node {
t @ Node::PrimTuple { .. } => Ok(t),
obj @ Node::PrimObject { .. } => Ok(obj),
Node::Expr(expr) => match expr {
literal @ Lit(_) => Ok(Node::Expr(literal)),
Call { box f, args } => self.call_expression(f, args),
Sym(name) => Ok(match self.values.lookup(&name) {
Some(ValueEntry::Binding { val, .. }) => val.clone(),
None => return Err(format!("Could not look up symbol {}", name))
}),
Constructor { arity, ref name, tag, .. } if arity == 0 => Ok(Node::PrimObject { name: name.clone(), tag, items: vec![] }),
constructor @ Constructor { .. } => Ok(Node::Expr(constructor)),
func @ Func(_) => Ok(Node::Expr(func)),
Tuple(exprs) => {
let nodes = exprs.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
Ok(Node::PrimTuple { items: nodes })
},
Conditional { box cond, then_clause, else_clause } => self.conditional(cond, then_clause, else_clause),
Assign { box val, box expr } => self.assign_expression(val, expr),
Unit => Ok(Node::Expr(Unit)),
CaseMatch { box cond, alternatives } => self.case_match_expression(cond, alternatives),
ConditionalTargetSigilValue => Ok(Node::Expr(ConditionalTargetSigilValue)),
UnimplementedSigilValue => Err(format!("Sigil value eval not implemented")),
ReductionError(err) => Err(format!("Reduction error: {}", err)),
}
}
}
fn call_expression(&mut self, f: Expr, args: Vec<Expr>) -> EvalResult<Node> {
use self::Expr::*;
match self.expression(Node::Expr(f))? {
Node::Expr(Constructor { type_name, name, tag, arity }) => self.apply_data_constructor(type_name, name, tag, arity, args),
Node::Expr(Func(f)) => self.apply_function(f, args),
other => return Err(format!("Tried to call {:?} which is not a function or data constructor", other)),
}
}
fn apply_data_constructor(&mut self, _type_name: Rc<String>, name: Rc<String>, tag: usize, arity: usize, args: Vec<Expr>) -> EvalResult<Node> {
if arity != args.len() {
return Err(format!("Data constructor {} requires {} arg(s)", name, arity));
}
let evaled_args = args.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
//let evaled_args = vec![];
Ok(Node::PrimObject {
name: name.clone(),
items: evaled_args,
tag
})
}
fn apply_function(&mut self, f: Func, args: Vec<Expr>) -> EvalResult<Node> {
match f {
Func::BuiltIn(builtin) => Ok(self.apply_builtin(builtin, args)?),
Func::UserDefined { params, body, name } => {
if params.len() != args.len() {
return Err(format!("calling a {}-argument function with {} args", params.len(), args.len()))
}
let mut func_state = State {
values: self.values.new_scope(name.map(|n| format!("{}", n))),
};
for (param, val) in params.into_iter().zip(args.into_iter()) {
let val = func_state.expression(Node::Expr(val))?;
func_state.values.insert(param, ValueEntry::Binding { constant: true, val });
}
// TODO figure out function return semantics
func_state.block(body)
}
}
}
fn apply_builtin(&mut self, builtin: Builtin, args: Vec<Expr>) -> EvalResult<Node> {
use self::Expr::*;
use self::Lit::*;
use Builtin::*;
let evaled_args: Result<Vec<Node>, String> = args.into_iter().map(|arg| self.expression(arg.to_node()))
.collect();
let evaled_args = evaled_args?;
Ok(match (builtin, evaled_args.as_slice()) {
(FieldAccess, &[Node::PrimObject { .. }]) => {
//TODO implement field access
unimplemented!()
},
(binop, &[Node::Expr(ref lhs), Node::Expr(ref rhs)]) => match (binop, lhs, rhs) {
/* binops */
(Add, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l + r)),
(Concatenate, Lit(StringLit(ref s1)), Lit(StringLit(ref s2))) => Lit(StringLit(Rc::new(format!("{}{}", s1, s2)))),
(Subtract, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l - r)),
(Multiply, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l * r)),
(Divide, Lit(Nat(l)), Lit(Nat(r))) => Lit(Float((*l as f64)/ (*r as f64))),
(Quotient, Lit(Nat(l)), Lit(Nat(r))) => if *r == 0 {
return Err("divide by zero".to_string());
} else {
Lit(Nat(l / r))
},
(Modulo, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l % r)),
// NB: `^` is bitwise XOR in Rust, not exponentiation
(Exponentiation, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l.checked_pow(*r as u32).ok_or_else(|| "exponentiation overflow".to_string())?)),
(BitwiseAnd, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l & r)),
(BitwiseOr, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l | r)),
/* comparisons */
(Equality, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l == r)),
(Equality, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l == r)),
(Equality, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l == r)),
(Equality, Lit(Bool(l)), Lit(Bool(r))) => Lit(Bool(l == r)),
(Equality, Lit(StringLit(ref l)), Lit(StringLit(ref r))) => Lit(Bool(l == r)),
(LessThan, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l < r)),
(LessThan, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l < r)),
(LessThan, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l < r)),
(LessThanOrEqual, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l <= r)),
(LessThanOrEqual, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l <= r)),
(LessThanOrEqual, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l <= r)),
(GreaterThan, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l > r)),
(GreaterThan, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l > r)),
(GreaterThan, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l > r)),
(GreaterThanOrEqual, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l >= r)),
(GreaterThanOrEqual, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l >= r)),
(GreaterThanOrEqual, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l >= r)),
_ => return Err("No valid binop".to_string())
}.to_node(),
(prefix, &[Node::Expr(ref arg)]) => match (prefix, arg) {
(BooleanNot, Lit(Bool(true))) => Lit(Bool(false)),
(BooleanNot, Lit(Bool(false))) => Lit(Bool(true)),
(Negate, Lit(Nat(n))) => Lit(Int(-1*(*n as i64))),
(Negate, Lit(Int(n))) => Lit(Int(-1*(*n as i64))),
(Increment, Lit(Int(n))) => Lit(Int(*n)),
(Increment, Lit(Nat(n))) => Lit(Nat(*n)),
_ => return Err("No valid prefix op".to_string())
}.to_node(),
/* builtin functions */
(IOPrint, &[ref anything]) => {
print!("{}", anything.to_repl());
Expr::Unit.to_node()
},
(IOPrintLn, &[ref anything]) => {
println!("{}", anything.to_repl());
Expr::Unit.to_node()
},
(IOGetLine, &[]) => {
let mut buf = String::new();
io::stdin().read_line(&mut buf).expect("Error reading line in 'getline'");
Lit(StringLit(Rc::new(buf.trim().to_string()))).to_node()
},
(x, args) => return Err(format!("bad or unimplemented builtin {:?} | {:?}", x, args)),
})
}
fn conditional(&mut self, cond: Expr, then_clause: Vec<Stmt>, else_clause: Vec<Stmt>) -> EvalResult<Node> {
let cond = self.expression(Node::Expr(cond))?;
Ok(match cond {
Node::Expr(Expr::Lit(Lit::Bool(true))) => self.block(then_clause)?,
Node::Expr(Expr::Lit(Lit::Bool(false))) => self.block(else_clause)?,
_ => return Err("Conditional with non-boolean condition".to_string())
})
}
fn assign_expression(&mut self, val: Expr, expr: Expr) -> EvalResult<Node> {
let name = match val {
Expr::Sym(name) => name,
_ => return Err("Trying to assign to a non-assignable expression".to_string()),
};
let constant = match self.values.lookup(&name) {
None => return Err(format!("Binding {} is undefined", name)),
Some(ValueEntry::Binding { constant, .. }) => constant.clone(),
};
if constant {
return Err(format!("trying to update {}, a non-mutable binding", name));
}
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant: false, val });
Ok(Node::Expr(Expr::Unit))
}
fn guard_passes(&mut self, guard: &Option<Expr>, cond: &Node) -> EvalResult<bool> {
if let Some(ref guard_expr) = guard {
let guard_expr = match cond {
Node::Expr(ref e) => guard_expr.clone().replace_conditional_target_sigil(e),
_ => guard_expr.clone()
};
Ok(self.expression(guard_expr.to_node())?.is_true())
} else {
Ok(true)
}
}
fn case_match_expression(&mut self, cond: Expr, alternatives: Vec<Alternative>) -> EvalResult<Node> {
//TODO need to handle recursive subpatterns
let all_subpatterns_pass = |state: &mut State, subpatterns: &[Option<Subpattern>], items: &[Node]| -> EvalResult<bool> {
if subpatterns.is_empty() {
return Ok(true)
}
if items.len() != subpatterns.len() {
return Err(format!("Subpattern count mismatch: {} items vs {} subpatterns", items.len(), subpatterns.len()));
}
}
for (maybe_subp, cond) in subpatterns.iter().zip(items.iter()) {
if let Some(subp) = maybe_subp {
if !state.guard_passes(&subp.guard, &cond)? {
return Ok(false)
}
}
}
Ok(true)
};
let cond = self.expression(Node::Expr(cond))?;
for alt in alternatives {
// Whatever the condition's type, skip this alternative if its guard evaluates to false
if !self.guard_passes(&alt.matchable.guard, &cond)? {
continue;
}
match cond {
Node::PrimObject { ref tag, ref items, .. } => {
if alt.matchable.tag.map(|t| t == *tag).unwrap_or(true) {
let mut inner_state = self.new_frame(items, &alt.matchable.bound_vars);
if all_subpatterns_pass(&mut inner_state, &alt.matchable.subpatterns, items)? {
return inner_state.block(alt.item);
} else {
continue;
}
}
},
Node::PrimTuple { ref items } => {
let mut inner_state = self.new_frame(items, &alt.matchable.bound_vars);
if all_subpatterns_pass(&mut inner_state, &alt.matchable.subpatterns, items)? {
return inner_state.block(alt.item);
} else {
continue;
}
},
Node::Expr(ref _e) => {
if alt.matchable.tag.is_none() {
return self.block(alt.item)
}
}
}
}
Err(format!("{:?} failed pattern match", cond))
}
}
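The `apply_builtin` dispatch above pattern-matches a builtin operator against the evaluated literal arguments. A minimal, self-contained sketch of that pattern (the `MiniLit`/`mini_binop` names are hypothetical, not the project's actual types) — it also demonstrates why the exponentiation arm must use `checked_pow`: in Rust, `^` is bitwise XOR, not exponentiation.

```rust
// Toy version of the literal-dispatch pattern in `apply_builtin`.
// `MiniLit` and `mini_binop` are illustrative names only.
#[derive(Debug, PartialEq)]
enum MiniLit {
    Nat(u64),
    Bool(bool),
}

fn mini_binop(op: &str, l: u64, r: u64) -> Result<MiniLit, String> {
    use MiniLit::*;
    Ok(match op {
        "+" => Nat(l + r),
        // `l ^ r` would XOR the bits; `checked_pow` is true exponentiation
        "**" => Nat(l.checked_pow(r as u32).ok_or("exponentiation overflow")?),
        "/" if r == 0 => return Err("divide by zero".to_string()),
        "/" => Nat(l / r),
        "<" => Bool(l < r),
        other => return Err(format!("no valid binop {}", other)),
    })
}

fn main() {
    assert_eq!(mini_binop("**", 2, 10), Ok(MiniLit::Nat(1024)));
    assert_eq!(mini_binop("+", 40, 2), Ok(MiniLit::Nat(42)));
    assert!(mini_binop("/", 1, 0).is_err());
    println!("ok");
}
```

Centralizing the "no valid binop" fallback in one final arm, as the real evaluator does, keeps unknown operator/operand combinations from panicking at runtime.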


@@ -0,0 +1,269 @@
#![cfg(test)]
use std::cell::RefCell;
use std::rc::Rc;
use crate::symbol_table::SymbolTable;
use crate::scope_resolution::ScopeResolver;
use crate::reduced_ast::reduce;
use crate::eval::State;
fn evaluate_all_outputs(input: &str) -> Vec<Result<String, String>> {
let (mut ast, source_map) = crate::util::quick_ast(input);
let source_map = Rc::new(RefCell::new(source_map));
let symbol_table = Rc::new(RefCell::new(SymbolTable::new(source_map)));
symbol_table.borrow_mut().add_top_level_symbols(&ast).unwrap();
{
let mut scope_resolver = ScopeResolver::new(symbol_table.clone());
let _ = scope_resolver.resolve(&mut ast);
}
let reduced = reduce(&ast, &symbol_table.borrow());
let mut state = State::new();
state.evaluate(reduced, true)
}
macro_rules! test_in_fresh_env {
($string:expr, $correct:expr) => {
{
let all_output = evaluate_all_outputs($string);
let ref output = all_output.last().unwrap();
assert_eq!(**output, Ok($correct.to_string()));
}
}
}
#[test]
fn test_basic_eval() {
test_in_fresh_env!("1 + 2", "3");
test_in_fresh_env!("let mut a = 1; a = 2", "Unit");
/*
test_in_fresh_env!("let mut a = 1; a = 2; a", "2");
test_in_fresh_env!(r#"("a", 1 + 2)"#, r#"("a", 3)"#);
*/
}
#[test]
fn op_eval() {
test_in_fresh_env!("- 13", "-13");
test_in_fresh_env!("10 - 2", "8");
}
#[test]
fn function_eval() {
test_in_fresh_env!("fn oi(x) { x + 1 }; oi(4)", "5");
test_in_fresh_env!("fn oi(x) { x + 1 }; oi(1+2)", "4");
}
#[test]
fn scopes() {
let scope_ok = r#"
let a = 20
fn haha() {
let a = 10
a
}
haha()
"#;
test_in_fresh_env!(scope_ok, "10");
let scope_ok = r#"
let a = 20
fn queque() {
let a = 10
a
}
a
"#;
test_in_fresh_env!(scope_ok, "20");
}
#[test]
fn if_is_patterns() {
let source = r#"
type Option<T> = Some(T) | None
let x = Option::Some(9); if x is Option::Some(q) then { q } else { 0 }"#;
test_in_fresh_env!(source, "9");
let source = r#"
type Option<T> = Some(T) | None
let x = Option::None; if x is Option::Some(q) then { q } else { 0 }"#;
test_in_fresh_env!(source, "0");
}
#[test]
fn full_if_matching() {
let source = r#"
type Option<T> = Some(T) | None
let a = Option::None
if a { is Option::None then 4, is Option::Some(x) then x }
"#;
test_in_fresh_env!(source, "4");
let source = r#"
type Option<T> = Some(T) | None
let a = Option::Some(99)
if a { is Option::None then 4, is Option::Some(x) then x }
"#;
test_in_fresh_env!(source, "99");
let source = r#"
let a = 10
if a { is 10 then "x", is 4 then "y" }
"#;
test_in_fresh_env!(source, "\"x\"");
let source = r#"
let a = 10
if a { is 15 then "x", is 10 then "y" }
"#;
test_in_fresh_env!(source, "\"y\"");
}
#[test]
fn string_pattern() {
let source = r#"
let a = "foo"
if a { is "foo" then "x", is _ then "y" }
"#;
test_in_fresh_env!(source, "\"x\"");
}
#[test]
fn boolean_pattern() {
let source = r#"
let a = true
if a {
is true then "x",
is false then "y"
}
"#;
test_in_fresh_env!(source, "\"x\"");
}
#[test]
fn boolean_pattern_2() {
let source = r#"
let a = false
if a { is true then "x", is false then "y" }
"#;
test_in_fresh_env!(source, "\"y\"");
}
#[test]
fn ignore_pattern() {
let source = r#"
type Option<T> = Some(T) | None
if Option::Some(10) {
is _ then "hella"
}
"#;
test_in_fresh_env!(source, "\"hella\"");
}
#[test]
fn tuple_pattern() {
let source = r#"
if (1, 2) {
is (1, x) then x,
is _ then 99
}
"#;
test_in_fresh_env!(source, 2);
}
#[test]
fn tuple_pattern_2() {
let source = r#"
if (1, 2) {
is (10, x) then x,
is (y, x) then x + y
}
"#;
test_in_fresh_env!(source, 3);
}
#[test]
fn tuple_pattern_3() {
let source = r#"
if (1, 5) {
is (10, x) then x,
is (1, x) then x
}
"#;
test_in_fresh_env!(source, 5);
}
#[test]
fn tuple_pattern_4() {
let source = r#"
if (1, 5) {
is (10, x) then x,
is (1, x) then x,
}
"#;
test_in_fresh_env!(source, 5);
}
#[test]
fn prim_obj_pattern() {
let source = r#"
type Stuff = Mulch(Nat) | Jugs(Nat, String) | Mardok
let a = Stuff::Mulch(20)
let b = Stuff::Jugs(1, "haha")
let c = Stuff::Mardok
let x = if a {
is Stuff::Mulch(20) then "x",
is _ then "ERR"
}
let y = if b {
is Stuff::Mulch(n) then "ERR",
is Stuff::Jugs(2, _) then "ERR",
is Stuff::Jugs(1, s) then s,
is _ then "ERR",
}
let z = if c {
is Stuff::Jugs(_, _) then "ERR",
is Stuff::Mardok then "NIGH",
is _ then "ERR",
}
(x, y, z)
"#;
test_in_fresh_env!(source, r#"("x", "haha", "NIGH")"#);
}
#[test]
fn basic_lambda_syntax() {
let source = r#"
let q = \(x, y) { x * y }
let x = q(5,2)
let y = \(m, n, o) { m + n + o }(1,2,3)
(x, y)
"#;
test_in_fresh_env!(source, r"(10, 6)");
}
#[test]
fn lambda_syntax_2() {
let source = r#"
fn milta() {
\(x) { x + 33 }
}
milta()(10)
"#;
test_in_fresh_env!(source, "43");
}
#[test]
fn import_all() {
let source = r#"
type Option<T> = Some(T) | None
import Option::*
let x = Some(9); if x is Some(q) then { q } else { 0 }"#;
test_in_fresh_env!(source, "9");
}
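Each test above runs a whole program and asserts only on the final evaluated output via `test_in_fresh_env!`. A toy, self-contained version of that harness pattern (the `toy_eval` pipeline here is a stand-in that only sums `+` expressions, not the real Schala pipeline):

```rust
// Stand-in pipeline: each `;`-separated statement evaluates to a string result.
fn toy_eval(input: &str) -> Vec<Result<String, String>> {
    input
        .split(';')
        .map(|stmt| {
            let nums: Result<Vec<u64>, String> = stmt
                .split('+')
                .map(|s| s.trim().parse::<u64>().map_err(|e| e.to_string()))
                .collect();
            nums.map(|ns| ns.iter().sum::<u64>().to_string())
        })
        .collect()
}

// Mirror of the harness macro: evaluate everything, assert on the last output.
macro_rules! test_in_fresh_env {
    ($string:expr, $correct:expr) => {{
        let all_output = toy_eval($string);
        let output = all_output.last().unwrap();
        assert_eq!(*output, Ok($correct.to_string()));
    }};
}

fn main() {
    test_in_fresh_env!("1 + 2", "3");
    test_in_fresh_env!("10; 4 + 4", "8");
    println!("ok");
}
```

Asserting only on the last output lets earlier statements (type declarations, bindings) set up state without each needing its own expected value.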


@@ -0,0 +1,46 @@
#![feature(associated_type_defaults)] //needed for Visitor trait
#![feature(trace_macros)]
#![feature(slice_patterns, box_patterns, box_syntax)]
//! `schala-lang` is where the Schala programming language is actually implemented.
//! It defines the `Schala` type, which contains the state for a Schala REPL, and implements
//! `ProgrammingLanguageInterface` and the chain of compiler passes for it.
extern crate itertools;
#[macro_use]
extern crate lazy_static;
#[macro_use]
extern crate maplit;
extern crate schala_repl;
#[macro_use]
extern crate schala_lang_codegen;
extern crate ena;
extern crate derivative;
extern crate colored;
extern crate radix_trie;
macro_rules! bx {
($e:expr) => { Box::new($e) }
}
#[macro_use]
mod util;
#[macro_use]
mod typechecking;
mod debugging;
mod tokenizing;
mod ast;
mod parsing;
#[macro_use]
mod symbol_table;
mod scope_resolution;
mod builtin;
mod reduced_ast;
mod eval;
mod source_map;
mod schala;
pub use schala::Schala;
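The `bx!` macro defined in this crate root is shorthand for `Box::new`, which cuts noise when building deeply nested AST values. A standalone sketch with a hypothetical two-variant `Expr` (not the crate's real AST type):

```rust
// Same definition as the crate's `bx!` shorthand for `Box::new`.
macro_rules! bx {
    ($e:expr) => { Box::new($e) }
}

// Hypothetical mini-AST for illustration only.
#[derive(Debug, PartialEq)]
enum Expr {
    Lit(u64),
    Neg(Box<Expr>),
}

fn main() {
    // Nested nodes without repeating `Box::new(...)` at every level:
    let e = Expr::Neg(bx!(Expr::Neg(bx!(Expr::Lit(3)))));
    assert_eq!(
        e,
        Expr::Neg(Box::new(Expr::Neg(Box::new(Expr::Lit(3)))))
    );
    println!("ok");
}
```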

File diff suppressed because it is too large.


@@ -0,0 +1,828 @@
#![cfg(test)]
use std::cell::RefCell;
use std::rc::Rc;
use std::str::FromStr;
use super::{Parser, ParseResult, tokenize};
use crate::ast::*;
use super::Declaration::*;
use super::Signature;
use super::TypeIdentifier::*;
use super::TypeSingletonName;
use super::ExpressionKind::*;
use super::Variant::*;
use super::ForBody::*;
fn make_parser(input: &str) -> Parser {
let source_map = crate::source_map::SourceMap::new();
let source_map_handle = Rc::new(RefCell::new(source_map));
let tokens: Vec<crate::tokenizing::Token> = tokenize(input);
let mut parser = super::Parser::new(source_map_handle);
parser.add_new_tokens(tokens);
parser
}
fn parse(input: &str) -> ParseResult<AST> {
let mut parser = make_parser(input);
parser.parse()
}
macro_rules! parse_test {
($string:expr, $correct:expr) => {
assert_eq!(parse($string).unwrap(), $correct)
};
}
macro_rules! parse_test_wrap_ast {
($string:expr, $correct:expr) => { parse_test!($string, AST { id: ItemIdStore::new_id(), statements: vec![$correct] }) }
}
macro_rules! parse_error {
($string:expr) => { assert!(parse($string).is_err()) }
}
macro_rules! qname {
( $( $component:expr),* ) => {
{
let mut components = vec![];
$(
components.push(rc!($component));
)*
QualifiedName { components, id: ItemIdStore::new_id() }
}
};
}
macro_rules! val {
($var:expr) => { Value(QualifiedName { components: vec![Rc::new($var.to_string())], id: ItemIdStore::new_id() }) };
}
macro_rules! ty {
($name:expr) => { Singleton(tys!($name)) }
}
macro_rules! tys {
($name:expr) => { TypeSingletonName { name: Rc::new($name.to_string()), params: vec![] } };
}
macro_rules! decl {
($expr_type:expr) => {
Statement { id: ItemIdStore::new_id(), kind: StatementKind::Declaration($expr_type) }
};
}
macro_rules! import {
($import_spec:expr) => {
Statement { id: ItemIdStore::new_id(), kind: StatementKind::Import($import_spec) }
}
}
macro_rules! module {
($module_spec:expr) => {
Statement { id: ItemIdStore::new_id(), kind: StatementKind::Module($module_spec) }
}
}
macro_rules! ex {
($expr_type:expr) => { Expression::new(ItemIdStore::new_id(), $expr_type) };
($expr_type:expr, $type_anno:expr) => { Expression::with_anno(ItemIdStore::new_id(), $expr_type, $type_anno) };
(s $expr_text:expr) => {
{
let mut parser = make_parser($expr_text);
parser.expression().unwrap()
}
};
}
macro_rules! inv {
($expr_type:expr) => { InvocationArgument::Positional($expr_type) }
}
macro_rules! binexp {
($op:expr, $lhs:expr, $rhs:expr) => { BinExp(BinOp::from_sigil($op), bx!(Expression::new(ItemIdStore::new_id(), $lhs).into()), bx!(Expression::new(ItemIdStore::new_id(), $rhs).into())) }
}
macro_rules! prefexp {
($op:expr, $lhs:expr) => { PrefixExp(PrefixOp::from_str($op).unwrap(), bx!(Expression::new(ItemIdStore::new_id(), $lhs).into())) }
}
macro_rules! exst {
($expr_type:expr) => { Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(Expression::new(ItemIdStore::new_id(), $expr_type).into())} };
($expr_type:expr, $type_anno:expr) => { Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(Expression::with_anno(ItemIdStore::new_id(), $expr_type, $type_anno).into())} };
($op:expr, $lhs:expr, $rhs:expr) => { Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(ex!(binexp!($op, $lhs, $rhs)))}
};
(s $statement_text:expr) => {
{
let mut parser = make_parser($statement_text);
parser.statement().unwrap()
}
}
}
#[test]
fn parsing_number_literals_and_binexps() {
parse_test_wrap_ast! { ".2", exst!(FloatLiteral(0.2)) };
parse_test_wrap_ast! { "8.1", exst!(FloatLiteral(8.1)) };
parse_test_wrap_ast! { "0b010", exst!(NatLiteral(2)) };
parse_test_wrap_ast! { "0b0_1_0_", exst!(NatLiteral(2)) }
parse_test_wrap_ast! {"0xff", exst!(NatLiteral(255)) };
parse_test_wrap_ast! {"0xf_f_", exst!(NatLiteral(255)) };
parse_test_wrap_ast! {"0xf_f_+1", exst!(binexp!("+", NatLiteral(255), NatLiteral(1))) };
parse_test! {"3; 4; 4.3",
AST {
id: ItemIdStore::new_id(),
statements: vec![exst!(NatLiteral(3)), exst!(NatLiteral(4)),
exst!(FloatLiteral(4.3))]
}
};
parse_test_wrap_ast!("1 + 2 * 3",
exst!(binexp!("+", NatLiteral(1), binexp!("*", NatLiteral(2), NatLiteral(3))))
);
parse_test_wrap_ast!("1 * 2 + 3",
exst!(binexp!("+", binexp!("*", NatLiteral(1), NatLiteral(2)), NatLiteral(3)))
);
parse_test_wrap_ast!("1 && 2", exst!(binexp!("&&", NatLiteral(1), NatLiteral(2))));
parse_test_wrap_ast!("1 + 2 * 3 + 4", exst!(
binexp!("+",
binexp!("+", NatLiteral(1), binexp!("*", NatLiteral(2), NatLiteral(3))),
NatLiteral(4))));
parse_test_wrap_ast!("(1 + 2) * 3",
exst!(binexp!("*", binexp!("+", NatLiteral(1), NatLiteral(2)), NatLiteral(3))));
parse_test_wrap_ast!(".1 + .2", exst!(binexp!("+", FloatLiteral(0.1), FloatLiteral(0.2))));
parse_test_wrap_ast!("1 / 2", exst!(binexp!("/", NatLiteral(1), NatLiteral(2))));
}
#[test]
fn parsing_tuples() {
parse_test_wrap_ast!("()", exst!(TupleLiteral(vec![])));
parse_test_wrap_ast!("(\"hella\", 34)", exst!(
TupleLiteral(
vec![ex!(s r#""hella""#).into(), ex!(s "34").into()]
)
));
parse_test_wrap_ast!("((1+2), \"slough\")", exst!(TupleLiteral(vec![
ex!(binexp!("+", NatLiteral(1), NatLiteral(2))).into(),
ex!(StringLiteral(rc!(slough))).into(),
])))
}
#[test]
fn parsing_identifiers() {
parse_test_wrap_ast!("a", exst!(val!("a")));
parse_test_wrap_ast!("some_value", exst!(val!("some_value")));
parse_test_wrap_ast!("a + b", exst!(binexp!("+", val!("a"), val!("b"))));
//parse_test!("a[b]", AST(vec![Expression(
//parse_test!("a[]", <- TODO THIS NEEDS TO FAIL
//parse_test("a()[b]()[d]")
//TODO fix this parsing stuff
/*
parse_test! { "perspicacity()[a]", AST(vec![
exst!(Index {
indexee: bx!(ex!(Call { f: bx!(ex!(val!("perspicacity"))), arguments: vec![] })),
indexers: vec![ex!(val!("a"))]
})
])
}
*/
parse_test_wrap_ast!("a[b,c]", exst!(Index { indexee: bx!(ex!(val!("a"))), indexers: vec![ex!(val!("b")), ex!(val!("c"))]} ));
parse_test_wrap_ast!("None", exst!(val!("None")));
parse_test_wrap_ast!("Pandas { a: x + y }",
exst!(NamedStruct { name: qname!(Pandas), fields: vec![(rc!(a), ex!(binexp!("+", val!("x"), val!("y"))))]})
);
parse_test_wrap_ast! { "Pandas { a: n, b: q, }",
exst!(NamedStruct { name: qname!(Pandas), fields:
vec![(rc!(a), ex!(val!("n"))), (rc!(b), ex!(val!("q")))]
}
)
};
}
#[test]
fn qualified_identifiers() {
parse_test_wrap_ast! {
"let q_q = Yolo::Swaggins",
decl!(Binding { name: rc!(q_q), constant: true, type_anno: None,
expr: Expression::new(ItemIdStore::new_id(), Value(qname!(Yolo, Swaggins))),
})
}
parse_test_wrap_ast! {
"thing::item::call()",
exst!(Call { f: bx![ex!(Value(qname!(thing, item, call)))], arguments: vec![] })
}
}
#[test]
fn reserved_words() {
parse_error!("module::item::call()");
}
#[test]
fn parsing_complicated_operators() {
parse_test_wrap_ast!("a <- b", exst!(binexp!("<-", val!("a"), val!("b"))));
parse_test_wrap_ast!("a || b", exst!(binexp!("||", val!("a"), val!("b"))));
parse_test_wrap_ast!("a<>b", exst!(binexp!("<>", val!("a"), val!("b"))));
parse_test_wrap_ast!("a.b.c.d", exst!(binexp!(".",
binexp!(".",
binexp!(".", val!("a"), val!("b")),
val!("c")),
val!("d"))));
parse_test_wrap_ast!("-3", exst!(prefexp!("-", NatLiteral(3))));
parse_test_wrap_ast!("-0.2", exst!(prefexp!("-", FloatLiteral(0.2))));
parse_test_wrap_ast!("!3", exst!(prefexp!("!", NatLiteral(3))));
parse_test_wrap_ast!("a <- -b", exst!(binexp!("<-", val!("a"), prefexp!("-", val!("b")))));
parse_test_wrap_ast!("a <--b", exst!(binexp!("<--", val!("a"), val!("b"))));
}
#[test]
fn parsing_functions() {
parse_test_wrap_ast!("fn oi()", decl!(FuncSig(Signature { name: rc!(oi), operator: false, params: vec![], type_anno: None })));
parse_test_wrap_ast!("oi()", exst!(Call { f: bx!(ex!(val!("oi"))), arguments: vec![] }));
parse_test_wrap_ast!("oi(a, 2 + 2)", exst!(Call
{ f: bx!(ex!(val!("oi"))),
arguments: vec![inv!(ex!(val!("a"))), inv!(ex!(binexp!("+", NatLiteral(2), NatLiteral(2)))).into()]
}));
parse_error!("a(b,,c)");
parse_test_wrap_ast!("fn a(b, c: Int): Int", decl!(
FuncSig(Signature { name: rc!(a), operator: false, params: vec![
FormalParam { name: rc!(b), anno: None, default: None },
FormalParam { name: rc!(c), anno: Some(ty!("Int")), default: None }
], type_anno: Some(ty!("Int")) })));
parse_test_wrap_ast!("fn a(x) { x() }", decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), anno: None, default: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(val!("x"))), arguments: vec![] })])));
parse_test_wrap_ast!("fn a(x) {\n x() }", decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), anno: None, default: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(val!("x"))), arguments: vec![] })])));
let multiline = r#"
fn a(x) {
x()
}
"#;
parse_test_wrap_ast!(multiline, decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), default: None, anno: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(val!("x"))), arguments: vec![] })])));
let multiline2 = r#"
fn a(x) {
x()
}
"#;
parse_test_wrap_ast!(multiline2, decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), default: None, anno: None }], type_anno: None },
vec![exst!(s "x()")])));
}
#[test]
fn functions_with_default_args() {
parse_test_wrap_ast! {
"fn func(x: Int, y: Int = 4) { }",
decl!(
FuncDecl(Signature { name: rc!(func), operator: false, type_anno: None, params: vec![
FormalParam { name: rc!(x), default: None, anno: Some(ty!("Int")) },
FormalParam { name: rc!(y), default: Some(ex!(s "4")), anno: Some(ty!("Int")) }
]}, vec![])
)
};
}
#[test]
fn parsing_bools() {
parse_test_wrap_ast!("false", exst!(BoolLiteral(false)));
parse_test_wrap_ast!("true", exst!(BoolLiteral(true)));
}
#[test]
fn parsing_strings() {
parse_test_wrap_ast!(r#""hello""#, exst!(StringLiteral(rc!(hello))));
}
#[test]
fn parsing_types() {
parse_test_wrap_ast!("type Yolo = Yolo", decl!(TypeDecl { name: tys!("Yolo"), body: TypeBody(vec![UnitStruct(rc!(Yolo))]), mutable: false} ));
parse_test_wrap_ast!("type mut Yolo = Yolo", decl!(TypeDecl { name: tys!("Yolo"), body: TypeBody(vec![UnitStruct(rc!(Yolo))]), mutable: true} ));
parse_test_wrap_ast!("type alias Sex = Drugs", decl!(TypeAlias { alias: rc!(Sex), original: rc!(Drugs) }));
parse_test_wrap_ast!("type Sanchez = Miguel | Alejandro(Int, Option<a>) | Esperanza { a: Int, b: String }",
decl!(TypeDecl {
name: tys!("Sanchez"),
body: TypeBody(vec![
UnitStruct(rc!(Miguel)),
TupleStruct(rc!(Alejandro), vec![
Singleton(TypeSingletonName { name: rc!(Int), params: vec![] }),
Singleton(TypeSingletonName { name: rc!(Option), params: vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })] }),
]),
Record{
name: rc!(Esperanza),
members: vec![
(rc!(a), Singleton(TypeSingletonName { name: rc!(Int), params: vec![] })),
(rc!(b), Singleton(TypeSingletonName { name: rc!(String), params: vec![] })),
]
}
]),
mutable: false
}));
parse_test_wrap_ast! {
"type Jorge<a> = Diego | Kike(a)",
decl!(TypeDecl{
name: TypeSingletonName { name: rc!(Jorge), params: vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })] },
body: TypeBody(vec![UnitStruct(rc!(Diego)), TupleStruct(rc!(Kike), vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })])]),
mutable: false
}
)
};
}
#[test]
fn parsing_bindings() {
parse_test_wrap_ast!("let mut a = 10", decl!(Binding { name: rc!(a), constant: false, type_anno: None, expr: ex!(NatLiteral(10)) } ));
parse_test_wrap_ast!("let a = 2 + 2", decl!(Binding { name: rc!(a), constant: true, type_anno: None, expr: ex!(binexp!("+", NatLiteral(2), NatLiteral(2))) }));
parse_test_wrap_ast!("let a: Nat = 2 + 2", decl!(
Binding { name: rc!(a), constant: true, type_anno: Some(Singleton(TypeSingletonName { name: rc!(Nat), params: vec![] })),
expr: ex!(binexp!("+", NatLiteral(2), NatLiteral(2))) }
));
}
#[test]
fn parsing_block_expressions() {
parse_test_wrap_ast! {
"if a() then { b(); c() }", exst!(
IfExpression {
discriminator: Some(bx! {
ex!(Call { f: bx!(ex!(val!("a"))), arguments: vec![]})
}),
body: bx! {
IfExpressionBody::SimpleConditional {
then_case: vec![exst!(Call { f: bx!(ex!(val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(val!("c"))), arguments: vec![] })],
else_case: None,
}
}
}
)
};
parse_test_wrap_ast! {
"if a() then { b(); c() } else { q }", exst!(
IfExpression {
discriminator: Some(bx! {
ex!(Call { f: bx!(ex!(val!("a"))), arguments: vec![]})
}),
body: bx! {
IfExpressionBody::SimpleConditional {
then_case: vec![exst!(Call { f: bx!(ex!(val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(val!("c"))), arguments: vec![] })],
else_case: Some(vec![exst!(val!("q"))]),
}
}
}
)
};
/*
parse_test!("if a() then { b(); c() }", AST(vec![exst!(
IfExpression(bx!(ex!(Call { f: bx!(ex!(val!("a"))), arguments: vec![]})),
vec![exst!(Call { f: bx!(ex!(val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(val!("c"))), arguments: vec![] })],
None)
)]));
parse_test!(r#"
if true then {
const a = 10
b
} else {
c
}"#,
AST(vec![exst!(IfExpression(bx!(ex!(BoolLiteral(true))),
vec![decl!(Binding { name: rc!(a), constant: true, expr: ex!(NatLiteral(10)) }),
exst!(val!(rc!(b)))],
Some(vec![exst!(val!(rc!(c)))])))])
);
parse_test!("if a { b } else { c }", AST(vec![exst!(
IfExpression(bx!(ex!(val!("a"))),
vec![exst!(val!("b"))],
Some(vec![exst!(val!("c"))])))]));
parse_test!("if (A {a: 1}) { b } else { c }", AST(vec![exst!(
IfExpression(bx!(ex!(NamedStruct { name: rc!(A), fields: vec![(rc!(a), ex!(NatLiteral(1)))]})),
vec![exst!(val!("b"))],
Some(vec![exst!(val!("c"))])))]));
parse_error!("if A {a: 1} { b } else { c }");
*/
}
#[test]
fn parsing_interfaces() {
parse_test_wrap_ast!("interface Unglueable { fn unglue(a: Glue); fn mar(): Glue }",
decl!(Interface {
name: rc!(Unglueable),
signatures: vec![
Signature {
name: rc!(unglue),
operator: false,
params: vec![
FormalParam { name: rc!(a), anno: Some(Singleton(TypeSingletonName { name: rc!(Glue), params: vec![] })), default: None }
],
type_anno: None
},
Signature { name: rc!(mar), operator: false, params: vec![], type_anno: Some(Singleton(TypeSingletonName { name: rc!(Glue), params: vec![] })) },
]
})
);
}
#[test]
fn parsing_impls() {
parse_test_wrap_ast!("impl Heh { fn yolo(); fn swagg(); }",
decl!(Impl {
type_name: ty!("Heh"),
interface_name: None,
block: vec![
FuncSig(Signature { name: rc!(yolo), operator: false, params: vec![], type_anno: None }),
FuncSig(Signature { name: rc!(swagg), operator: false, params: vec![], type_anno: None })
] }));
parse_test_wrap_ast!("impl Mondai for Lollerino { fn yolo(); fn swagg(); }",
decl!(Impl {
type_name: ty!("Lollerino"),
interface_name: Some(TypeSingletonName { name: rc!(Mondai), params: vec![] }),
block: vec![
FuncSig(Signature { name: rc!(yolo), operator: false, params: vec![], type_anno: None}),
FuncSig(Signature { name: rc!(swagg), operator: false, params: vec![], type_anno: None })
] }));
parse_test_wrap_ast!("impl Hella<T> for (Alpha, Omega) { }",
decl!(Impl {
type_name: Tuple(vec![ty!("Alpha"), ty!("Omega")]),
interface_name: Some(TypeSingletonName { name: rc!(Hella), params: vec![ty!("T")] }),
block: vec![]
})
);
parse_test_wrap_ast!("impl Option<WTFMate> { fn oi() }",
decl!(Impl {
type_name: Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("WTFMate")]}),
interface_name: None,
block: vec![
FuncSig(Signature { name: rc!(oi), operator: false, params: vec![], type_anno: None }),
]
}));
}
#[test]
fn parsing_type_annotations() {
parse_test_wrap_ast!("let a = b : Int",
decl!(Binding { name: rc!(a), constant: true, type_anno: None, expr:
ex!(val!("b"), ty!("Int")) }));
parse_test_wrap_ast!("a : Int",
exst!(val!("a"), ty!("Int"))
);
parse_test_wrap_ast!("a : Option<Int>",
exst!(val!("a"), Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("Int")] }))
);
parse_test_wrap_ast!("a : KoreanBBQSpecifier<Kimchi, Option<Bulgogi> >",
exst!(val!("a"), Singleton(TypeSingletonName { name: rc!(KoreanBBQSpecifier), params: vec![
ty!("Kimchi"), Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("Bulgogi")] })
] }))
);
parse_test_wrap_ast!("a : (Int, Yolo<a>)",
exst!(val!("a"), Tuple(
vec![ty!("Int"), Singleton(TypeSingletonName {
name: rc!(Yolo), params: vec![ty!("a")]
})])));
}
#[test]
fn parsing_lambdas() {
parse_test_wrap_ast! { r#"\(x) { x + 1}"#, exst!(
Lambda { params: vec![FormalParam { name: rc!(x), anno: None, default: None } ], type_anno: None, body: vec![exst!(s "x + 1")] }
)
}
parse_test_wrap_ast!(r#"\ (x: Int, y) { a;b;c;}"#,
exst!(Lambda {
params: vec![
FormalParam { name: rc!(x), anno: Some(ty!("Int")), default: None },
FormalParam { name: rc!(y), anno: None, default: None }
],
type_anno: None,
body: vec![exst!(s "a"), exst!(s "b"), exst!(s "c")]
})
);
parse_test_wrap_ast! { r#"\(x){y}(1)"#,
exst!(Call { f: bx!(ex!(
Lambda {
params: vec![
FormalParam { name: rc!(x), anno: None, default: None }
],
type_anno: None,
body: vec![exst!(s "y")] }
)),
arguments: vec![inv!(ex!(NatLiteral(1))).into()] })
};
parse_test_wrap_ast! {
r#"\(x: Int): String { "q" }"#,
exst!(Lambda {
params: vec![
FormalParam { name: rc!(x), anno: Some(ty!("Int")), default: None },
],
type_anno: Some(ty!("String")),
body: vec![exst!(s r#""q""#)]
})
}
}
#[test]
fn single_param_lambda() {
parse_test_wrap_ast! {
r"\x { x + 10 }",
exst!(Lambda {
params: vec![FormalParam { name: rc!(x), anno: None, default: None }],
type_anno: None,
body: vec![exst!(s r"x + 10")]
})
}
parse_test_wrap_ast! {
r"\x: Nat { x + 10 }",
exst!(Lambda {
params: vec![FormalParam { name: rc!(x), anno: Some(ty!("Nat")), default: None }],
type_anno: None,
body: vec![exst!(s r"x + 10")]
})
}
}
#[test]
fn more_advanced_lambdas() {
parse_test! {
r#"fn wahoo() { let a = 10; \(x) { x + a } };
wahoo()(3) "#,
AST {
id: ItemIdStore::new_id(),
statements: vec![
exst!(s r"fn wahoo() { let a = 10; \(x) { x + a } }"),
exst! {
Call {
f: bx!(ex!(Call { f: bx!(ex!(val!("wahoo"))), arguments: vec![] })),
arguments: vec![inv!(ex!(NatLiteral(3))).into()],
}
}
]
}
}
}
#[test]
fn list_literals() {
parse_test_wrap_ast! {
"[1,2]",
exst!(ListLiteral(vec![ex!(NatLiteral(1)), ex!(NatLiteral(2))]))
};
}
#[test]
fn while_expr() {
parse_test_wrap_ast! {
"while { }",
exst!(WhileExpression { condition: None, body: vec![] })
}
parse_test_wrap_ast! {
"while a == b { }",
exst!(WhileExpression { condition: Some(bx![ex![binexp!("==", val!("a"), val!("b"))]]), body: vec![] })
}
}
#[test]
fn for_expr() {
parse_test_wrap_ast! {
"for { a <- maybeValue } return 1",
exst!(ForExpression {
enumerators: vec![Enumerator { id: rc!(a), generator: ex!(val!("maybeValue")) }],
body: bx!(MonadicReturn(ex!(s "1")))
})
}
parse_test_wrap_ast! {
"for n <- someRange { f(n); }",
exst!(ForExpression { enumerators: vec![Enumerator { id: rc!(n), generator: ex!(val!("someRange"))}],
body: bx!(ForBody::StatementBlock(vec![exst!(s "f(n)")]))
})
}
}
#[test]
fn patterns() {
parse_test_wrap_ast! {
"if x is Some(a) then { 4 } else { 9 }", exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::TupleStruct(qname!(Some), vec![Pattern::VarOrName(qname!(a))]),
then_case: vec![exst!(s "4")],
else_case: Some(vec![exst!(s "9")]) })
}
)
}
parse_test_wrap_ast! {
"if x is Some(a) then 4 else 9", exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::TupleStruct(qname!(Some), vec![Pattern::VarOrName(qname!(a))]),
then_case: vec![exst!(s "4")],
else_case: Some(vec![exst!(s "9")]) }
)
}
)
}
parse_test_wrap_ast! {
"if x is Something { a, b: x } then { 4 } else { 9 }", exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::Record(qname!(Something), vec![
(rc!(a),Pattern::Literal(PatternLiteral::StringPattern(rc!(a)))),
(rc!(b),Pattern::VarOrName(qname!(x)))
]),
then_case: vec![exst!(s "4")],
else_case: Some(vec![exst!(s "9")])
}
)
}
)
}
}
#[test]
fn pattern_literals() {
parse_test_wrap_ast! {
"if x is -1 then 1 else 2",
exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::Literal(PatternLiteral::NumPattern { neg: true, num: NatLiteral(1) }),
then_case: vec![exst!(NatLiteral(1))],
else_case: Some(vec![exst!(NatLiteral(2))]),
})
}
)
}
parse_test_wrap_ast! {
"if x is 1 then 1 else 2",
exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::Literal(PatternLiteral::NumPattern { neg: false, num: NatLiteral(1) }),
then_case: vec![exst!(s "1")],
else_case: Some(vec![exst!(s "2")]),
})
}
)
}
parse_test_wrap_ast! {
"if x is true then 1 else 2",
exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(
IfExpressionBody::SimplePatternMatch {
pattern: Pattern::Literal(PatternLiteral::BoolPattern(true)),
then_case: vec![exst!(NatLiteral(1))],
else_case: Some(vec![exst!(NatLiteral(2))]),
})
}
)
}
parse_test_wrap_ast! {
"if x is \"gnosticism\" then 1 else 2",
exst!(
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::SimplePatternMatch {
pattern: Pattern::Literal(PatternLiteral::StringPattern(rc!(gnosticism))),
then_case: vec![exst!(s "1")],
else_case: Some(vec![exst!(s "2")]),
})
}
)
}
}
#[test]
fn imports() {
parse_test_wrap_ast! {
"import harbinger::draughts::Norgleheim",
import!(ImportSpecifier {
id: ItemIdStore::new_id(),
path_components: vec![rc!(harbinger), rc!(draughts), rc!(Norgleheim)],
imported_names: ImportedNames::LastOfPath
})
}
}
#[test]
fn imports_2() {
parse_test_wrap_ast! {
"import harbinger::draughts::{Norgleheim, Xraksenlaigar}",
import!(ImportSpecifier {
id: ItemIdStore::new_id(),
path_components: vec![rc!(harbinger), rc!(draughts)],
imported_names: ImportedNames::List(vec![
rc!(Norgleheim),
rc!(Xraksenlaigar)
])
})
}
}
#[test]
fn imports_3() {
parse_test_wrap_ast! {
"import bespouri::{}",
import!(ImportSpecifier {
id: ItemIdStore::new_id(),
path_components: vec![rc!(bespouri)],
imported_names: ImportedNames::List(vec![])
})
}
}
#[test]
fn imports_4() {
parse_test_wrap_ast! {
"import bespouri::*",
import!(ImportSpecifier {
id: ItemIdStore::new_id(),
path_components: vec![rc!(bespouri)],
imported_names: ImportedNames::All
})
}
}
#[test]
fn if_expr() {
parse_test_wrap_ast! {
"if x { is 1 then 5, else 20 }",
exst! {
IfExpression {
discriminator: Some(bx!(ex!(s "x"))),
body: bx!(IfExpressionBody::CondList(
vec![
ConditionArm {
condition: Condition::Pattern(Pattern::Literal(PatternLiteral::NumPattern { neg: false, num: NatLiteral(1)})),
guard: None,
body: vec![exst!(s "5")],
},
ConditionArm {
condition: Condition::Else,
guard: None,
body: vec![exst!(s "20")],
},
]
))
}
}
}
}
#[test]
fn modules() {
parse_test_wrap_ast! {
r#"
module ephraim {
let a = 10
fn nah() { 33 }
}
"#,
module!(
ModuleSpecifier { name: rc!(ephraim), contents: vec![
decl!(Binding { name: rc!(a), constant: true, type_anno: None, expr: ex!(s "10") }),
decl!(FuncDecl(Signature { name: rc!(nah), operator: false, params: vec![], type_anno: None }, vec![exst!(NatLiteral(33))])),
] }
)
}
}


@@ -0,0 +1,14 @@
let _SCHALA_VERSION = "0.1.0"
type Option<T> = Some(T) | None
type Ord = LT | EQ | GT
fn map(input: Option<T>, func: Func): Option<T> {
if input {
is Option::Some(x) then Option::Some(func(x)),
is Option::None then Option::None,
}
}
type Complicated = Sunrise | Metal { black: bool, norwegian: bool } | Fella(String, Int)

View File

@@ -0,0 +1,544 @@
//! # Reduced AST
//! The reduced AST is a minimal AST designed to be built from the full AST after all possible
//! static checks have been done. Consequently, the AST reduction phase does very little error
//! checking itself - any errors should ideally be caught either by an earlier phase, or are
//! runtime errors that the evaluator should handle. That said, because it does perform table lookups
//! that can in principle fail [especially at the moment with most static analysis not yet complete],
//! there is an Expr variant `ReductionError` to handle these cases.
//!
//! A design decision to make - should the ReducedAST types contain all information about
//! type/layout necessary for the evaluator to work? If so, then the evaluator should not
//! have access to the symbol table at all and ReducedAST should carry that information. If not,
//! then ReducedAST shouldn't be duplicating information that can be queried at runtime from the
//! symbol table. But I think the former might make sense since ultimately the bytecode will be
//! built from the ReducedAST.
use std::rc::Rc;
use std::str::FromStr;
use crate::ast::*;
use crate::symbol_table::{Symbol, SymbolSpec, SymbolTable, FullyQualifiedSymbolName};
use crate::builtin::Builtin;
use crate::util::deref_optional_box;
#[derive(Debug)]
pub struct ReducedAST(pub Vec<Stmt>);
#[derive(Debug, Clone)]
pub enum Stmt {
PreBinding {
name: Rc<String>,
func: Func,
},
Binding {
name: Rc<String>,
constant: bool,
expr: Expr,
},
Expr(Expr),
Noop,
}
#[derive(Debug, Clone)]
pub enum Expr {
Unit,
Lit(Lit),
Sym(Rc<String>), //a Sym is anything that can be looked up by name at runtime - i.e. a function or variable address
Tuple(Vec<Expr>),
Func(Func),
Constructor {
type_name: Rc<String>,
name: Rc<String>,
tag: usize,
arity: usize, // n.b. arity here is always the value from the symbol table; if it doesn't match what the constructor is called with, that's an eval error, and eval will handle it
},
Call {
f: Box<Expr>,
args: Vec<Expr>,
},
Assign {
val: Box<Expr>, //TODO this probably can't be a val
expr: Box<Expr>,
},
Conditional {
cond: Box<Expr>,
then_clause: Vec<Stmt>,
else_clause: Vec<Stmt>,
},
ConditionalTargetSigilValue,
CaseMatch {
cond: Box<Expr>,
alternatives: Vec<Alternative>
},
UnimplementedSigilValue,
ReductionError(String),
}
pub type BoundVars = Vec<Option<Rc<String>>>; //remember that order matters here
#[derive(Debug, Clone)]
pub struct Alternative {
pub matchable: Subpattern,
pub item: Vec<Stmt>,
}
#[derive(Debug, Clone)]
pub struct Subpattern {
pub tag: Option<usize>,
pub subpatterns: Vec<Option<Subpattern>>,
pub bound_vars: BoundVars,
pub guard: Option<Expr>,
}
#[derive(Debug, Clone)]
pub enum Lit {
Nat(u64),
Int(i64),
Float(f64),
Bool(bool),
StringLit(Rc<String>),
}
#[derive(Debug, Clone)]
pub enum Func {
BuiltIn(Builtin),
UserDefined {
name: Option<Rc<String>>,
params: Vec<Rc<String>>,
body: Vec<Stmt>,
}
}
pub fn reduce(ast: &AST, symbol_table: &SymbolTable) -> ReducedAST {
let mut reducer = Reducer { symbol_table };
reducer.ast(ast)
}
struct Reducer<'a> {
symbol_table: &'a SymbolTable
}
impl<'a> Reducer<'a> {
fn ast(&mut self, ast: &AST) -> ReducedAST {
let mut output = vec![];
for statement in ast.statements.iter() {
output.push(self.statement(statement));
}
ReducedAST(output)
}
fn statement(&mut self, stmt: &Statement) -> Stmt {
match &stmt.kind {
StatementKind::Expression(expr) => Stmt::Expr(self.expression(&expr)),
StatementKind::Declaration(decl) => self.declaration(&decl),
StatementKind::Import(_) => Stmt::Noop,
StatementKind::Module(modspec) => {
for statement in modspec.contents.iter() {
self.statement(&statement);
}
Stmt::Noop
}
}
}
fn block(&mut self, block: &Block) -> Vec<Stmt> {
block.iter().map(|stmt| self.statement(stmt)).collect()
}
fn invocation_argument(&mut self, invoc: &InvocationArgument) -> Expr {
use crate::ast::InvocationArgument::*;
match invoc {
Positional(ex) => self.expression(ex),
Keyword { .. } => Expr::UnimplementedSigilValue,
Ignored => Expr::UnimplementedSigilValue,
}
}
fn expression(&mut self, expr: &Expression) -> Expr {
use crate::ast::ExpressionKind::*;
let symbol_table = self.symbol_table;
let ref input = expr.kind;
match input {
NatLiteral(n) => Expr::Lit(Lit::Nat(*n)),
FloatLiteral(f) => Expr::Lit(Lit::Float(*f)),
StringLiteral(s) => Expr::Lit(Lit::StringLit(s.clone())),
BoolLiteral(b) => Expr::Lit(Lit::Bool(*b)),
BinExp(binop, lhs, rhs) => self.binop(binop, lhs, rhs),
PrefixExp(op, arg) => self.prefix(op, arg),
Value(qualified_name) => self.value(qualified_name),
Call { f, arguments } => self.reduce_call_expression(f, arguments),
TupleLiteral(exprs) => Expr::Tuple(exprs.iter().map(|e| self.expression(e)).collect()),
IfExpression { discriminator, body } => self.reduce_if_expression(deref_optional_box(discriminator), body),
Lambda { params, body, .. } => self.reduce_lambda(params, body),
NamedStruct { name, fields } => self.reduce_named_struct(name, fields),
Index { .. } => Expr::UnimplementedSigilValue,
WhileExpression { .. } => Expr::UnimplementedSigilValue,
ForExpression { .. } => Expr::UnimplementedSigilValue,
ListLiteral { .. } => Expr::UnimplementedSigilValue,
}
}
fn value(&mut self, qualified_name: &QualifiedName) -> Expr {
let symbol_table = self.symbol_table;
let ref id = qualified_name.id;
let ref sym_name = match symbol_table.get_fqsn_from_id(id) {
Some(fqsn) => fqsn,
None => return Expr::ReductionError(format!("FQSN lookup for Value {:?} failed", qualified_name)),
};
//TODO this probably needs to change
let FullyQualifiedSymbolName(ref v) = sym_name;
let name = v.last().unwrap().name.clone();
let Symbol { local_name, spec, .. } = match symbol_table.lookup_by_fqsn(&sym_name) {
Some(s) => s,
//None => return Expr::ReductionError(format!("Symbol {:?} not found", sym_name)),
None => return Expr::Sym(name.clone())
};
match spec {
SymbolSpec::RecordConstructor { .. } => Expr::ReductionError(format!("AST reducer doesn't expect a RecordConstructor here")),
SymbolSpec::DataConstructor { index, type_args, type_name } => Expr::Constructor {
type_name: type_name.clone(),
name: name.clone(),
tag: index.clone(),
arity: type_args.len(),
},
SymbolSpec::Func(_) => Expr::Sym(local_name.clone()),
SymbolSpec::Binding => Expr::Sym(local_name.clone()), //TODO not sure if this is right, probably needs to eventually be fqsn
SymbolSpec::Type { .. } => Expr::ReductionError("AST reducer doesn't expect a type here".to_string())
}
}
fn reduce_lambda(&mut self, params: &Vec<FormalParam>, body: &Block) -> Expr {
Expr::Func(Func::UserDefined {
name: None,
params: params.iter().map(|param| param.name.clone()).collect(),
body: self.block(body),
})
}
fn reduce_named_struct(&mut self, name: &QualifiedName, fields: &Vec<(Rc<String>, Expression)>) -> Expr {
let symbol_table = self.symbol_table;
let ref sym_name = match symbol_table.get_fqsn_from_id(&name.id) {
Some(fqsn) => fqsn,
None => return Expr::ReductionError(format!("FQSN lookup for name {:?} failed", name)),
};
let FullyQualifiedSymbolName(ref v) = sym_name;
let ref name = v.last().unwrap().name;
let (type_name, index, members_from_table) = match symbol_table.lookup_by_fqsn(&sym_name) {
Some(Symbol { spec: SymbolSpec::RecordConstructor { members, type_name, index }, .. }) => (type_name.clone(), index, members),
_ => return Expr::ReductionError("Not a record constructor".to_string()),
};
let arity = members_from_table.len();
let mut args: Vec<(Rc<String>, Expr)> = fields.iter()
.map(|(name, expr)| (name.clone(), self.expression(expr)))
.collect();
args.as_mut_slice()
.sort_unstable_by(|(name1, _), (name2, _)| name1.cmp(name2)); //arbitrary - sorting by alphabetical order
let args = args.into_iter().map(|(_, expr)| expr).collect();
//TODO make sure this sorting actually works
let f = box Expr::Constructor { type_name, name: name.clone(), tag: *index, arity, };
Expr::Call { f, args }
}
fn reduce_call_expression(&mut self, func: &Expression, arguments: &Vec<InvocationArgument>) -> Expr {
Expr::Call {
f: Box::new(self.expression(func)),
args: arguments.iter().map(|arg| self.invocation_argument(arg)).collect(),
}
}
fn reduce_if_expression(&mut self, discriminator: Option<&Expression>, body: &IfExpressionBody) -> Expr {
let symbol_table = self.symbol_table;
let cond = Box::new(match discriminator {
Some(expr) => self.expression(expr),
None => return Expr::ReductionError(format!("if-expression without a condition is not supported")),
});
match body {
IfExpressionBody::SimpleConditional { then_case, else_case } => {
let then_clause = self.block(&then_case);
let else_clause = match else_case.as_ref() {
None => vec![],
Some(stmts) => self.block(&stmts),
};
Expr::Conditional { cond, then_clause, else_clause }
},
IfExpressionBody::SimplePatternMatch { pattern, then_case, else_case } => {
let then_clause = self.block(&then_case);
let else_clause = match else_case.as_ref() {
None => vec![],
Some(stmts) => self.block(&stmts),
};
let alternatives = vec![
pattern.to_alternative(then_clause, symbol_table),
Alternative {
matchable: Subpattern {
tag: None,
subpatterns: vec![],
bound_vars: vec![],
guard: None,
},
item: else_clause
},
];
Expr::CaseMatch {
cond,
alternatives,
}
},
IfExpressionBody::CondList(ref condition_arms) => {
let mut alternatives = vec![];
for arm in condition_arms {
match arm.condition {
Condition::Expression(ref _expr) => {
return Expr::UnimplementedSigilValue
},
Condition::Pattern(ref p) => {
let item = self.block(&arm.body);
let alt = p.to_alternative(item, symbol_table);
alternatives.push(alt);
},
Condition::TruncatedOp(_, _) => {
return Expr::UnimplementedSigilValue
},
Condition::Else => {
return Expr::UnimplementedSigilValue
}
}
}
Expr::CaseMatch { cond, alternatives }
}
}
}
fn binop(&mut self, binop: &BinOp, lhs: &Box<Expression>, rhs: &Box<Expression>) -> Expr {
let operation = Builtin::from_str(binop.sigil()).ok();
match operation {
Some(Builtin::Assignment) => Expr::Assign {
val: Box::new(self.expression(&*lhs)),
expr: Box::new(self.expression(&*rhs)),
},
Some(op) => {
let f = Box::new(Expr::Func(Func::BuiltIn(op)));
Expr::Call { f, args: vec![self.expression(&*lhs), self.expression(&*rhs)] }
},
None => {
//TODO handle a user-defined operation
Expr::UnimplementedSigilValue
}
}
}
fn prefix(&mut self, prefix: &PrefixOp, arg: &Box<Expression>) -> Expr {
match prefix.builtin {
Some(op) => {
let f = Box::new(Expr::Func(Func::BuiltIn(op)));
Expr::Call { f, args: vec![self.expression(arg)] }
},
None => { //TODO need this for custom prefix ops
Expr::UnimplementedSigilValue
}
}
}
fn declaration(&mut self, declaration: &Declaration) -> Stmt {
use self::Declaration::*;
match declaration {
Binding {name, constant, expr, .. } => Stmt::Binding { name: name.clone(), constant: *constant, expr: self.expression(expr) },
FuncDecl(Signature { name, params, .. }, statements) => Stmt::PreBinding {
name: name.clone(),
func: Func::UserDefined {
name: Some(name.clone()),
params: params.iter().map(|param| param.name.clone()).collect(),
body: self.block(&statements),
}
},
TypeDecl { .. } => Stmt::Noop,
TypeAlias{ .. } => Stmt::Noop,
Interface { .. } => Stmt::Noop,
Impl { .. } => Stmt::Expr(Expr::UnimplementedSigilValue),
_ => Stmt::Expr(Expr::UnimplementedSigilValue)
}
}
}
/* Inner patterns may be ignored, a variable, or a nested pattern, e.g.:
 * x is SomeBigOldEnum(_, x, Some(t))
 */
fn handle_symbol(symbol: Option<&Symbol>, inner_patterns: &Vec<Pattern>, symbol_table: &SymbolTable) -> Subpattern {
use self::Pattern::*;
let tag = symbol.map(|symbol| match symbol.spec {
SymbolSpec::DataConstructor { index, .. } => index.clone(),
_ => panic!("Symbol is not a data constructor - this should've been caught in type-checking"),
});
let bound_vars = inner_patterns.iter().map(|p| match p {
VarOrName(qualified_name) => {
let fqsn = symbol_table.get_fqsn_from_id(&qualified_name.id);
let symbol_exists = fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)).is_some();
if symbol_exists {
None
} else {
let QualifiedName { components, .. } = qualified_name;
if components.len() == 1 {
Some(components[0].clone())
} else {
panic!("Bad variable name in pattern");
}
}
},
_ => None,
}).collect();
let subpatterns = inner_patterns.iter().map(|p| match p {
Ignored => None,
VarOrName(_) => None,
Literal(other) => Some(other.to_subpattern(symbol_table)),
tp @ TuplePattern(_) => Some(tp.to_subpattern(symbol_table)),
ts @ TupleStruct(_, _) => Some(ts.to_subpattern(symbol_table)),
Record(..) => unimplemented!(),
}).collect();
let guard = None;
/*
let guard_equality_exprs: Vec<Expr> = subpatterns.iter().map(|p| match p {
Literal(lit) => match lit {
_ => unimplemented!()
},
_ => unimplemented!()
}).collect();
*/
Subpattern {
tag,
subpatterns,
guard,
bound_vars,
}
}
impl Pattern {
fn to_alternative(&self, item: Vec<Stmt>, symbol_table: &SymbolTable) -> Alternative {
let s = self.to_subpattern(symbol_table);
Alternative {
matchable: Subpattern {
tag: s.tag,
subpatterns: s.subpatterns,
bound_vars: s.bound_vars,
guard: s.guard,
},
item
}
}
fn to_subpattern(&self, symbol_table: &SymbolTable) -> Subpattern {
use self::Pattern::*;
match self {
TupleStruct(QualifiedName{ components, id }, inner_patterns) => {
let fqsn = symbol_table.get_fqsn_from_id(&id);
match fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)) {
Some(symbol) => handle_symbol(Some(symbol), inner_patterns, symbol_table),
None => {
panic!("Symbol {:?} not found", components);
}
}
},
TuplePattern(inner_patterns) => handle_symbol(None, inner_patterns, symbol_table),
Record(_name, _pairs) => {
unimplemented!()
},
Ignored => Subpattern { tag: None, subpatterns: vec![], guard: None, bound_vars: vec![] },
Literal(lit) => lit.to_subpattern(symbol_table),
VarOrName(QualifiedName { components, id }) => {
// if fqsn is Some, treat this as a symbol pattern. If it's None, treat it
// as a variable.
let fqsn = symbol_table.get_fqsn_from_id(&id);
match fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)) {
Some(symbol) => handle_symbol(Some(symbol), &vec![], symbol_table),
None => {
let name = if components.len() == 1 {
components[0].clone()
} else {
panic!("Unexpected multi-component name in a variable pattern");
};
Subpattern {
tag: None,
subpatterns: vec![],
guard: None,
bound_vars: vec![Some(name.clone())],
}
}
}
},
}
}
}
impl PatternLiteral {
fn to_subpattern(&self, _symbol_table: &SymbolTable) -> Subpattern {
use self::PatternLiteral::*;
match self {
NumPattern { neg, num } => {
let comparison = Expr::Lit(match (neg, num) {
(false, ExpressionKind::NatLiteral(n)) => Lit::Nat(*n),
(false, ExpressionKind::FloatLiteral(f)) => Lit::Float(*f),
(true, ExpressionKind::NatLiteral(n)) => Lit::Int(-1*(*n as i64)),
(true, ExpressionKind::FloatLiteral(f)) => Lit::Float(-1.0*f),
_ => panic!("NumPattern contains a non-numeric literal - this should be unreachable")
});
let guard = Some(Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::Equality))),
args: vec![comparison, Expr::ConditionalTargetSigilValue],
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
StringPattern(s) => {
let guard = Some(Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::Equality))),
args: vec![Expr::Lit(Lit::StringLit(s.clone())), Expr::ConditionalTargetSigilValue]
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
BoolPattern(b) => {
let guard = Some(if *b {
Expr::ConditionalTargetSigilValue
} else {
Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::BooleanNot))),
args: vec![Expr::ConditionalTargetSigilValue]
}
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
}
}
}
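The record-constructor reduction in `reduce_named_struct` above matches named fields to positional constructor arguments by sorting on the field name. A standalone sketch of just that step, using plain types rather than the project's AST (`i64` stands in for the reduced expression type):

```rust
// Sort named fields alphabetically by name, then drop the names to get
// the positional argument order for the constructor call. The ordering
// is arbitrary but deterministic, mirroring the sort in
// `reduce_named_struct`.
fn positional_args(mut fields: Vec<(String, i64)>) -> Vec<i64> {
    fields.sort_unstable_by(|(a, _), (b, _)| a.cmp(b));
    fields.into_iter().map(|(_, v)| v).collect()
}

fn main() {
    // `Something { b: 2, a: 1 }` and `Something { a: 1, b: 2 }` reduce
    // to the same positional call.
    let args = positional_args(vec![("b".to_string(), 2), ("a".to_string(), 1)]);
    assert_eq!(args, vec![1, 2]);
}
```

As the `//TODO make sure this sorting actually works` note suggests, the important property is only that the reducer and the symbol table agree on one canonical field order.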


@@ -0,0 +1,338 @@
use stopwatch::Stopwatch;
use std::time::Duration;
use std::cell::RefCell;
use std::rc::Rc;
use std::collections::HashSet;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface,
ComputationRequest, ComputationResponse,
LangMetaRequest, LangMetaResponse, GlobalOutputStats,
DebugResponse, DebugAsk};
use crate::{ast, reduced_ast, tokenizing, parsing, eval, typechecking, symbol_table, source_map};
pub type SymbolTableHandle = Rc<RefCell<symbol_table::SymbolTable>>;
pub type SourceMapHandle = Rc<RefCell<source_map::SourceMap>>;
/// All the state necessary to parse and execute a Schala program is stored in this struct.
/// `state` represents the execution state for the AST-walking interpreter, the other fields
/// should be self-explanatory.
pub struct Schala {
source_reference: SourceReference,
source_map: SourceMapHandle,
state: eval::State<'static>,
symbol_table: SymbolTableHandle,
resolver: crate::scope_resolution::ScopeResolver<'static>,
type_context: typechecking::TypeContext<'static>,
active_parser: parsing::Parser,
}
impl Schala {
fn handle_docs(&self, source: String) -> LangMetaResponse {
LangMetaResponse::Docs {
doc_string: format!("Schala item `{}` : <<Schala-lang documentation not yet implemented>>", source)
}
}
}
impl Schala {
/// Creates a new Schala environment *without* any prelude.
fn new_blank_env() -> Schala {
let source_map = Rc::new(RefCell::new(source_map::SourceMap::new()));
let symbols = Rc::new(RefCell::new(symbol_table::SymbolTable::new(source_map.clone())));
Schala {
//TODO maybe these can be the same structure
source_reference: SourceReference::new(),
symbol_table: symbols.clone(),
source_map: source_map.clone(),
resolver: crate::scope_resolution::ScopeResolver::new(symbols.clone()),
state: eval::State::new(),
type_context: typechecking::TypeContext::new(),
active_parser: parsing::Parser::new(source_map)
}
}
/// Creates a new Schala environment with the standard prelude, which is defined as ordinary
/// Schala code in the file `prelude.schala`
pub fn new() -> Schala {
let prelude = include_str!("prelude.schala");
let mut s = Schala::new_blank_env();
let request = ComputationRequest { source: prelude, debug_requests: HashSet::default() };
let response = s.run_computation(request);
if let Err(msg) = response.main_output {
panic!("Error in prelude, panicking: {}", msg);
}
s
}
fn handle_debug_immediate(&self, request: DebugAsk) -> DebugResponse {
use DebugAsk::*;
match request {
Timing => DebugResponse { ask: Timing, value: format!("Timing is not a valid immediate debug request") },
ByStage { stage_name, token } => match &stage_name[..] {
"symbol-table" => {
let value = self.symbol_table.borrow().debug_symbol_table();
DebugResponse {
ask: ByStage { stage_name: format!("symbol-table"), token },
value
}
},
s => {
DebugResponse {
ask: ByStage { stage_name: s.to_string(), token: None },
value: format!("Not-implemented")
}
}
}
}
}
}
fn tokenizing(input: &str, _handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<Vec<tokenizing::Token>, String> {
let tokens = tokenizing::tokenize(input);
comp.map(|comp| {
let token_string = tokens.iter().map(|t| t.to_string_with_metadata()).join(", ");
comp.add_artifact(token_string);
});
let errors: Vec<String> = tokens.iter().filter_map(|t| t.get_error()).collect();
if errors.is_empty() {
Ok(tokens)
} else {
Err(format!("{:?}", errors))
}
}
fn parsing(input: Vec<tokenizing::Token>, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
use ParsingDebugType::*;
let ref mut parser = handle.active_parser;
parser.add_new_tokens(input);
let ast = parser.parse();
comp.map(|comp| {
let debug_format = comp.parsing.as_ref().unwrap_or(&CompactAST);
let debug_info = match debug_format {
CompactAST => match ast {
Ok(ref ast) => ast.compact_debug(),
Err(_) => "Error - see output".to_string(),
},
ExpandedAST => match ast {
Ok(ref ast) => ast.expanded_debug(),
Err(_) => "Error - see output".to_string(),
},
Trace => parser.format_parse_trace(),
};
comp.add_artifact(debug_info);
});
ast.map_err(|err| format_parse_error(err, &handle.source_reference))
}
fn format_parse_error(error: parsing::ParseError, source_reference: &SourceReference) -> String {
let line_num = error.token.location.line_num;
let ch = error.token.location.char_num;
let line_from_program = source_reference.get_line(line_num);
let location_pointer = format!("{}^", " ".repeat(ch));
let line_num_digits = format!("{}", line_num).chars().count();
let space_padding = " ".repeat(line_num_digits);
let production = match error.production_name {
Some(n) => format!("\n(from production \"{}\")", n),
None => "".to_string()
};
format!(r#"
{error_msg}{production}
{space_padding} |
{line_num} | {}
{space_padding} | {}
"#, line_from_program, location_pointer, error_msg=error.msg, space_padding=space_padding, line_num=line_num, production=production
)
}
fn symbol_table(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let () = handle.symbol_table.borrow_mut().add_top_level_symbols(&input)?;
comp.map(|comp| {
let debug = handle.symbol_table.borrow().debug_symbol_table();
comp.add_artifact(debug);
});
Ok(input)
}
fn scope_resolution(mut input: ast::AST, handle: &mut Schala, _com: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let () = handle.resolver.resolve(&mut input)?;
Ok(input)
}
fn typechecking(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let result = handle.type_context.typecheck(&input);
comp.map(|comp| {
comp.add_artifact(match result {
Ok(ty) => ty.to_string(),
Err(err) => format!("Type error: {}", err.msg)
});
});
Ok(input)
}
fn ast_reducing(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<reduced_ast::ReducedAST, String> {
let ref symbol_table = handle.symbol_table.borrow();
let output = reduced_ast::reduce(&input, symbol_table);
comp.map(|comp| comp.add_artifact(format!("{:?}", output)));
Ok(output)
}
fn eval(input: reduced_ast::ReducedAST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<String, String> {
comp.map(|comp| comp.add_artifact(handle.state.debug_print()));
let evaluation_outputs = handle.state.evaluate(input, true);
let text_output: Result<Vec<String>, String> = evaluation_outputs
.into_iter()
.collect();
let eval_output: Result<String, String> = text_output
.map(|v| { v.into_iter().intersperse(format!("\n")).collect() });
eval_output
}
/// Represents lines of source code
struct SourceReference {
lines: Option<Vec<String>>
}
impl SourceReference {
fn new() -> SourceReference {
SourceReference { lines: None }
}
fn load_new_source(&mut self, source: &str) {
//TODO this is a lot of heap allocations - maybe there's a way to make it more efficient?
self.lines = Some(source.lines().map(|s| s.to_string()).collect());
}
fn get_line(&self, line: usize) -> String {
self.lines.as_ref().and_then(|x| x.get(line).map(|s| s.to_string())).unwrap_or(format!("NO LINE FOUND"))
}
}
enum ParsingDebugType {
CompactAST,
ExpandedAST,
Trace
}
#[derive(Default)]
struct PassDebugArtifact {
parsing: Option<ParsingDebugType>,
artifacts: Vec<String>
}
impl PassDebugArtifact {
fn add_artifact(&mut self, artifact: String) {
self.artifacts.push(artifact)
}
}
fn stage_names() -> Vec<&'static str> {
vec![
"tokenizing",
"parsing",
"symbol-table",
"scope-resolution",
"typechecking",
"ast-reduction",
"ast-walking-evaluation"
]
}
impl ProgrammingLanguageInterface for Schala {
fn get_language_name(&self) -> String { format!("Schala") }
fn get_source_file_suffix(&self) -> String { format!("schala") }
fn run_computation(&mut self, request: ComputationRequest) -> ComputationResponse {
struct PassToken<'a> {
schala: &'a mut Schala,
stage_durations: &'a mut Vec<(String, Duration)>,
sw: &'a Stopwatch,
debug_requests: &'a HashSet<DebugAsk>,
debug_responses: &'a mut Vec<DebugResponse>,
}
fn output_wrapper<Input, Output, F>(n: usize, func: F, input: Input, token: &mut PassToken) -> Result<Output, String>
where F: Fn(Input, &mut Schala, Option<&mut PassDebugArtifact>) -> Result<Output, String>
{
let stage_names = stage_names();
let cur_stage_name = stage_names[n];
let ask = token.debug_requests.iter().find(|ask| ask.is_for_stage(cur_stage_name));
let parsing = match ask {
Some(DebugAsk::ByStage { token, .. }) if cur_stage_name == "parsing" => Some(
token.as_ref().map(|token| match &token[..] {
"compact" => ParsingDebugType::CompactAST,
"expanded" => ParsingDebugType::ExpandedAST,
"trace" => ParsingDebugType::Trace,
_ => ParsingDebugType::CompactAST,
}).unwrap_or(ParsingDebugType::CompactAST)
),
_ => None,
};
let mut debug_artifact = ask.map(|_| PassDebugArtifact {
parsing, ..Default::default()
});
let output = func(input, token.schala, debug_artifact.as_mut());
//TODO I think this is not counting the time since the *previous* stage
token.stage_durations.push((cur_stage_name.to_string(), token.sw.elapsed()));
if let Some(artifact) = debug_artifact {
for value in artifact.artifacts.into_iter() {
let resp = DebugResponse { ask: ask.unwrap().clone(), value };
token.debug_responses.push(resp);
}
}
output
}
let ComputationRequest { source, debug_requests } = request;
self.source_reference.load_new_source(source);
let sw = Stopwatch::start_new();
let mut stage_durations = Vec::new();
let mut debug_responses = Vec::new();
let mut tok = PassToken { schala: self, stage_durations: &mut stage_durations, sw: &sw, debug_requests: &debug_requests, debug_responses: &mut debug_responses };
let main_output: Result<String, String> = Ok(source)
.and_then(|source| output_wrapper(0, tokenizing, source, &mut tok))
.and_then(|tokens| output_wrapper(1, parsing, tokens, &mut tok))
.and_then(|ast| output_wrapper(2, symbol_table, ast, &mut tok))
.and_then(|ast| output_wrapper(3, scope_resolution, ast, &mut tok))
.and_then(|ast| output_wrapper(4, typechecking, ast, &mut tok))
.and_then(|ast| output_wrapper(5, ast_reducing, ast, &mut tok))
.and_then(|reduced_ast| output_wrapper(6, eval, reduced_ast, &mut tok));
let total_duration = sw.elapsed();
let global_output_stats = GlobalOutputStats {
total_duration, stage_durations
};
ComputationResponse {
main_output,
global_output_stats,
debug_responses,
}
}
fn request_meta(&mut self, request: LangMetaRequest) -> LangMetaResponse {
match request {
LangMetaRequest::StageNames => LangMetaResponse::StageNames(stage_names().iter().map(|s| s.to_string()).collect()),
LangMetaRequest::Docs { source } => self.handle_docs(source),
LangMetaRequest::ImmediateDebug(debug_request) =>
LangMetaResponse::ImmediateDebug(self.handle_debug_immediate(debug_request)),
LangMetaRequest::Custom { .. } => LangMetaResponse::Custom { kind: format!("not-implemented"), value: format!("") }
}
}
}
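The `run_computation` pipeline above threads each stage's output into the next with `and_then`, so the first failing stage short-circuits everything after it. A minimal standalone sketch of that pattern (the two toy stages here are hypothetical, not Schala's real passes):

```rust
// Each stage maps its input to Result<Output, String>; `and_then` only
// runs a stage if every earlier stage succeeded, and an Err from any
// stage propagates to the end unchanged.
fn tokenize(src: &str) -> Result<Vec<String>, String> {
    Ok(src.split_whitespace().map(String::from).collect())
}

fn parse(tokens: Vec<String>) -> Result<usize, String> {
    if tokens.is_empty() {
        Err("empty input".to_string())
    } else {
        Ok(tokens.len())
    }
}

fn main() {
    let ok: Result<usize, String> = Ok("let a = 10").and_then(tokenize).and_then(parse);
    assert_eq!(ok, Ok(4));

    // A failure in an early stage skips the later stages entirely.
    let err: Result<usize, String> = Ok("").and_then(tokenize).and_then(parse);
    assert_eq!(err, Err("empty input".to_string()));
}
```

The real pipeline adds per-stage timing and debug-artifact collection via `output_wrapper`, but the control flow is exactly this `Result` chain.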


@@ -0,0 +1,119 @@
use std::rc::Rc;
use crate::schala::SymbolTableHandle;
use crate::symbol_table::{ScopeSegment, FullyQualifiedSymbolName};
use crate::ast::*;
use crate::util::ScopeStack;
type FQSNPrefix = Vec<ScopeSegment>;
pub struct ScopeResolver<'a> {
symbol_table_handle: SymbolTableHandle,
name_scope_stack: ScopeStack<'a, Rc<String>, FQSNPrefix>,
}
impl<'a> ASTVisitor for ScopeResolver<'a> {
//TODO need to un-insert these - maybe need to rethink visitor
fn import(&mut self, import_spec: &ImportSpecifier) {
let ref symbol_table = self.symbol_table_handle.borrow();
let ImportSpecifier { ref path_components, ref imported_names, .. } = &import_spec;
match imported_names {
ImportedNames::All => {
let prefix = FullyQualifiedSymbolName(path_components.iter().map(|c| ScopeSegment {
name: c.clone(),
}).collect());
let members = symbol_table.lookup_children_of_fqsn(&prefix);
for member in members.into_iter() {
let local_name = member.0.last().unwrap().name.clone();
self.name_scope_stack.insert(local_name.clone(), member.0);
}
},
ImportedNames::LastOfPath => {
let name = path_components.last().unwrap(); //TODO handle better
let fqsn_prefix = path_components.iter().map(|c| ScopeSegment {
name: c.clone(),
}).collect();
self.name_scope_stack.insert(name.clone(), fqsn_prefix);
}
ImportedNames::List(ref names) => {
let fqsn_prefix: FQSNPrefix = path_components.iter().map(|c| ScopeSegment {
name: c.clone(),
}).collect();
for name in names.iter() {
self.name_scope_stack.insert(name.clone(), fqsn_prefix.clone());
}
}
};
}
fn qualified_name(&mut self, qualified_name: &QualifiedName) {
let ref mut symbol_table = self.symbol_table_handle.borrow_mut();
let fqsn = self.lookup_name_in_scope(&qualified_name);
let ref id = qualified_name.id;
symbol_table.map_id_to_fqsn(id, fqsn);
}
fn named_struct(&mut self, name: &QualifiedName, _fields: &Vec<(Rc<String>, Expression)>) {
let ref mut symbol_table = self.symbol_table_handle.borrow_mut();
let ref id = name.id;
let fqsn = self.lookup_name_in_scope(&name);
symbol_table.map_id_to_fqsn(id, fqsn);
}
fn pattern(&mut self, pat: &Pattern) {
use Pattern::*;
match pat {
Ignored => (),
TuplePattern(_) => (),
Literal(_) => (),
TupleStruct(name, _) => self.qualified_name_in_pattern(name),
Record(name, _) => self.qualified_name_in_pattern(name),
VarOrName(name) => self.qualified_name_in_pattern(name),
};
}
}
impl<'a> ScopeResolver<'a> {
pub fn new(symbol_table_handle: SymbolTableHandle) -> ScopeResolver<'static> {
let name_scope_stack = ScopeStack::new(None);
ScopeResolver { symbol_table_handle, name_scope_stack }
}
pub fn resolve(&mut self, ast: &mut AST) -> Result<(), String> {
walk_ast(self, ast);
Ok(())
}
fn lookup_name_in_scope(&self, sym_name: &QualifiedName) -> FullyQualifiedSymbolName {
let QualifiedName { components, .. } = sym_name;
let first_component = &components[0];
match self.name_scope_stack.lookup(first_component) {
None => {
FullyQualifiedSymbolName(components.iter().map(|name| ScopeSegment { name: name.clone() }).collect())
},
Some(fqsn_prefix) => {
let mut full_name = fqsn_prefix.clone();
let rest_of_name: FQSNPrefix = components[1..].iter().map(|name| ScopeSegment { name: name.clone() }).collect();
full_name.extend_from_slice(&rest_of_name);
FullyQualifiedSymbolName(full_name)
}
}
}
/// A name in a pattern can be either a fresh variable binding or a reference to an
/// existing symbol (e.g. a data constructor). Only map the id if the symbol table
/// already knows this name; otherwise treat it as a variable and leave it unmapped.
fn qualified_name_in_pattern(&mut self, qualified_name: &QualifiedName) {
let ref mut symbol_table = self.symbol_table_handle.borrow_mut();
let ref id = qualified_name.id;
let fqsn = self.lookup_name_in_scope(qualified_name);
if symbol_table.lookup_by_fqsn(&fqsn).is_some() {
symbol_table.map_id_to_fqsn(&id, fqsn);
}
}
}
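The prefix-splicing rule implemented by `lookup_name_in_scope` can be sketched as a small standalone function. This is a simplified stand-in, not the crate's real code: plain `String`s replace `ScopeSegment`/`Rc<String>`, and a bare `HashMap` replaces the scope stack.

```rust
use std::collections::HashMap;

// If the first path component has a registered prefix (e.g. from an import),
// splice that prefix in front of the remaining components; otherwise the
// name is taken to already be fully qualified.
fn resolve(scope: &HashMap<String, Vec<String>>, components: &[String]) -> Vec<String> {
    match scope.get(&components[0]) {
        None => components.to_vec(),
        Some(prefix) => {
            let mut full = prefix.clone();
            full.extend_from_slice(&components[1..]);
            full
        }
    }
}

fn main() {
    let mut scope = HashMap::new();
    // As if `import a::b` had mapped the local name "b" to the prefix ["a", "b"].
    scope.insert("b".to_string(), vec!["a".to_string(), "b".to_string()]);
    let s = |v: &[&str]| v.iter().map(|x| x.to_string()).collect::<Vec<_>>();
    assert_eq!(resolve(&scope, &s(&["b", "c"])), s(&["a", "b", "c"]));
    assert_eq!(resolve(&scope, &s(&["unknown"])), s(&["unknown"]));
    println!("ok");
}
```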
#[cfg(test)]
mod tests {
#[test]
fn basic_scope() {
}
}


@ -0,0 +1,39 @@
use std::collections::HashMap;
use std::fmt;
use crate::ast::ItemId;
pub type LineNumber = usize;
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct Location {
pub line_num: LineNumber,
pub char_num: usize,
}
impl fmt::Display for Location {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{}:{}", self.line_num, self.char_num)
}
}
pub struct SourceMap {
map: HashMap<ItemId, Location>
}
impl SourceMap {
pub fn new() -> SourceMap {
SourceMap { map: HashMap::new() }
}
pub fn add_location(&mut self, id: &ItemId, loc: Location) {
self.map.insert(id.clone(), loc);
}
pub fn lookup(&self, id: &ItemId) -> Option<Location> {
match self.map.get(id) {
Some(loc) => Some(loc.clone()),
None => None
}
}
}
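For reference, a minimal standalone sketch of how `SourceMap` is meant to be used. A plain `u32` stands in for `ItemId` (which lives in `crate::ast`), so this is an illustrative mock rather than the crate's real types:

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Location { line_num: usize, char_num: usize }

// Simplified stand-in for the real SourceMap, keyed by a bare u32 id.
struct SourceMap { map: HashMap<u32, Location> }

impl SourceMap {
    fn new() -> SourceMap { SourceMap { map: HashMap::new() } }
    fn add_location(&mut self, id: u32, loc: Location) { self.map.insert(id, loc); }
    fn lookup(&self, id: u32) -> Option<Location> { self.map.get(&id).copied() }
}

fn main() {
    let mut sm = SourceMap::new();
    // The parser would record a location per AST item id as it parses.
    sm.add_location(0, Location { line_num: 3, char_num: 7 });
    assert_eq!(sm.lookup(0), Some(Location { line_num: 3, char_num: 7 }));
    assert_eq!(sm.lookup(1), None);
    println!("ok");
}
```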


@ -0,0 +1,343 @@
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::rc::Rc;
use std::fmt;
use std::fmt::Write;
use crate::schala::SourceMapHandle;
use crate::source_map::{SourceMap, LineNumber};
use crate::ast;
use crate::ast::{ItemId, TypeBody, TypeSingletonName, Signature, Statement, StatementKind, ModuleSpecifier};
use crate::typechecking::TypeName;
#[allow(unused_macros)]
macro_rules! fqsn {
( $( $name:expr ; $kind:tt),* ) => {
{
let mut vec = vec![];
$(
vec.push(crate::symbol_table::ScopeSegment::new(std::rc::Rc::new($name.to_string())));
)*
FullyQualifiedSymbolName(vec)
}
};
}
mod symbol_trie;
use symbol_trie::SymbolTrie;
mod test;
/// Keeps track of what names were used in a given namespace. Call try_register to add a name to
/// the table, or report an error if a name already exists.
struct DuplicateNameTrackTable {
table: HashMap<Rc<String>, LineNumber>,
}
impl DuplicateNameTrackTable {
fn new() -> DuplicateNameTrackTable {
DuplicateNameTrackTable { table: HashMap::new() }
}
fn try_register(&mut self, name: &Rc<String>, id: &ItemId, source_map: &SourceMap) -> Result<(), LineNumber> {
match self.table.entry(name.clone()) {
Entry::Occupied(o) => {
let line_number = o.get();
Err(*line_number)
},
Entry::Vacant(v) => {
let line_number = if let Some(loc) = source_map.lookup(id) {
loc.line_num
} else {
0
};
v.insert(line_number);
Ok(())
}
}
}
}
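The entry-based duplicate detection in `try_register` can be illustrated with a pared-down standalone version (plain `String` keys and a raw line number argument stand in for `Rc<String>`, `ItemId`, and the source map lookup):

```rust
use std::collections::hash_map::{Entry, HashMap};

// First registration of a name records the line it appeared on;
// a second registration fails, reporting the original line.
fn try_register(table: &mut HashMap<String, usize>, name: &str, line: usize) -> Result<(), usize> {
    match table.entry(name.to_string()) {
        Entry::Occupied(o) => Err(*o.get()),
        Entry::Vacant(v) => { v.insert(line); Ok(()) }
    }
}

fn main() {
    let mut t = HashMap::new();
    assert_eq!(try_register(&mut t, "a", 2), Ok(()));
    assert_eq!(try_register(&mut t, "b", 3), Ok(()));
    // Re-registering "a" reports the line of the first definition.
    assert_eq!(try_register(&mut t, "a", 5), Err(2));
    println!("ok");
}
```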
#[derive(PartialEq, Eq, Hash, Debug, Clone, PartialOrd, Ord)]
pub struct FullyQualifiedSymbolName(pub Vec<ScopeSegment>);
impl fmt::Display for FullyQualifiedSymbolName {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let FullyQualifiedSymbolName(v) = self;
for segment in v {
write!(f, "::{}", segment)?;
}
Ok(())
}
}
#[derive(Debug, Clone, Eq, PartialEq, Hash, PartialOrd, Ord)]
pub struct ScopeSegment {
pub name: Rc<String>, //TODO maybe this could be a &str, for efficiency?
}
impl fmt::Display for ScopeSegment {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let kind = ""; //TODO implement some kind of kind-tracking here
write!(f, "{}{}", self.name, kind)
}
}
impl ScopeSegment {
pub fn new(name: Rc<String>) -> ScopeSegment {
ScopeSegment { name }
}
}
//cf. p. 150 or so of Language Implementation Patterns
pub struct SymbolTable {
source_map_handle: SourceMapHandle,
symbol_path_to_symbol: HashMap<FullyQualifiedSymbolName, Symbol>,
id_to_fqsn: HashMap<ItemId, FullyQualifiedSymbolName>,
symbol_trie: SymbolTrie,
}
impl SymbolTable {
pub fn new(source_map_handle: SourceMapHandle) -> SymbolTable {
SymbolTable {
source_map_handle,
symbol_path_to_symbol: HashMap::new(),
id_to_fqsn: HashMap::new(),
symbol_trie: SymbolTrie::new()
}
}
pub fn map_id_to_fqsn(&mut self, id: &ItemId, fqsn: FullyQualifiedSymbolName) {
self.id_to_fqsn.insert(id.clone(), fqsn);
}
pub fn get_fqsn_from_id(&self, id: &ItemId) -> Option<FullyQualifiedSymbolName> {
self.id_to_fqsn.get(&id).cloned()
}
fn add_new_symbol(&mut self, local_name: &Rc<String>, scope_path: &Vec<ScopeSegment>, spec: SymbolSpec) {
let mut vec: Vec<ScopeSegment> = scope_path.clone();
vec.push(ScopeSegment { name: local_name.clone() });
let fully_qualified_name = FullyQualifiedSymbolName(vec);
let symbol = Symbol { local_name: local_name.clone(), fully_qualified_name: fully_qualified_name.clone(), spec };
self.symbol_trie.insert(&fully_qualified_name);
self.symbol_path_to_symbol.insert(fully_qualified_name, symbol);
}
pub fn lookup_by_fqsn(&self, fully_qualified_path: &FullyQualifiedSymbolName) -> Option<&Symbol> {
self.symbol_path_to_symbol.get(fully_qualified_path)
}
pub fn lookup_children_of_fqsn(&self, path: &FullyQualifiedSymbolName) -> Vec<FullyQualifiedSymbolName> {
self.symbol_trie.get_children(path)
}
}
#[derive(Debug)]
pub struct Symbol {
pub local_name: Rc<String>, //TODO does this need to be pub?
fully_qualified_name: FullyQualifiedSymbolName,
pub spec: SymbolSpec,
}
impl fmt::Display for Symbol {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "<Local name: {}, Spec: {}>", self.local_name, self.spec)
}
}
#[derive(Debug)]
pub enum SymbolSpec {
Func(Vec<TypeName>),
DataConstructor {
index: usize,
type_name: TypeName,
type_args: Vec<Rc<String>>,
},
RecordConstructor {
index: usize,
members: HashMap<Rc<String>, TypeName>,
type_name: TypeName,
},
Binding,
Type {
name: TypeName
},
}
impl fmt::Display for SymbolSpec {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::SymbolSpec::*;
match self {
Func(type_names) => write!(f, "Func({:?})", type_names),
DataConstructor { index, type_name, type_args } => write!(f, "DataConstructor(idx: {})({:?} -> {})", index, type_args, type_name),
RecordConstructor { type_name, index, ..} => write!(f, "RecordConstructor(idx: {})(<members> -> {})", index, type_name),
Binding => write!(f, "Binding"),
Type { name } => write!(f, "Type <{}>", name),
}
}
}
impl SymbolTable {
/* note: this adds names for *forward reference* but doesn't actually create any types. solve that problem
* later */
pub fn add_top_level_symbols(&mut self, ast: &ast::AST) -> Result<(), String> {
let mut scope_name_stack = Vec::new();
self.add_symbols_from_scope(&ast.statements, &mut scope_name_stack)
}
fn add_symbols_from_scope<'a>(&'a mut self, statements: &Vec<Statement>, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
use self::ast::Declaration::*;
let mut seen_identifiers = DuplicateNameTrackTable::new();
let mut seen_modules = DuplicateNameTrackTable::new();
for statement in statements.iter() {
match statement {
Statement { kind: StatementKind::Declaration(decl), id } => {
match decl {
FuncSig(ref signature) => {
seen_identifiers.try_register(&signature.name, &id, &self.source_map_handle.borrow())
.map_err(|line| format!("Duplicate function definition: {}. It's already defined at {}", signature.name, line))?;
self.add_function_signature(signature, scope_name_stack)?
}
FuncDecl(ref signature, ref body) => {
seen_identifiers.try_register(&signature.name, &id, &self.source_map_handle.borrow())
.map_err(|line| format!("Duplicate function definition: {}. It's already defined at {}", signature.name, line))?;
self.add_function_signature(signature, scope_name_stack)?;
scope_name_stack.push(ScopeSegment{
name: signature.name.clone(),
});
let output = self.add_symbols_from_scope(body, scope_name_stack);
scope_name_stack.pop();
output?
},
TypeDecl { name, body, mutable } => {
seen_identifiers.try_register(&name.name, &id, &self.source_map_handle.borrow())
.map_err(|line| format!("Duplicate type definition: {}. It's already defined at {}", name.name, line))?;
self.add_type_decl(name, body, mutable, scope_name_stack)?
},
Binding { name, .. } => {
seen_identifiers.try_register(&name, &id, &self.source_map_handle.borrow())
.map_err(|line| format!("Duplicate variable definition: {}. It's already defined at {}", name, line))?;
self.add_new_symbol(name, scope_name_stack, SymbolSpec::Binding);
}
_ => ()
}
},
Statement { kind: StatementKind::Module(ModuleSpecifier { name, contents}), id } => {
seen_modules.try_register(&name, &id, &self.source_map_handle.borrow())
.map_err(|line| format!("Duplicate module definition: {}. It's already defined at {}", name, line))?;
scope_name_stack.push(ScopeSegment { name: name.clone() });
let output = self.add_symbols_from_scope(contents, scope_name_stack);
scope_name_stack.pop();
output?
},
_ => ()
}
}
Ok(())
}
pub fn debug_symbol_table(&self) -> String {
let mut output = String::from("Symbol table\n");
let mut sorted_symbols: Vec<(&FullyQualifiedSymbolName, &Symbol)> = self.symbol_path_to_symbol.iter().collect();
sorted_symbols.sort_by(|(fqsn, _), (other_fqsn, _)| fqsn.cmp(other_fqsn));
for (name, sym) in sorted_symbols.iter() {
writeln!(output, "{} -> {}", name, sym).unwrap();
}
output
}
fn add_function_signature(&mut self, signature: &Signature, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
let mut local_type_context = LocalTypeContext::new();
let types = signature.params.iter().map(|param| match param.anno {
Some(ref type_identifier) => Rc::new(format!("{:?}", type_identifier)),
None => local_type_context.new_universal_type()
}).collect();
self.add_new_symbol(&signature.name, scope_name_stack, SymbolSpec::Func(types));
Ok(())
}
//TODO handle type mutability
fn add_type_decl(&mut self, type_name: &TypeSingletonName, body: &TypeBody, _mutable: &bool, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
use crate::ast::{TypeIdentifier, Variant};
let TypeBody(variants) = body;
let ref type_name = type_name.name;
let type_spec = SymbolSpec::Type {
name: type_name.clone(),
};
self.add_new_symbol(type_name, scope_name_stack, type_spec);
scope_name_stack.push(ScopeSegment{
name: type_name.clone(),
});
//TODO figure out why _params isn't being used here
for (index, var) in variants.iter().enumerate() {
match var {
Variant::UnitStruct(variant_name) => {
let spec = SymbolSpec::DataConstructor {
index,
type_name: type_name.clone(),
type_args: vec![],
};
self.add_new_symbol(variant_name, scope_name_stack, spec);
},
Variant::TupleStruct(variant_name, tuple_members) => {
//TODO fix the notion of a tuple type
let type_args = tuple_members.iter().map(|type_name| match type_name {
TypeIdentifier::Singleton(TypeSingletonName { name, ..}) => name.clone(),
TypeIdentifier::Tuple(_) => unimplemented!(),
}).collect();
let spec = SymbolSpec::DataConstructor {
index,
type_name: type_name.clone(),
type_args
};
self.add_new_symbol(variant_name, scope_name_stack, spec);
},
Variant::Record { name, members: defined_members } => {
let mut members = HashMap::new();
let mut duplicate_member_definitions = Vec::new();
for (member_name, member_type) in defined_members {
match members.entry(member_name.clone()) {
Entry::Occupied(_) => duplicate_member_definitions.push(member_name.clone()),
Entry::Vacant(v) => {
v.insert(match member_type {
TypeIdentifier::Singleton(TypeSingletonName { name, ..}) => name.clone(),
TypeIdentifier::Tuple(_) => unimplemented!(),
});
}
}
}
if !duplicate_member_definitions.is_empty() {
return Err(format!("Duplicate member(s) in definition of type {}: {:?}", type_name, duplicate_member_definitions));
}
let spec = SymbolSpec::RecordConstructor { index, type_name: type_name.clone(), members };
self.add_new_symbol(name, scope_name_stack, spec);
},
}
}
scope_name_stack.pop();
Ok(())
}
}
struct LocalTypeContext {
state: u8
}
impl LocalTypeContext {
fn new() -> LocalTypeContext {
LocalTypeContext { state: 0 }
}
fn new_universal_type(&mut self) -> TypeName {
let n = self.state;
self.state += 1;
Rc::new(format!("{}", (('a' as u8) + n) as char))
}
}
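`new_universal_type` hands out single-letter type names in order by offsetting from `'a'`. A tiny standalone check of that naming scheme (the helper function here is illustrative, not part of the crate):

```rust
// Generate the first `count` fresh universal type names: "a", "b", "c", …
// This mirrors the (('a' as u8) + n) as char trick in LocalTypeContext.
fn universal_names(count: u8) -> Vec<String> {
    (0..count).map(|n| ((b'a' + n) as char).to_string()).collect()
}

fn main() {
    assert_eq!(universal_names(3), vec!["a", "b", "c"]);
    assert_eq!(universal_names(0), Vec::<String>::new());
    println!("ok");
}
```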


@ -0,0 +1,51 @@
use radix_trie::{Trie, TrieCommon, TrieKey};
use super::FullyQualifiedSymbolName;
use std::hash::{Hasher, Hash};
use std::collections::hash_map::DefaultHasher;
#[derive(Debug)]
pub struct SymbolTrie(Trie<FullyQualifiedSymbolName, ()>);
impl TrieKey for FullyQualifiedSymbolName {
fn encode_bytes(&self) -> Vec<u8> {
let mut hasher = DefaultHasher::new();
let mut output = vec![];
let FullyQualifiedSymbolName(scopes) = self;
for segment in scopes.iter() {
segment.name.as_bytes().hash(&mut hasher);
output.extend_from_slice(&hasher.finish().to_be_bytes());
}
output
}
}
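The key property of `encode_bytes` above is that the hasher state is carried across scope segments, so the byte encoding of a path is a strict prefix of the encoding of any of its extensions; that is what lets the radix trie treat nested scopes as subtries. A self-contained demonstration of that invariant (a free function over `&str` slices stands in for the `TrieKey` impl):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hash each segment into a *running* hasher and append the 8-byte digest
// after each one, exactly as the TrieKey impl does for ScopeSegments.
fn encode(segments: &[&str]) -> Vec<u8> {
    let mut hasher = DefaultHasher::new();
    let mut out = vec![];
    for seg in segments {
        seg.as_bytes().hash(&mut hasher);
        out.extend_from_slice(&hasher.finish().to_be_bytes());
    }
    out
}

fn main() {
    let outer = encode(&["outer"]);
    let inner = encode(&["outer", "inner"]);
    // One 8-byte digest per segment…
    assert_eq!(outer.len(), 8);
    assert_eq!(inner.len(), 16);
    // …and extending a path only appends bytes, preserving the prefix.
    assert!(inner.starts_with(&outer));
    println!("ok");
}
```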
impl SymbolTrie {
pub fn new() -> SymbolTrie {
SymbolTrie(Trie::new())
}
pub fn insert(&mut self, fqsn: &FullyQualifiedSymbolName) {
self.0.insert(fqsn.clone(), ());
}
pub fn get_children(&self, fqsn: &FullyQualifiedSymbolName) -> Vec<FullyQualifiedSymbolName> {
let subtrie = match self.0.subtrie(fqsn) {
Some(s) => s,
None => return vec![]
};
subtrie.keys().filter(|cur_key| **cur_key != *fqsn).cloned().collect()
}
}
#[test]
fn test_trie_insertion() {
let mut trie = SymbolTrie::new();
trie.insert(&fqsn!("unrelated"; ty, "thing"; tr));
trie.insert(&fqsn!("outer"; ty, "inner"; tr));
trie.insert(&fqsn!("outer"; ty, "inner"; ty, "still_inner"; tr));
let children = trie.get_children(&fqsn!("outer"; ty, "inner"; tr));
assert_eq!(children.len(), 1);
}


@ -0,0 +1,193 @@
#![cfg(test)]
use std::cell::RefCell;
use std::rc::Rc;
use super::*;
use crate::util::quick_ast;
fn add_symbols_from_source(src: &str) -> (SymbolTable, Result<(), String>) {
let (ast, source_map) = quick_ast(src);
let source_map = Rc::new(RefCell::new(source_map));
let mut symbol_table = SymbolTable::new(source_map);
let result = symbol_table.add_top_level_symbols(&ast);
(symbol_table, result)
}
macro_rules! values_in_table {
($source:expr, $single_value:expr) => {
values_in_table!($source => $single_value);
};
($source:expr => $( $value:expr ),* ) => {
{
let (symbol_table, _) = add_symbols_from_source($source);
$(
match symbol_table.lookup_by_fqsn($value) {
Some(_spec) => (),
None => panic!(),
};
)*
}
};
}
#[test]
fn basic_symbol_table() {
values_in_table! { "let a = 10; fn b() { 20 }", &fqsn!("b"; tr) };
values_in_table! { "type Option<T> = Some(T) | None" =>
&fqsn!("Option"; tr),
&fqsn!("Option"; ty, "Some"; tr),
&fqsn!("Option"; ty, "None"; tr) };
}
#[test]
fn no_function_definition_duplicates() {
let source = r#"
fn a() { 1 }
fn b() { 2 }
fn a() { 3 }
"#;
let (_, output) = add_symbols_from_source(source);
assert!(output.unwrap_err().contains("Duplicate function definition: a"))
}
#[test]
fn no_variable_definition_duplicates() {
let source = r#"
let x = 9
let a = 20
let q = 39
let a = 30
"#;
let (_, output) = add_symbols_from_source(source);
let output = output.unwrap_err();
assert!(output.contains("Duplicate variable definition: a"));
assert!(output.contains("already defined at 2"));
}
#[test]
fn no_variable_definition_duplicates_in_function() {
let source = r#"
fn a() {
let a = 20
let b = 40
a + b
}
fn q() {
let a = 29
let x = 30
let x = 33
}
"#;
let (_, output) = add_symbols_from_source(source);
assert!(output.unwrap_err().contains("Duplicate variable definition: x"))
}
#[test]
fn dont_falsely_detect_duplicates() {
let source = r#"
let a = 20;
fn some_func() {
let a = 40;
77
}
let q = 39;
"#;
let (symbol_table, _) = add_symbols_from_source(source);
assert!(symbol_table.lookup_by_fqsn(&fqsn!["a"; tr]).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!["some_func"; fn, "a";tr]).is_some());
}
#[test]
fn enclosing_scopes() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
x + inner_func(x)
}"#;
let (symbol_table, _) = add_symbols_from_source(source);
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "inner_func"; tr)).is_some());
}
#[test]
fn enclosing_scopes_2() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
fn second_inner_func() {
fn another_inner_func() {
}
}
inner_func(x)
}"#;
let (symbol_table, _) = add_symbols_from_source(source);
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "inner_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "second_inner_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "second_inner_func"; fn, "another_inner_func"; tr)).is_some());
}
#[test]
fn enclosing_scopes_3() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
fn second_inner_func() {
fn another_inner_func() {
}
fn another_inner_func() {
}
}
inner_func(x)
}"#;
let (_, output) = add_symbols_from_source(source);
assert!(output.unwrap_err().contains("Duplicate"))
}
#[test]
fn modules() {
let source = r#"
module stuff {
fn item() {
}
}
fn item()
"#;
values_in_table! { source =>
&fqsn!("item"; tr),
&fqsn!("stuff"; tr, "item"; tr)
};
}
#[test]
fn duplicate_modules() {
let source = r#"
module q {
fn foo() { 4 }
}
module a {
fn foo() { 334 }
}
module a {
fn foo() { 256.1 }
}
"#;
let (_, output) = add_symbols_from_source(source);
let output = output.unwrap_err();
assert!(output.contains("Duplicate module"));
assert!(output.contains("already defined at 5"));
}


@ -0,0 +1,344 @@
use itertools::Itertools;
use std::collections::HashMap;
use std::rc::Rc;
use std::iter::{Iterator, Peekable};
use std::fmt;
use crate::source_map::Location;
#[derive(Debug, PartialEq, Clone)]
pub enum TokenKind {
Newline, Semicolon,
LParen, RParen,
LSquareBracket, RSquareBracket,
LAngleBracket, RAngleBracket,
LCurlyBrace, RCurlyBrace,
Pipe, Backslash,
Comma, Period, Colon, Underscore,
Slash, Equals,
Operator(Rc<String>),
DigitGroup(Rc<String>), HexLiteral(Rc<String>), BinNumberSigil,
StrLiteral {
s: Rc<String>,
prefix: Option<Rc<String>>
},
Identifier(Rc<String>),
Keyword(Kw),
EOF,
Error(String),
}
use self::TokenKind::*;
impl fmt::Display for TokenKind {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match self {
&Operator(ref s) => write!(f, "Operator({})", **s),
&DigitGroup(ref s) => write!(f, "DigitGroup({})", s),
&HexLiteral(ref s) => write!(f, "HexLiteral({})", s),
&StrLiteral {ref s, .. } => write!(f, "StrLiteral({})", s),
&Identifier(ref s) => write!(f, "Identifier({})", s),
&Error(ref s) => write!(f, "Error({})", s),
other => write!(f, "{:?}", other),
}
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Kw {
If, Then, Else,
Is,
Func,
For, While,
Const, Let, In,
Mut,
Return,
Alias, Type, SelfType, SelfIdent,
Interface, Impl,
True, False,
Module, Import
}
lazy_static! {
static ref KEYWORDS: HashMap<&'static str, Kw> =
hashmap! {
"if" => Kw::If,
"then" => Kw::Then,
"else" => Kw::Else,
"is" => Kw::Is,
"fn" => Kw::Func,
"for" => Kw::For,
"while" => Kw::While,
"const" => Kw::Const,
"let" => Kw::Let,
"in" => Kw::In,
"mut" => Kw::Mut,
"return" => Kw::Return,
"alias" => Kw::Alias,
"type" => Kw::Type,
"Self" => Kw::SelfType,
"self" => Kw::SelfIdent,
"interface" => Kw::Interface,
"impl" => Kw::Impl,
"true" => Kw::True,
"false" => Kw::False,
"module" => Kw::Module,
"import" => Kw::Import,
};
}
#[derive(Debug, Clone, PartialEq)]
pub struct Token {
pub kind: TokenKind,
pub location: Location,
}
impl Token {
pub fn get_error(&self) -> Option<String> {
match self.kind {
TokenKind::Error(ref s) => Some(s.clone()),
_ => None,
}
}
pub fn to_string_with_metadata(&self) -> String {
format!("{}({})", self.kind, self.location)
}
pub fn get_kind(&self) -> TokenKind {
self.kind.clone()
}
}
const OPERATOR_CHARS: [char; 18] = ['!', '$', '%', '&', '*', '+', '-', '.', ':', '<', '>', '=', '?', '@', '^', '|', '~', '`'];
fn is_operator(c: &char) -> bool {
OPERATOR_CHARS.iter().any(|x| x == c)
}
type CharData = (usize, usize, char);
pub fn tokenize(input: &str) -> Vec<Token> {
let mut tokens: Vec<Token> = Vec::new();
let mut input = input.lines().enumerate()
.intersperse((0, "\n"))
.flat_map(|(line_idx, ref line)| {
line.chars().enumerate().map(move |(ch_idx, ch)| (line_idx, ch_idx, ch))
})
.peekable();
while let Some((line_num, char_num, c)) = input.next() {
let cur_tok_kind = match c {
'/' => match input.peek().map(|t| t.2) {
Some('/') => {
while let Some((_, _, c)) = input.next() {
if c == '\n' {
break;
}
}
continue;
},
Some('*') => {
input.next();
let mut comment_level = 1;
while let Some((_, _, c)) = input.next() {
if c == '*' && input.peek().map(|t| t.2) == Some('/') {
input.next();
comment_level -= 1;
} else if c == '/' && input.peek().map(|t| t.2) == Some('*') {
input.next();
comment_level += 1;
}
if comment_level == 0 {
break;
}
}
continue;
},
_ => Slash
},
c if c.is_whitespace() && c != '\n' => continue,
'\n' => Newline, ';' => Semicolon,
':' => Colon, ',' => Comma,
'(' => LParen, ')' => RParen,
'{' => LCurlyBrace, '}' => RCurlyBrace,
'[' => LSquareBracket, ']' => RSquareBracket,
'"' => handle_quote(&mut input, None),
'\\' => Backslash,
c if c.is_digit(10) => handle_digit(c, &mut input),
c if c.is_alphabetic() || c == '_' => handle_alphabetic(c, &mut input),
c if is_operator(&c) => handle_operator(c, &mut input),
unknown => Error(format!("Unexpected character: {}", unknown)),
};
let location = Location { line_num, char_num };
tokens.push(Token { kind: cur_tok_kind, location });
}
tokens
}
fn handle_digit(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
if c == '0' && input.peek().map_or(false, |&(_, _, c)| { c == 'x' }) {
input.next();
let rest: String = input.peeking_take_while(|&(_, _, ref c)| c.is_digit(16) || *c == '_').map(|(_, _, c)| { c }).collect();
HexLiteral(Rc::new(rest))
} else if c == '0' && input.peek().map_or(false, |&(_, _, c)| { c == 'b' }) {
input.next();
BinNumberSigil
} else {
let mut buf = c.to_string();
buf.extend(input.peeking_take_while(|&(_, _, ref c)| c.is_digit(10)).map(|(_, _, c)| { c }));
DigitGroup(Rc::new(buf))
}
}
fn handle_quote(input: &mut Peekable<impl Iterator<Item=CharData>>, quote_prefix: Option<&str>) -> TokenKind {
let mut buf = String::new();
loop {
match input.next().map(|(_, _, c)| { c }) {
Some('"') => break,
Some('\\') => {
let next = input.peek().map(|&(_, _, c)| { c });
if next == Some('n') {
input.next();
buf.push('\n')
} else if next == Some('"') {
input.next();
buf.push('"');
} else if next == Some('t') {
input.next();
buf.push('\t');
}
},
Some(c) => buf.push(c),
None => return TokenKind::Error("Unclosed string".to_string()),
}
}
TokenKind::StrLiteral { s: Rc::new(buf), prefix: quote_prefix.map(|s| Rc::new(s.to_string())) }
}
fn handle_alphabetic(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
let mut buf = String::new();
buf.push(c);
if c == '_' && input.peek().map(|&(_, _, c)| { !c.is_alphabetic() }).unwrap_or(true) {
return TokenKind::Underscore
}
loop {
match input.peek().map(|&(_, _, c)| { c }) {
Some(c) if c == '"' => {
input.next();
return handle_quote(input, Some(&buf));
},
Some(c) if c.is_alphanumeric() || c == '_' => {
input.next();
buf.push(c);
},
_ => break,
}
}
match KEYWORDS.get(buf.as_str()) {
Some(kw) => TokenKind::Keyword(*kw),
None => TokenKind::Identifier(Rc::new(buf)),
}
}
fn handle_operator(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
match c {
'<' | '>' | '|' | '.' | '=' => {
let ref next = input.peek().map(|&(_, _, c)| { c });
if !next.map(|n| { is_operator(&n) }).unwrap_or(false) {
return match c {
'<' => LAngleBracket,
'>' => RAngleBracket,
'|' => Pipe,
'.' => Period,
'=' => Equals,
_ => unreachable!(),
}
}
},
_ => (),
};
let mut buf = String::new();
if c == '`' {
loop {
match input.peek().map(|&(_, _, c)| { c }) {
Some(c) if c.is_alphabetic() || c == '_' => {
input.next();
buf.push(c);
},
Some('`') => {
input.next();
break;
},
_ => break
}
}
} else {
buf.push(c);
loop {
match input.peek().map(|&(_, _, c)| { c }) {
Some(c) if is_operator(&c) => {
input.next();
buf.push(c);
},
_ => break
}
}
}
TokenKind::Operator(Rc::new(buf))
}
#[cfg(test)]
mod schala_tokenizer_tests {
use super::*;
use super::Kw::*;
macro_rules! digit { ($ident:expr) => { DigitGroup(Rc::new($ident.to_string())) } }
macro_rules! ident { ($ident:expr) => { Identifier(Rc::new($ident.to_string())) } }
macro_rules! op { ($ident:expr) => { Operator(Rc::new($ident.to_string())) } }
#[test]
fn tokens() {
let a = tokenize("let a: A<B> = c ++ d");
let token_kinds: Vec<TokenKind> = a.into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![Keyword(Let), ident!("a"), Colon, ident!("A"),
LAngleBracket, ident!("B"), RAngleBracket, Equals, ident!("c"), op!("++"), ident!("d")]);
}
#[test]
fn underscores() {
let token_kinds: Vec<TokenKind> = tokenize("4_8").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("4"), Underscore, digit!("8")]);
let token_kinds2: Vec<TokenKind> = tokenize("aba_yo").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds2, vec![ident!("aba_yo")]);
}
#[test]
fn comments() {
let token_kinds: Vec<TokenKind> = tokenize("1 + /* hella /* bro */ */ 2").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("1"), op!("+"), digit!("2")]);
}
#[test]
fn backtick_operators() {
let token_kinds: Vec<TokenKind> = tokenize("1 `plus` 2").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("1"), op!("plus"), digit!("2")]);
}
#[test]
fn string_literals() {
let token_kinds: Vec<TokenKind> = tokenize(r#""some string""#).into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![StrLiteral { s: Rc::new("some string".to_string()), prefix: None }]);
let token_kinds: Vec<TokenKind> = tokenize(r#"b"some bytestring""#).into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![StrLiteral { s: Rc::new("some bytestring".to_string()), prefix: Some(Rc::new("b".to_string())) }]);
}
}


@ -0,0 +1,445 @@
use std::collections::HashMap;
use std::rc::Rc;
use parsing::{AST, Statement, Declaration, Signature, Expression, ExpressionType, Operation, Variant, TypeName, TypeSingletonName};
// from Niko's talk
/* fn type_check(expression, expected_ty) -> Ty {
let ty = bare_type_check(expression, expected_type);
if ty is incompatible with expected_ty {
try_coerce(expression, ty, expected_ty)
} else {
ty
}
}
fn bare_type_check(expression, expected_type) -> Ty { ... }
*/
/* H-M ALGO NOTES
from https://www.youtube.com/watch?v=il3gD7XMdmA
(also check out http://dev.stephendiehl.com/fun/006_hindley_milner.html)
typeInfer :: Expr a -> Matching (Type a)
unify :: Type a -> Type b -> Matching (Type c)
(Matching a) is a monad in which unification is done
ex:
typeInfer (If e1 e2 e3) = do
t1 <- typeInfer e1
t2 <- typeInfer e2
t3 <- typeInfer e3
_ <- unify t1 BoolType
unify t2 t3 -- b/c t2 and t3 have to be the same type
typeInfer (Const (ConstInt _)) = IntType -- same for other literals
--function application
typeInfer (Apply f x) = do
tf <- typeInfer f
tx <- typeInfer x
case tf of
FunctionType t1 t2 -> do
_ <- unify t1 tx
return t2
_ -> fail "Not a function"
--type annotation
typeInfer (Typed x t) = do
tx <- typeInfer x
unify tx t
--variable and let expressions - need to pass around a map of variable names to types here
typeInfer :: [ (Var, Type Var) ] -> Expr Var -> Matching (Type Var)
typeInfer ctx (Var x) = case (lookup x ctx) of
Just t -> return t
Nothing -> fail "Unknown variable"
--let x = e1 in e2
typeInfer ctx (Let x e1 e2) = do
t1 <- typeInfer ctx e1
typeInfer ((x, t1) :: ctx) e2
--lambdas are complicated (this represents ʎx.e)
typeInfer ctx (Lambda x e) = do
t1 <- allocExistentialVariable
t2 <- typeInfer ((x, t1) :: ctx) e
return $ FunctionType t1 t2 -- ie. t1 -> t2
--to solve the problem of map :: (a -> b) -> [a] -> [b]
when we use a variable whose type has universal tvars, convert those universal
tvars to existential ones
-and each distinct universal tvar needs to map to the same existential type
-so we change typeinfer:
typeInfer ctx (Var x) = do
case (lookup x ctx) of
Nothing -> ...
Just t -> do
let uvars = nub (toList t) -- nub removes duplicates, so this gets unique universally quantified variables
evars <- mapM (const allocExistentialVariable) uvars
let varMap = zip uvars evars
let fixVar varMap v = fromJust $ lookup v varMap
return (fmap (fixVar varMap) t)
--how do we define unify??
-recall, type signature is:
unify :: Type a -> Type b -> Matching (Type c)
unify BoolType BoolType = BoolType --easy, same for all constants
unify (FunctionType t1 t2) (FunctionType t3 t4) = do
t5 <- unify t1 t3
t6 <- unify t2 t4
return $ FunctionType t5 t6
unify (TVar a) (TVar b) = if a == b then TVar a else fail
--existential types can be assigned another type at most once
--some complicated stuff about handling existential types
--everything else is a type error
unify a b = fail
SKOLEMIZATION - how you prevent an unassigned existential type variable from leaking!
-before a type gets to global scope, replace all unassigned existential vars w/ new unique universal
type variables
*/
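A rough, self-contained Rust transliteration of the Haskell `unify` sketch in the notes above (constants unify with themselves, function types unify component-wise, equal type variables unify, everything else fails). The pared-down `Ty` enum here is a stand-in for the real `Type` defined below, and this is not the crate's actual unifier:

```rust
#[derive(Debug, Clone, PartialEq)]
enum Ty {
    Bool,
    Int,
    Var(String),
    Func(Box<Ty>, Box<Ty>),
}

// unify :: Type a -> Type b -> Matching (Type c), with Result as the "Matching" monad.
fn unify(a: &Ty, b: &Ty) -> Result<Ty, String> {
    use Ty::*;
    match (a, b) {
        (Bool, Bool) => Ok(Bool),
        (Int, Int) => Ok(Int),
        // Functions unify argument-with-argument and result-with-result.
        (Func(a1, a2), Func(b1, b2)) => {
            let t1 = unify(a1, b1)?;
            let t2 = unify(a2, b2)?;
            Ok(Func(Box::new(t1), Box::new(t2)))
        }
        (Var(x), Var(y)) if x == y => Ok(Var(x.clone())),
        // No existential-variable handling here; everything else is a type error.
        _ => Err(format!("Can't unify {:?} with {:?}", a, b)),
    }
}

fn main() {
    let f = Ty::Func(Box::new(Ty::Int), Box::new(Ty::Bool));
    assert_eq!(unify(&f, &f.clone()), Ok(f.clone()));
    assert!(unify(&Ty::Int, &Ty::Bool).is_err());
    println!("ok");
}
```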
#[derive(Debug, PartialEq, Clone)]
pub enum Type {
TVar(TypeVar),
TConst(TypeConst),
TFunc(Box<Type>, Box<Type>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum TypeVar {
Univ(Rc<String>),
Exist(u64),
}
impl TypeVar {
fn univ(label: &str) -> TypeVar {
TypeVar::Univ(Rc::new(label.to_string()))
}
}
#[derive(Debug, PartialEq, Clone)]
pub enum TypeConst {
UserT(Rc<String>),
Integer,
Float,
StringT,
Boolean,
Unit,
Bottom,
}
type TypeCheckResult = Result<Type, String>;
#[derive(Debug, PartialEq, Eq, Hash)]
struct PathSpecifier(Rc<String>);
#[derive(Debug, PartialEq, Clone)]
struct TypeContextEntry {
ty: Type,
constant: bool
}
pub struct TypeContext {
symbol_table: HashMap<PathSpecifier, TypeContextEntry>,
evar_table: HashMap<u64, Type>,
existential_type_label_count: u64
}
impl TypeContext {
pub fn new() -> TypeContext {
TypeContext {
symbol_table: HashMap::new(),
evar_table: HashMap::new(),
existential_type_label_count: 0,
}
}
pub fn add_symbols(&mut self, ast: &AST) {
use self::Declaration::*;
use self::Type::*;
use self::TypeConst::*;
for statement in ast.0.iter() {
match *statement {
Statement::ExpressionStatement(_) => (),
Statement::Declaration(ref decl) => match *decl {
FuncSig(_) => (),
Impl { .. } => (),
TypeDecl(ref type_constructor, ref body) => {
for variant in body.0.iter() {
let (spec, ty) = match variant {
&Variant::UnitStruct(ref data_constructor) => {
let spec = PathSpecifier(data_constructor.clone());
let ty = TConst(UserT(type_constructor.name.clone()));
(spec, ty)
},
&Variant::TupleStruct(ref data_constructor, ref args) => {
//TODO fix
let arg = args.get(0).unwrap();
let type_arg = self.from_anno(arg);
let spec = PathSpecifier(data_constructor.clone());
let ty = TFunc(Box::new(type_arg), Box::new(TConst(UserT(type_constructor.name.clone()))));
(spec, ty)
},
&Variant::Record(_, _) => unimplemented!(),
};
let entry = TypeContextEntry { ty, constant: true };
self.symbol_table.insert(spec, entry);
}
},
TypeAlias { .. } => (),
Binding {ref name, ref constant, ref expr} => {
let spec = PathSpecifier(name.clone());
let ty = expr.1.as_ref()
.map(|ty| self.from_anno(ty))
.unwrap_or_else(|| { self.alloc_existential_type() }); // this call to alloc_existential is OK b/c a binding only ever has one type, so if the annotation is absent, it's fine to just make one de novo
let entry = TypeContextEntry { ty, constant: *constant };
self.symbol_table.insert(spec, entry);
},
FuncDecl(ref signature, _) => {
let spec = PathSpecifier(signature.name.clone());
let ty = self.from_signature(signature);
let entry = TypeContextEntry { ty, constant: true };
self.symbol_table.insert(spec, entry);
},
}
}
}
}
fn lookup(&mut self, binding: &Rc<String>) -> Option<TypeContextEntry> {
let key = PathSpecifier(binding.clone());
self.symbol_table.get(&key).cloned()
}
pub fn debug_symbol_table(&self) -> String {
format!("Symbol table:\n {:?}\nEvar table:\n{:?}", self.symbol_table, self.evar_table)
}
fn alloc_existential_type(&mut self) -> Type {
let ret = Type::TVar(TypeVar::Exist(self.existential_type_label_count));
self.existential_type_label_count += 1;
ret
}
fn from_anno(&mut self, anno: &TypeName) -> Type {
use self::Type::*;
use self::TypeConst::*;
match anno {
&TypeName::Singleton(TypeSingletonName { ref name, .. }) => {
match name.as_ref().as_ref() {
"Int" => TConst(Integer),
"Float" => TConst(Float),
"Bool" => TConst(Boolean),
"String" => TConst(StringT),
s => TVar(TypeVar::Univ(Rc::new(format!("{}",s)))),
}
},
&TypeName::Tuple(ref items) => {
if items.len() == 1 {
TConst(Unit)
} else {
TConst(Bottom)
}
}
}
}
fn from_signature(&mut self, sig: &Signature) -> Type {
use self::Type::*;
use self::TypeConst::*;
//TODO this won't work properly until you make sure that all (universal) type vars in the function have the same existential type var
// actually this should never even put existential types into the symbol table at all
//this will panic if a function needs more fresh type variables than there are names below
let names = vec!["a", "b", "c", "d", "e", "f"];
let mut idx = 0;
let mut get_type = || { let q = TVar(TypeVar::Univ(Rc::new(format!("{}", names.get(idx).unwrap())))); idx += 1; q };
let return_type = sig.type_anno.as_ref().map(|anno| self.from_anno(&anno)).unwrap_or_else(|| { get_type() });
if sig.params.len() == 0 {
TFunc(Box::new(TConst(Unit)), Box::new(return_type))
} else {
let mut output_type = return_type;
for p in sig.params.iter() {
let p_type = p.1.as_ref().map(|anno| self.from_anno(anno)).unwrap_or_else(|| { get_type() });
output_type = TFunc(Box::new(p_type), Box::new(output_type));
}
output_type
}
}
pub fn type_check(&mut self, ast: &AST) -> TypeCheckResult {
use self::Type::*;
use self::TypeConst::*;
let mut last = TConst(Unit);
for statement in ast.0.iter() {
match statement {
&Statement::Declaration(ref _decl) => {
//return Err(format!("Declarations not supported"));
},
&Statement::ExpressionStatement(ref expr) => {
last = self.infer(expr)?;
}
}
}
Ok(last)
}
fn infer(&mut self, expr: &Expression) -> TypeCheckResult {
match (&expr.0, &expr.1) {
(exprtype, &Some(ref anno)) => {
let tx = self.infer_no_anno(exprtype)?;
let ty = self.from_anno(anno);
self.unify(tx, ty)
},
(exprtype, &None) => self.infer_no_anno(exprtype),
}
}
fn infer_no_anno(&mut self, ex: &ExpressionType) -> TypeCheckResult {
use self::ExpressionType::*;
use self::Type::*;
use self::TypeConst::*;
Ok(match ex {
&IntLiteral(_) => TConst(Integer),
&FloatLiteral(_) => TConst(Float),
&StringLiteral(_) => TConst(StringT),
&BoolLiteral(_) => TConst(Boolean),
&Value(ref name, _) => {
self.lookup(name)
.map(|entry| entry.ty)
.ok_or(format!("Couldn't find {}", name))?
},
&BinExp(ref op, ref lhs, ref rhs) => {
let t_lhs = self.infer(lhs)?;
match self.infer_op(op)? {
TFunc(t1, t2) => {
let _ = self.unify(t_lhs, *t1)?;
let t_rhs = self.infer(rhs)?;
let x = *t2;
match x {
TFunc(t3, t4) => {
let _ = self.unify(t_rhs, *t3)?;
*t4
},
_ => return Err(format!("Op {:?} is not a two-argument function type", op)),
}
},
_ => return Err(format!("Op {:?} is not a function type", op)),
}
},
&Call { ref f, ref arguments } => {
let tf = self.infer(f)?;
let targ = self.infer(arguments.get(0).unwrap())?;
match tf {
TFunc(box t1, box t2) => {
let _ = self.unify(t1, targ)?;
t2
},
_ => return Err(format!("Not a function!")),
}
},
_ => TConst(Bottom),
})
}
fn infer_op(&mut self, op: &Operation) -> TypeCheckResult {
use self::Type::*;
use self::TypeConst::*;
macro_rules! binoptype {
($lhs:expr, $rhs:expr, $out:expr) => { TFunc(Box::new($lhs), Box::new(TFunc(Box::new($rhs), Box::new($out)))) };
}
Ok(match (*op.0).as_ref() {
"+" => binoptype!(TConst(Integer), TConst(Integer), TConst(Integer)),
"++" => binoptype!(TConst(StringT), TConst(StringT), TConst(StringT)),
"-" => binoptype!(TConst(Integer), TConst(Integer), TConst(Integer)),
"*" => binoptype!(TConst(Integer), TConst(Integer), TConst(Integer)),
"/" => binoptype!(TConst(Integer), TConst(Integer), TConst(Integer)),
"%" => binoptype!(TConst(Integer), TConst(Integer), TConst(Integer)),
_ => TConst(Bottom)
})
}
fn unify(&mut self, t1: Type, t2: Type) -> TypeCheckResult {
use self::Type::*;
use self::TypeVar::*;
println!("Calling unify with `{:?}` and `{:?}`", t1, t2);
match (&t1, &t2) {
(&TConst(ref c1), &TConst(ref c2)) if c1 == c2 => Ok(TConst(c1.clone())),
(&TFunc(ref t1, ref t2), &TFunc(ref t3, ref t4)) => {
let t5 = self.unify((**t1).clone(), (**t3).clone())?;
let t6 = self.unify((**t2).clone(), (**t4).clone())?;
Ok(TFunc(Box::new(t5), Box::new(t6)))
},
(&TVar(Univ(ref a)), &TVar(Univ(ref b))) => {
if a == b {
Ok(TVar(Univ(a.clone())))
} else {
Err(format!("Couldn't unify universal types {} and {}", a, b))
}
},
//the interesting case!!
(&TVar(Exist(ref a)), ref t2) => {
let x = self.evar_table.get(a).cloned();
match x {
Some(t1) => self.unify(t1, (**t2).clone()),
None => {
self.evar_table.insert(*a, (**t2).clone());
Ok((**t2).clone())
}
}
},
(ref t1, &TVar(Exist(ref a))) => {
let x = self.evar_table.get(a).cloned();
match x {
Some(t2) => self.unify(t2, (**t1).clone()),
None => {
self.evar_table.insert(*a, (**t1).clone());
Ok((**t1).clone())
}
}
},
_ => Err(format!("Types {:?} and {:?} don't unify", t1, t2))
}
}
}
#[cfg(test)]
mod tests {
use super::{Type, TypeVar, TypeConst, TypeContext};
use super::Type::*;
use super::TypeConst::*;
use schala_lang::parsing::{parse, tokenize};
macro_rules! type_test {
($input:expr, $correct:expr) => {
{
let mut tc = TypeContext::new();
let ast = parse(tokenize($input)).0.unwrap();
tc.add_symbols(&ast);
assert_eq!($correct, tc.type_check(&ast).unwrap())
}
}
}
#[test]
fn basic_inference() {
type_test!("30", TConst(Integer));
type_test!("fn x(a: Int): Bool {}; x(1)", TConst(Boolean));
}
}
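The evar-table strategy above (record a solution the first time an existential variable meets a concrete type, then replay that solution on later encounters) can be sketched in isolation. `Ty` and `Ctx` below are simplified stand-ins for illustration, not the types from this file:

```rust
// Minimal sketch of unification with an existential-variable side table.
use std::collections::HashMap;

#[derive(Debug, Clone, PartialEq)]
enum Ty {
    Int,
    Bool,
    Exist(u64),
    Func(Box<Ty>, Box<Ty>),
}

struct Ctx {
    evar_table: HashMap<u64, Ty>,
}

impl Ctx {
    fn unify(&mut self, t1: Ty, t2: Ty) -> Result<Ty, String> {
        use Ty::*;
        match (t1, t2) {
            (Int, Int) => Ok(Int),
            (Bool, Bool) => Ok(Bool),
            (Func(a1, r1), Func(a2, r2)) => {
                let a = self.unify(*a1, *a2)?;
                let r = self.unify(*r1, *r2)?;
                Ok(Func(Box::new(a), Box::new(r)))
            }
            // The interesting case: an unsolved existential adopts the other
            // type; a solved one recurses on its recorded solution.
            (Exist(n), t) | (t, Exist(n)) => {
                let solved = self.evar_table.get(&n).cloned();
                match solved {
                    Some(prev) => self.unify(prev, t),
                    None => {
                        self.evar_table.insert(n, t.clone());
                        Ok(t)
                    }
                }
            }
            (a, b) => Err(format!("{:?} and {:?} don't unify", a, b)),
        }
    }
}

fn main() {
    let mut ctx = Ctx { evar_table: HashMap::new() };
    // ∃0 unifies with Int, and the solution is remembered...
    assert_eq!(ctx.unify(Ty::Exist(0), Ty::Int), Ok(Ty::Int));
    // ...so a later attempt to unify ∃0 with Bool must fail.
    assert!(ctx.unify(Ty::Exist(0), Ty::Bool).is_err());
    println!("ok");
}
```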


@ -0,0 +1,488 @@
use std::rc::Rc;
use std::fmt::Write;
use ena::unify::{UnifyKey, InPlaceUnificationTable, UnificationTable, EqUnifyValue};
use crate::ast::*;
use crate::util::ScopeStack;
use crate::util::deref_optional_box;
#[derive(Debug, Clone, PartialEq)]
pub struct TypeData {
ty: Option<Type>
}
impl TypeData {
#[allow(dead_code)]
pub fn new() -> TypeData {
TypeData { ty: None }
}
}
pub type TypeName = Rc<String>;
pub struct TypeContext<'a> {
variable_map: ScopeStack<'a, Rc<String>, Type>,
unification_table: InPlaceUnificationTable<TypeVar>,
}
/// `InferResult` is the monad in which type inference takes place.
type InferResult<T> = Result<T, TypeError>;
#[derive(Debug, Clone)]
pub struct TypeError { pub msg: String }
impl TypeError {
fn new<A, T>(msg: T) -> InferResult<A> where T: Into<String> {
Err(TypeError { msg: msg.into() })
}
}
#[allow(dead_code)] // avoids warning from Compound
#[derive(Debug, Clone, PartialEq)]
pub enum Type {
Const(TypeConst),
Var(TypeVar),
Arrow {
params: Vec<Type>,
ret: Box<Type>
},
Compound {
ty_name: String,
args: Vec<Type>
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct TypeVar(usize);
impl UnifyKey for TypeVar {
type Value = Option<TypeConst>;
fn index(&self) -> u32 { self.0 as u32 }
fn from_index(u: u32) -> TypeVar { TypeVar(u as usize) }
fn tag() -> &'static str { "TypeVar" }
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum TypeConst {
Unit,
Nat,
Int,
Float,
StringT,
Bool,
Ordering,
//UserDefined
}
impl TypeConst {
pub fn to_string(&self) -> String {
use self::TypeConst::*;
match self {
Unit => format!("()"),
Nat => format!("Nat"),
Int => format!("Int"),
Float => format!("Float"),
StringT => format!("String"),
Bool => format!("Bool"),
Ordering => format!("Ordering"),
}
}
}
impl EqUnifyValue for TypeConst { }
macro_rules! ty {
($type_name:ident) => { Type::Const(TypeConst::$type_name) };
($t1:ident -> $t2:ident) => { Type::Arrow { params: vec![ty!($t1)], ret: box ty!($t2) } };
($t1:ident -> $t2:ident -> $t3:ident) => { Type::Arrow { params: vec![ty!($t1), ty!($t2)], ret: box ty!($t3) } };
($type_list:ident, $ret_type:ident) => {
Type::Arrow {
params: $type_list,
ret: box $ret_type,
}
}
}
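The `ty!` macro above relies on the unstable `box` syntax (hence the nightly features this crate enables). A stable-Rust sketch of the same macro, with `Type`/`TypeConst` trimmed to just the variants the macro touches, shows what it expands to:

```rust
// Stable sketch of the `ty!` macro, using `Box::new` instead of `box`.
#[derive(Debug, Clone, PartialEq)]
enum Type {
    Const(TypeConst),
    Arrow { params: Vec<Type>, ret: Box<Type> },
}

#[derive(Debug, Clone, PartialEq)]
enum TypeConst { Unit, Nat, Bool }

macro_rules! ty {
    ($type_name:ident) => { Type::Const(TypeConst::$type_name) };
    ($t1:ident -> $t2:ident) => {
        Type::Arrow { params: vec![ty!($t1)], ret: Box::new(ty!($t2)) }
    };
    ($t1:ident -> $t2:ident -> $t3:ident) => {
        Type::Arrow { params: vec![ty!($t1), ty!($t2)], ret: Box::new(ty!($t3)) }
    };
}

fn main() {
    // A bare identifier becomes a type constant.
    assert_eq!(ty!(Nat), Type::Const(TypeConst::Nat));
    // `Nat -> Bool` becomes a one-parameter arrow type.
    assert_eq!(
        ty!(Nat -> Bool),
        Type::Arrow {
            params: vec![Type::Const(TypeConst::Nat)],
            ret: Box::new(Type::Const(TypeConst::Bool)),
        }
    );
    println!("ok");
}
```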
//TODO find a better way to capture the to/from string logic
impl Type {
pub fn to_string(&self) -> String {
use self::Type::*;
match self {
Const(c) => c.to_string(),
Var(v) => format!("t_{}", v.0),
Arrow { params, box ref ret } => {
if params.len() == 0 {
format!("-> {}", ret.to_string())
} else {
let mut buf = String::new();
for p in params.iter() {
write!(buf, "{} -> ", p.to_string()).unwrap();
}
write!(buf, "{}", ret.to_string()).unwrap();
buf
}
},
Compound { .. } => format!("<some compound type>")
}
}
fn from_string(string: &str) -> Option<Type> {
Some(match string {
"()" | "Unit" => ty!(Unit),
"Nat" => ty!(Nat),
"Int" => ty!(Int),
"Float" => ty!(Float),
"String" => ty!(StringT),
"Bool" => ty!(Bool),
"Ordering" => ty!(Ordering),
_ => return None
})
}
}
/*
/// `Type` is parameterized by whether the type variables can be just universal, or universal or
/// existential.
#[derive(Debug, Clone)]
enum Type<A> {
Var(A),
Const(TConst),
Arrow(Box<Type<A>>, Box<Type<A>>),
}
#[derive(Debug, Clone)]
enum TVar {
Univ(UVar),
Exist(ExistentialVar)
}
#[derive(Debug, Clone)]
struct UVar(Rc<String>);
#[derive(Debug, Clone)]
struct ExistentialVar(u32);
impl Type<UVar> {
fn to_tvar(&self) -> Type<TVar> {
match self {
Type::Var(UVar(name)) => Type::Var(TVar::Univ(UVar(name.clone()))),
Type::Const(ref c) => Type::Const(c.clone()),
Type::Arrow(a, b) => Type::Arrow(
Box::new(a.to_tvar()),
Box::new(b.to_tvar())
)
}
}
}
impl Type<TVar> {
fn skolemize(&self) -> Type<UVar> {
match self {
Type::Var(TVar::Univ(uvar)) => Type::Var(uvar.clone()),
Type::Var(TVar::Exist(_)) => Type::Var(UVar(Rc::new(format!("sk")))),
Type::Const(ref c) => Type::Const(c.clone()),
Type::Arrow(a, b) => Type::Arrow(
Box::new(a.skolemize()),
Box::new(b.skolemize())
)
}
}
}
impl TypeIdentifier {
fn to_monotype(&self) -> Type<UVar> {
match self {
TypeIdentifier::Tuple(_) => Type::Const(TConst::Nat),
TypeIdentifier::Singleton(TypeSingletonName { name, .. }) => {
match &name[..] {
"Nat" => Type::Const(TConst::Nat),
"Int" => Type::Const(TConst::Int),
"Float" => Type::Const(TConst::Float),
"Bool" => Type::Const(TConst::Bool),
"String" => Type::Const(TConst::StringT),
_ => Type::Const(TConst::Nat),
}
}
}
}
}
#[derive(Debug, Clone)]
enum TConst {
User(Rc<String>),
Unit,
Nat,
Int,
Float,
StringT,
Bool,
}
impl TConst {
fn user(name: &str) -> TConst {
TConst::User(Rc::new(name.to_string()))
}
}
*/
impl<'a> TypeContext<'a> {
pub fn new() -> TypeContext<'a> {
TypeContext {
variable_map: ScopeStack::new(None),
unification_table: UnificationTable::new(),
}
}
/*
fn new_env(&'a self, new_var: Rc<String>, ty: Type) -> TypeContext<'a> {
let mut new_context = TypeContext {
variable_map: self.variable_map.new_scope(None),
unification_table: UnificationTable::new(), //???? not sure if i want this
};
new_context.variable_map.insert(new_var, ty);
new_context
}
*/
fn get_type_from_name(&self, name: &TypeIdentifier) -> InferResult<Type> {
use self::TypeIdentifier::*;
Ok(match name {
Singleton(TypeSingletonName { name,.. }) => {
match Type::from_string(&name) {
Some(ty) => ty,
None => return TypeError::new(format!("Unknown type name: {}", name))
}
},
Tuple(_) => return TypeError::new("tuples aren't ready yet"),
})
}
/// `typecheck` is the entry point into the type-inference system, accepting an AST as an argument.
/// Following the example of GHC, the compiler deliberately typechecks before de-sugaring
/// the AST to ReducedAST.
pub fn typecheck(&mut self, ast: &AST) -> Result<Type, TypeError> {
let mut returned_type = Type::Const(TypeConst::Unit);
for statement in ast.statements.iter() {
returned_type = self.statement(statement)?;
}
Ok(returned_type)
}
fn statement(&mut self, statement: &Statement) -> InferResult<Type> {
match &statement.kind {
StatementKind::Expression(e) => self.expr(e),
StatementKind::Declaration(decl) => self.decl(&decl),
StatementKind::Import(_) => Ok(ty!(Unit)),
StatementKind::Module(_) => Ok(ty!(Unit)),
}
}
fn decl(&mut self, decl: &Declaration) -> InferResult<Type> {
use self::Declaration::*;
match decl {
Binding { name, expr, .. } => {
let ty = self.expr(expr)?;
self.variable_map.insert(name.clone(), ty);
},
_ => (),
}
Ok(ty!(Unit))
}
fn invoc(&mut self, invoc: &InvocationArgument) -> InferResult<Type> {
use InvocationArgument::*;
match invoc {
Positional(expr) => self.expr(expr),
_ => Ok(ty!(Nat)) //TODO this is wrong
}
}
fn expr(&mut self, expr: &Expression) -> InferResult<Type> {
match expr {
Expression { kind, type_anno: Some(anno), .. } => {
let t1 = self.expr_type(kind)?;
let t2 = self.get_type_from_name(anno)?;
self.unify(t2, t1)
},
Expression { kind, type_anno: None, .. } => self.expr_type(kind)
}
}
fn expr_type(&mut self, expr: &ExpressionKind) -> InferResult<Type> {
use self::ExpressionKind::*;
Ok(match expr {
NatLiteral(_) => ty!(Nat),
BoolLiteral(_) => ty!(Bool),
FloatLiteral(_) => ty!(Float),
StringLiteral(_) => ty!(StringT),
PrefixExp(op, expr) => self.prefix(op, expr)?,
BinExp(op, lhs, rhs) => self.binexp(op, lhs, rhs)?,
IfExpression { discriminator, body } => self.if_expr(deref_optional_box(discriminator), &**body)?,
Value(val) => self.handle_value(val)?,
Call { box ref f, arguments } => self.call(f, arguments)?,
Lambda { params, type_anno, body } => self.lambda(params, type_anno, body)?,
_ => ty!(Unit),
})
}
fn prefix(&mut self, op: &PrefixOp, expr: &Expression) -> InferResult<Type> {
let tf = match op.builtin.map(|b| b.get_type()) {
Some(ty) => ty,
None => return TypeError::new("no type found")
};
let tx = self.expr(expr)?;
self.handle_apply(tf, vec![tx])
}
fn binexp(&mut self, op: &BinOp, lhs: &Expression, rhs: &Expression) -> InferResult<Type> {
let tf = match op.builtin.map(|b| b.get_type()) {
Some(ty) => ty,
None => return TypeError::new("no type found"),
};
let t_lhs = self.expr(lhs)?;
let t_rhs = self.expr(rhs)?; //TODO is this order a problem? not sure
self.handle_apply(tf, vec![t_lhs, t_rhs])
}
fn if_expr(&mut self, discriminator: Option<&Expression>, body: &IfExpressionBody) -> InferResult<Type> {
use self::IfExpressionBody::*;
match (discriminator, body) {
(Some(expr), SimpleConditional{ then_case, else_case }) => self.handle_simple_if(expr, then_case, else_case),
_ => TypeError::new(format!("Complex conditionals not supported"))
}
}
fn handle_simple_if(&mut self, expr: &Expression, then_clause: &Block, else_clause: &Option<Block>) -> InferResult<Type> {
let t1 = self.expr(expr)?;
let t2 = self.block(then_clause)?;
let t3 = match else_clause {
Some(block) => self.block(block)?,
None => ty!(Unit)
};
let _ = self.unify(ty!(Bool), t1)?;
self.unify(t2, t3)
}
fn lambda(&mut self, params: &Vec<FormalParam>, type_anno: &Option<TypeIdentifier>, _body: &Block) -> InferResult<Type> {
let argument_types: InferResult<Vec<Type>> = params.iter().map(|param: &FormalParam| {
if let FormalParam { anno: Some(type_identifier), .. } = param {
self.get_type_from_name(type_identifier)
} else {
Ok(Type::Var(self.fresh_type_variable()))
}
}).collect();
let argument_types = argument_types?;
let ret_type = match type_anno.as_ref() {
Some(anno) => self.get_type_from_name(anno)?,
None => Type::Var(self.fresh_type_variable())
};
Ok(ty!(argument_types, ret_type))
}
fn call(&mut self, f: &Expression, args: &Vec<InvocationArgument>) -> InferResult<Type> {
let tf = self.expr(f)?;
let arg_types: InferResult<Vec<Type>> = args.iter().map(|ex| self.invoc(ex)).collect();
let arg_types = arg_types?;
self.handle_apply(tf, arg_types)
}
fn handle_apply(&mut self, tf: Type, args: Vec<Type>) -> InferResult<Type> {
Ok(match tf {
Type::Arrow { ref params, ret: box ref t_ret } if params.len() == args.len() => {
for (t_param, t_arg) in params.iter().zip(args.iter()) {
let _ = self.unify(t_param.clone(), t_arg.clone())?; //TODO I think this needs to reference a sub-scope
}
t_ret.clone()
},
Type::Arrow { .. } => return TypeError::new("Wrong length"),
_ => return TypeError::new(format!("Not a function"))
})
}
fn block(&mut self, block: &Block) -> InferResult<Type> {
let mut output = ty!(Unit);
for statement in block.iter() {
output = self.statement(statement)?;
}
Ok(output)
}
fn handle_value(&mut self, val: &QualifiedName) -> InferResult<Type> {
let QualifiedName { components: vec, .. } = val;
let var = &vec[0];
match self.variable_map.lookup(var) {
Some(ty) => Ok(ty.clone()),
None => TypeError::new(format!("Couldn't find variable: {}", &var)),
}
}
fn unify(&mut self, t1: Type, t2: Type) -> InferResult<Type> {
use self::Type::*;
match (t1, t2) {
(Const(ref c1), Const(ref c2)) if c1 == c2 => Ok(Const(c1.clone())), //choice of c1 is arbitrary I *think*
(a @ Var(_), b @ Const(_)) => self.unify(b, a),
(Const(ref c1), Var(ref v2)) => {
self.unification_table.unify_var_value(v2.clone(), Some(c1.clone()))
.or_else(|_| TypeError::new(format!("Couldn't unify {:?} and {:?}", Const(c1.clone()), Var(*v2))))?;
Ok(Const(c1.clone()))
},
(Var(v1), Var(v2)) => {
//TODO add occurs check
self.unification_table.unify_var_var(v1.clone(), v2.clone())
.or_else(|e| {
println!("Unify error: {:?}", e);
TypeError::new(format!("Two type variables {:?} and {:?} couldn't unify", v1, v2))
})?;
Ok(Var(v1.clone())) //arbitrary decision I think
},
(a, b) => TypeError::new(format!("{:?} and {:?} do not unify", a, b)),
}
}
fn fresh_type_variable(&mut self) -> TypeVar {
self.unification_table.new_key(None)
}
}
#[cfg(test)]
mod typechecking_tests {
use super::*;
macro_rules! assert_type_in_fresh_context {
($string:expr, $type:expr) => {
let mut tc = TypeContext::new();
let (ref ast, _) = crate::util::quick_ast($string);
let ty = tc.typecheck(ast).unwrap();
assert_eq!(ty, $type)
}
}
#[test]
fn basic_test() {
assert_type_in_fresh_context!("1", ty!(Nat));
assert_type_in_fresh_context!(r#""drugs""#, ty!(StringT));
assert_type_in_fresh_context!("true", ty!(Bool));
}
#[test]
fn operators() {
//TODO fix these with new operator regime
/*
assert_type_in_fresh_context!("-1", ty!(Int));
assert_type_in_fresh_context!("1 + 2", ty!(Nat));
assert_type_in_fresh_context!("-2", ty!(Int));
assert_type_in_fresh_context!("!true", ty!(Bool));
*/
}
}


@ -0,0 +1,68 @@
use std::collections::HashMap;
use std::hash::Hash;
use std::cmp::Eq;
use std::ops::Deref;
pub fn deref_optional_box<T>(x: &Option<Box<T>>) -> Option<&T> {
x.as_ref().map(|b: &Box<T>| Deref::deref(b))
}
#[derive(Default, Debug)]
pub struct ScopeStack<'a, T: 'a, V: 'a> where T: Hash + Eq {
parent: Option<&'a ScopeStack<'a, T, V>>,
values: HashMap<T, V>,
scope_name: Option<String>
}
impl<'a, T, V> ScopeStack<'a, T, V> where T: Hash + Eq {
pub fn new(name: Option<String>) -> ScopeStack<'a, T, V> where T: Hash + Eq {
ScopeStack {
parent: None,
values: HashMap::new(),
scope_name: name
}
}
pub fn insert(&mut self, key: T, value: V) where T: Hash + Eq {
self.values.insert(key, value);
}
pub fn lookup(&self, key: &T) -> Option<&V> where T: Hash + Eq {
match (self.values.get(key), self.parent) {
(None, None) => None,
(None, Some(parent)) => parent.lookup(key),
(Some(value), _) => Some(value),
}
}
pub fn new_scope(&'a self, name: Option<String>) -> ScopeStack<'a, T, V> where T: Hash + Eq {
ScopeStack {
parent: Some(self),
values: HashMap::default(),
scope_name: name,
}
}
#[allow(dead_code)]
pub fn get_name(&self) -> Option<&String> {
self.scope_name.as_ref()
}
}
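The shadowing behavior `ScopeStack` provides can be demonstrated with a trimmed copy of the type: each scope owns a hash map, and lookups fall through to the parent without any mutation of outer scopes.

```rust
// Trimmed copy of ScopeStack, showing shadowing and fall-through lookup.
use std::collections::HashMap;
use std::hash::Hash;

struct ScopeStack<'a, K: Hash + Eq, V> {
    parent: Option<&'a ScopeStack<'a, K, V>>,
    values: HashMap<K, V>,
}

impl<'a, K: Hash + Eq, V> ScopeStack<'a, K, V> {
    fn new() -> Self {
        ScopeStack { parent: None, values: HashMap::new() }
    }
    fn insert(&mut self, k: K, v: V) {
        self.values.insert(k, v);
    }
    // Check this scope first, then recurse into the parent chain.
    fn lookup(&self, k: &K) -> Option<&V> {
        self.values.get(k).or_else(|| self.parent.and_then(|p| p.lookup(k)))
    }
    fn new_scope(&'a self) -> Self {
        ScopeStack { parent: Some(self), values: HashMap::new() }
    }
}

fn main() {
    let mut global = ScopeStack::new();
    global.insert("x", 1);
    {
        let mut inner = global.new_scope();
        inner.insert("x", 2); // shadows the outer `x`
        assert_eq!(inner.lookup(&"x"), Some(&2));
    }
    // The outer binding is untouched once the inner scope is gone.
    assert_eq!(global.lookup(&"x"), Some(&1));
    println!("ok");
}
```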
/// this is intended for use in tests, and does no error-handling whatsoever
#[allow(dead_code)]
pub fn quick_ast(input: &str) -> (crate::ast::AST, crate::source_map::SourceMap) {
use std::cell::RefCell;
use std::rc::Rc;
let source_map = crate::source_map::SourceMap::new();
let source_map_handle = Rc::new(RefCell::new(source_map));
let tokens = crate::tokenizing::tokenize(input);
let mut parser = crate::parsing::Parser::new(source_map_handle.clone());
parser.add_new_tokens(tokens);
let output = parser.parse();
std::mem::drop(parser);
(output.unwrap(), Rc::try_unwrap(source_map_handle).map_err(|_| ()).unwrap().into_inner())
}
#[allow(unused_macros)]
macro_rules! rc {
($string:tt) => { Rc::new(stringify!($string).to_string()) }
}

schala-repl/Cargo.toml Normal file

@ -0,0 +1,24 @@
[package]
name = "schala-repl"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[dependencies]
llvm-sys = "70.0.2"
take_mut = "0.2.2"
itertools = "0.5.8"
getopts = "0.2.18"
lazy_static = "0.2.8"
maplit = "*"
colored = "1.8"
serde = "1.0.91"
serde_derive = "1.0.91"
serde_json = "1.0.15"
phf = "0.7.12"
includedir = "0.2.0"
linefeed = "0.6.0"
regex = "0.2"
[build-dependencies]
includedir_codegen = "0.2.0"

schala-repl/build.rs Normal file

@ -0,0 +1,10 @@
extern crate includedir_codegen;
use includedir_codegen::Compression;
fn main() {
includedir_codegen::start("WEBFILES")
.dir("../static", Compression::Gzip)
.build("static.rs")
.unwrap();
}


@ -0,0 +1,80 @@
use std::time;
use std::collections::HashSet;
pub trait ProgrammingLanguageInterface {
fn get_language_name(&self) -> String;
fn get_source_file_suffix(&self) -> String;
fn run_computation(&mut self, _request: ComputationRequest) -> ComputationResponse {
ComputationResponse {
main_output: Err(format!("Computation pipeline not implemented")),
global_output_stats: GlobalOutputStats::default(),
debug_responses: vec![],
}
}
fn request_meta(&mut self, _request: LangMetaRequest) -> LangMetaResponse {
LangMetaResponse::Custom { kind: format!("not-implemented"), value: format!("") }
}
}
pub struct ComputationRequest<'a> {
pub source: &'a str,
pub debug_requests: HashSet<DebugAsk>,
}
pub struct ComputationResponse {
pub main_output: Result<String, String>,
pub global_output_stats: GlobalOutputStats,
pub debug_responses: Vec<DebugResponse>,
}
#[derive(Default, Debug)]
pub struct GlobalOutputStats {
pub total_duration: time::Duration,
pub stage_durations: Vec<(String, time::Duration)>
}
#[derive(Debug, Clone, Hash, Eq, PartialEq, Deserialize, Serialize)]
pub enum DebugAsk {
Timing,
ByStage { stage_name: String, token: Option<String> },
}
impl DebugAsk {
pub fn is_for_stage(&self, name: &str) -> bool {
match self {
DebugAsk::ByStage { stage_name, .. } if stage_name == name => true,
_ => false
}
}
}
pub struct DebugResponse {
pub ask: DebugAsk,
pub value: String
}
pub enum LangMetaRequest {
StageNames,
Docs {
source: String,
},
Custom {
kind: String,
value: String
},
ImmediateDebug(DebugAsk),
}
pub enum LangMetaResponse {
StageNames(Vec<String>),
Docs {
doc_string: String,
},
Custom {
kind: String,
value: String
},
ImmediateDebug(DebugResponse),
}

schala-repl/src/lib.rs Normal file

@ -0,0 +1,92 @@
#![feature(link_args)]
#![feature(slice_patterns, box_patterns, box_syntax, proc_macro_hygiene, decl_macro)]
#![feature(plugin)]
extern crate getopts;
extern crate linefeed;
extern crate itertools;
extern crate colored;
#[macro_use]
extern crate serde_derive;
extern crate serde_json;
extern crate includedir;
extern crate phf;
use std::collections::HashSet;
use std::path::Path;
use std::fs::File;
use std::io::Read;
use std::process::exit;
mod repl;
mod language;
pub use language::{ProgrammingLanguageInterface,
ComputationRequest, ComputationResponse,
LangMetaRequest, LangMetaResponse,
DebugResponse, DebugAsk, GlobalOutputStats};
include!(concat!(env!("OUT_DIR"), "/static.rs"));
const VERSION_STRING: &'static str = "0.1.0";
pub fn start_repl(langs: Vec<Box<dyn ProgrammingLanguageInterface>>) {
let options = command_line_options().parse(std::env::args()).unwrap_or_else(|e| {
println!("{:?}", e);
exit(1);
});
if options.opt_present("help") {
println!("{}", command_line_options().usage("Schala metainterpreter"));
exit(0);
}
match options.free[..] {
[] | [_] => {
let mut repl = repl::Repl::new(langs);
repl.run_repl();
}
[_, ref filename, ..] => {
run_noninteractive(filename, langs);
}
};
}
fn run_noninteractive(filename: &str, languages: Vec<Box<dyn ProgrammingLanguageInterface>>) {
let path = Path::new(filename);
let ext = path.extension().and_then(|e| e.to_str()).unwrap_or_else(|| {
println!("Source file lacks extension");
exit(1);
});
let mut language = Box::new(languages.into_iter().find(|lang| lang.get_source_file_suffix() == ext)
.unwrap_or_else(|| {
println!("Extension .{} not recognized", ext);
exit(1);
}));
let mut source_file = File::open(path).unwrap();
let mut buffer = String::new();
source_file.read_to_string(&mut buffer).unwrap();
let request = ComputationRequest {
source: &buffer,
debug_requests: HashSet::new(),
};
let response = language.run_computation(request);
match response.main_output {
Ok(s) => println!("{}", s),
Err(s) => println!("{}", s)
};
}
fn command_line_options() -> getopts::Options {
let mut options = getopts::Options::new();
options.optflag("h",
"help",
"Show help text");
options.optflag("w",
"webapp",
"Start up web interpreter");
options
}


@ -0,0 +1,99 @@
use super::{Repl, InterpreterDirectiveOutput};
use crate::repl::directive_actions::DirectiveAction;
use colored::*;
/// A `CommandTree` is either a `Terminal` or a `NonTerminal`. When command parsing reaches the first
/// `Terminal`, it uses the `DirectiveAction` found there to select the function to execute,
/// and executes it with any remaining arguments.
#[derive(Clone)]
pub enum CommandTree {
Terminal {
name: String,
children: Vec<CommandTree>,
help_msg: Option<String>,
action: DirectiveAction,
},
NonTerminal {
name: String,
children: Vec<CommandTree>,
help_msg: Option<String>,
action: DirectiveAction,
},
Top(Vec<CommandTree>),
}
impl CommandTree {
pub fn nonterm_no_further_tab_completions(s: &str, help: Option<&str>) -> CommandTree {
CommandTree::NonTerminal {name: s.to_string(), help_msg: help.map(|x| x.to_string()), children: vec![], action: DirectiveAction::Null }
}
pub fn terminal(s: &str, help: Option<&str>, children: Vec<CommandTree>, action: DirectiveAction) -> CommandTree {
CommandTree::Terminal {name: s.to_string(), help_msg: help.map(|x| x.to_string()), children, action}
}
pub fn nonterm(s: &str, help: Option<&str>, children: Vec<CommandTree>) -> CommandTree {
CommandTree::NonTerminal {
name: s.to_string(),
help_msg: help.map(|x| x.to_string()),
children,
action: DirectiveAction::Null
}
}
pub fn get_cmd(&self) -> &str {
match self {
CommandTree::Terminal { name, .. } => name.as_str(),
CommandTree::NonTerminal {name, ..} => name.as_str(),
CommandTree::Top(_) => "",
}
}
pub fn get_help(&self) -> &str {
match self {
CommandTree::Terminal { help_msg, ..} => help_msg.as_ref().map(|s| s.as_str()).unwrap_or("<no help text provided>"),
CommandTree::NonTerminal { help_msg, .. } => help_msg.as_ref().map(|s| s.as_str()).unwrap_or("<no help text provided>"),
CommandTree::Top(_) => ""
}
}
pub fn get_children(&self) -> &Vec<CommandTree> {
use CommandTree::*;
match self {
Terminal { children, .. } |
NonTerminal { children, .. } |
Top(children) => children
}
}
pub fn get_subcommands(&self) -> Vec<&str> {
self.get_children().iter().map(|x| x.get_cmd()).collect()
}
pub fn perform(&self, repl: &mut Repl, arguments: &Vec<&str>) -> InterpreterDirectiveOutput {
let mut dir_pointer: &CommandTree = self;
let mut idx = 0;
let res: Result<(DirectiveAction, usize), String> = loop {
match dir_pointer {
CommandTree::Top(subcommands) | CommandTree::NonTerminal { children: subcommands, .. } => {
let next_command = match arguments.get(idx) {
Some(cmd) => cmd,
None => break Err(format!("Command requires arguments"))
};
idx += 1;
match subcommands.iter().find(|sc| sc.get_cmd() == *next_command) {
Some(command_tree) => {
dir_pointer = command_tree;
},
None => break Err(format!("Command {} not found", next_command))
};
},
CommandTree::Terminal { action, .. } => {
break Ok((action.clone(), idx));
},
}
};
match res {
Ok((action, idx)) => action.perform(repl, &arguments[idx..]),
Err(err) => Some(err.red().to_string())
}
}
}
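The dispatch loop in `perform` can be sketched in miniature: arguments are consumed one at a time, descending through `NonTerminal` children by name until a `Terminal` is reached, whose action is then applied to whatever arguments remain. `Cmd` and the string actions below are simplified stand-ins for `CommandTree` and `DirectiveAction`:

```rust
// Minimal sketch of CommandTree-style command dispatch.
enum Cmd {
    Terminal { name: &'static str, action: &'static str },
    NonTerminal { name: &'static str, children: Vec<Cmd> },
}

impl Cmd {
    fn name(&self) -> &str {
        match self {
            Cmd::Terminal { name, .. } | Cmd::NonTerminal { name, .. } => name,
        }
    }
    // Walk the tree; return the action plus any unconsumed arguments.
    fn dispatch<'a>(&'a self, args: &'a [&'a str]) -> Result<(&'a str, &'a [&'a str]), String> {
        match self {
            Cmd::Terminal { action, .. } => Ok((action, args)),
            Cmd::NonTerminal { children, .. } => {
                let next = args.get(0).ok_or_else(|| "Command requires arguments".to_string())?;
                let child = children
                    .iter()
                    .find(|c| c.name() == *next)
                    .ok_or_else(|| format!("Command {} not found", next))?;
                child.dispatch(&args[1..])
            }
        }
    }
}

fn main() {
    let tree = Cmd::NonTerminal {
        name: "debug",
        children: vec![
            Cmd::Terminal { name: "show", action: "show-debug" },
            Cmd::Terminal { name: "hide", action: "hide-debug" },
        ],
    };
    let args = ["show", "parsing"];
    let (action, rest) = tree.dispatch(&args).unwrap();
    assert_eq!(action, "show-debug");
    assert_eq!(rest, &["parsing"]); // leftover args go to the action
    assert!(tree.dispatch(&["frobnicate"]).is_err());
    println!("ok");
}
```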


@ -0,0 +1,133 @@
use super::{Repl, InterpreterDirectiveOutput};
use crate::repl::help::help;
use crate::language::{LangMetaRequest, LangMetaResponse, DebugAsk, DebugResponse};
use itertools::Itertools;
use std::fmt::Write as FmtWrite;
#[derive(Debug, Clone)]
pub enum DirectiveAction {
Null,
Help,
QuitProgram,
ListPasses,
ShowImmediate,
Show,
Hide,
TotalTimeOff,
TotalTimeOn,
StageTimeOff,
StageTimeOn,
Doc,
}
impl DirectiveAction {
pub fn perform(&self, repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
use DirectiveAction::*;
match self {
Null => None,
Help => help(repl, arguments),
QuitProgram => {
repl.save_before_exit();
::std::process::exit(0)
},
ListPasses => {
let language_state = repl.get_cur_language_state();
let pass_names = match language_state.request_meta(LangMetaRequest::StageNames) {
LangMetaResponse::StageNames(names) => names,
_ => vec![],
};
let mut buf = String::new();
for pass in pass_names.iter().map(|name| Some(name)).intersperse(None) {
match pass {
Some(pass) => write!(buf, "{}", pass).unwrap(),
None => write!(buf, " -> ").unwrap(),
}
}
Some(buf)
},
ShowImmediate => {
let cur_state = repl.get_cur_language_state();
let stage_name = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a thing to debug")),
};
let meta = LangMetaRequest::ImmediateDebug(DebugAsk::ByStage { stage_name: stage_name.clone(), token: None });
let meta_response = cur_state.request_meta(meta);
let response = match meta_response {
LangMetaResponse::ImmediateDebug(DebugResponse { ask, value }) => match ask {
DebugAsk::ByStage { stage_name: ref this_stage_name, ..} if *this_stage_name == stage_name => value,
_ => return Some(format!("Wrong debug stage"))
},
_ => return Some(format!("Invalid language meta response")),
};
Some(response)
},
Show => {
let this_stage_name = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a stage to show")),
};
let token = arguments.get(1).map(|s| s.to_string());
repl.options.debug_asks.retain(|ask| match ask {
DebugAsk::ByStage { stage_name, .. } if *stage_name == this_stage_name => false,
_ => true
});
let ask = DebugAsk::ByStage { stage_name: this_stage_name, token };
repl.options.debug_asks.insert(ask);
None
},
Hide => {
let stage_name_to_remove = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a stage to hide")),
};
repl.options.debug_asks.retain(|ask| match ask {
DebugAsk::ByStage { stage_name, .. } if *stage_name == stage_name_to_remove => false,
_ => true
});
None
},
TotalTimeOff => total_time_off(repl, arguments),
TotalTimeOn => total_time_on(repl, arguments),
StageTimeOff => stage_time_off(repl, arguments),
StageTimeOn => stage_time_on(repl, arguments),
Doc => doc(repl, arguments),
}
}
}
fn total_time_on(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_total_time = true;
None
}
fn total_time_off(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_total_time = false;
None
}
fn stage_time_on(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_stage_times = true;
None
}
fn stage_time_off(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_stage_times = false;
None
}
fn doc(repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
arguments.get(0).map(|cmd| {
let source = cmd.to_string();
let meta = LangMetaRequest::Docs { source };
let cur_state = repl.get_cur_language_state();
match cur_state.request_meta(meta) {
LangMetaResponse::Docs { doc_string } => Some(doc_string),
_ => Some(format!("Invalid doc response"))
}
}).unwrap_or(Some(format!(":docs needs an argument")))
}


@ -0,0 +1,55 @@
use crate::repl::command_tree::CommandTree;
use crate::repl::directive_actions::DirectiveAction;
pub fn directives_from_pass_names(pass_names: &Vec<String>) -> CommandTree {
let passes_directives: Vec<CommandTree> = pass_names.iter()
.map(|pass_name| {
if pass_name == "parsing" {
CommandTree::nonterm(pass_name, None, vec![
CommandTree::nonterm_no_further_tab_completions("compact", None),
CommandTree::nonterm_no_further_tab_completions("expanded", None),
CommandTree::nonterm_no_further_tab_completions("trace", None),
])
} else {
CommandTree::nonterm_no_further_tab_completions(pass_name, None)
}
})
.collect();
CommandTree::Top(get_list(&passes_directives, true))
}
fn get_list(passes_directives: &[CommandTree], include_help: bool) -> Vec<CommandTree> {
use DirectiveAction::*;
vec![
CommandTree::terminal("exit", Some("exit the REPL"), vec![], QuitProgram),
CommandTree::terminal("quit", Some("exit the REPL"), vec![], QuitProgram),
CommandTree::terminal("help", Some("Print this help message"), if include_help { get_list(passes_directives, false) } else { vec![] }, Help),
CommandTree::nonterm("debug",
Some("Configure debug information"),
vec![
CommandTree::terminal("list-passes", Some("List all registered compiler passes"), vec![], ListPasses),
CommandTree::terminal("show-immediate", None, passes_directives.clone(), ShowImmediate),
CommandTree::terminal("show", Some("Show debug output for a specific pass"), passes_directives.clone(), Show),
CommandTree::terminal("hide", Some("Hide debug output for a specific pass"), passes_directives.clone(), Hide),
CommandTree::nonterm("total-time", None, vec![
CommandTree::terminal("on", None, vec![], TotalTimeOn),
CommandTree::terminal("off", None, vec![], TotalTimeOff),
]),
CommandTree::nonterm("stage-times", Some("Computation time per-stage"), vec![
CommandTree::terminal("on", None, vec![], StageTimeOn),
CommandTree::terminal("off", None, vec![], StageTimeOff),
])
]
),
CommandTree::nonterm("lang",
Some("switch between languages, or go directly to a language by name"),
vec![
CommandTree::nonterm_no_further_tab_completions("next", None),
CommandTree::nonterm_no_further_tab_completions("prev", None),
CommandTree::nonterm("go", None, vec![]),
]
),
CommandTree::terminal("doc", Some("Get language-specific help for an item"), vec![], Doc),
]
}


@ -0,0 +1,59 @@
use std::fmt::Write as FmtWrite;
use colored::*;
use super::command_tree::CommandTree;
use super::{Repl, InterpreterDirectiveOutput};
pub fn help(repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
match arguments {
[] => return global_help(repl),
commands => {
let dirs = repl.get_directives();
Some(match get_directive_from_commands(commands, &dirs) {
None => format!("Directive `{}` not found", commands.last().unwrap()),
Some(dir) => {
let mut buf = String::new();
let cmd = dir.get_cmd();
let children = dir.get_children();
writeln!(buf, "`{}` - {}", cmd, dir.get_help()).unwrap();
for sub in children.iter() {
writeln!(buf, "\t`{} {}` - {}", cmd, sub.get_cmd(), sub.get_help()).unwrap();
}
buf
}
})
}
}
}
fn get_directive_from_commands<'a>(commands: &[&str], dirs: &'a CommandTree) -> Option<&'a CommandTree> {
let mut directive_list = dirs.get_children();
let mut matched_directive = None;
for cmd in commands {
// If an intermediate word doesn't match, bail out instead of matching a
// later word against the stale sibling list
let found = directive_list.iter().find(|directive| directive.get_cmd() == *cmd)?;
directive_list = found.get_children();
matched_directive = Some(found);
}
matched_directive
}
}
fn global_help(repl: &mut Repl) -> InterpreterDirectiveOutput {
let mut buf = String::new();
let sigil = repl.interpreter_directive_sigil;
writeln!(buf, "{} version {}", "Schala REPL".bright_red().bold(), crate::VERSION_STRING).unwrap();
writeln!(buf, "-----------------------").unwrap();
for directive in repl.get_directives().get_children() {
writeln!(buf, "{}{} - {}", sigil, directive.get_cmd(), directive.get_help()).unwrap();
}
let lang = repl.get_cur_language_state();
writeln!(buf).unwrap();
writeln!(buf, "Language-specific help for {}", lang.get_language_name()).unwrap();
writeln!(buf, "-----------------------").unwrap();
Some(buf)
}

251
schala-repl/src/repl/mod.rs Normal file

@ -0,0 +1,251 @@
use std::sync::Arc;
use std::collections::HashSet;
use crate::language::{ProgrammingLanguageInterface,
ComputationRequest, LangMetaResponse, LangMetaRequest};
mod command_tree;
use self::command_tree::CommandTree;
mod repl_options;
use repl_options::ReplOptions;
mod directive_actions;
mod directives;
use directives::directives_from_pass_names;
mod help;
mod response;
use response::ReplResponse;
const HISTORY_SAVE_FILE: &str = ".schala_history";
const OPTIONS_SAVE_FILE: &str = ".schala_repl";
type InterpreterDirectiveOutput = Option<String>;
pub struct Repl {
pub interpreter_directive_sigil: char,
line_reader: ::linefeed::interface::Interface<::linefeed::terminal::DefaultTerminal>,
language_states: Vec<Box<dyn ProgrammingLanguageInterface>>,
options: ReplOptions,
}
#[derive(Clone)]
enum PromptStyle {
Normal,
Multiline
}
impl Repl {
pub fn new(initial_states: Vec<Box<dyn ProgrammingLanguageInterface>>) -> Repl {
use linefeed::Interface;
let line_reader = Interface::new("schala-repl").unwrap();
let interpreter_directive_sigil = ':';
Repl {
interpreter_directive_sigil,
line_reader,
language_states: initial_states,
options: ReplOptions::new(),
}
}
pub fn run_repl(&mut self) {
println!("Schala MetaInterpreter version {}", crate::VERSION_STRING);
println!("Type {}help for help with the REPL", self.interpreter_directive_sigil);
self.load_options();
self.handle_repl_loop();
self.save_before_exit();
println!("Exiting...");
}
fn load_options(&mut self) {
self.line_reader.load_history(HISTORY_SAVE_FILE).unwrap_or(());
if let Ok(options) = ReplOptions::load_from_file(OPTIONS_SAVE_FILE) {
self.options = options;
}
}
fn handle_repl_loop(&mut self) {
use linefeed::ReadResult::*;
let sigil = self.interpreter_directive_sigil;
'main: loop {
macro_rules! match_or_break {
($line:expr) => {
match $line {
Err(e) => {
println!("readline IO Error: {}", e);
break 'main;
},
Ok(Eof) | Ok(Signal(_)) => break 'main,
Ok(Input(ref input)) => input,
}
}
}
self.update_line_reader();
let line = self.line_reader.read_line();
let input: &str = match_or_break!(line);
self.line_reader.add_history_unique(input.to_string());
let mut chars = input.chars().peekable();
let repl_responses = match chars.next() {
Some(ch) if ch == sigil => {
if chars.peek() == Some(&'{') {
let mut buf = String::new();
buf.push_str(input.get(2..).unwrap());
'multiline: loop {
self.set_prompt(PromptStyle::Multiline);
let new_line = self.line_reader.read_line();
let new_input = match_or_break!(new_line);
if new_input.starts_with(":}") {
break 'multiline;
} else {
buf.push_str(new_input);
buf.push('\n');
}
}
self.handle_input(&buf)
} else {
if let Some(directive_output) = self.handle_interpreter_directive(input) {
println!("<> {}", directive_output);
}
continue
}
},
_ => self.handle_input(input)
};
for repl_response in repl_responses.iter() {
println!("{}", repl_response);
}
}
}
fn update_line_reader(&mut self) {
let tab_complete_handler = TabCompleteHandler::new(self.interpreter_directive_sigil, self.get_directives());
self.line_reader.set_completer(Arc::new(tab_complete_handler)); //TODO fix this here
self.set_prompt(PromptStyle::Normal);
}
fn set_prompt(&mut self, prompt_style: PromptStyle) {
let prompt_str = match prompt_style {
PromptStyle::Normal => ">> ".to_string(),
PromptStyle::Multiline => ">| ".to_string(),
};
self.line_reader.set_prompt(&prompt_str).unwrap();
}
fn save_before_exit(&self) {
self.line_reader.save_history(HISTORY_SAVE_FILE).unwrap_or(());
self.options.save_to_file(OPTIONS_SAVE_FILE);
}
fn handle_interpreter_directive(&mut self, input: &str) -> InterpreterDirectiveOutput {
let mut iter = input.chars();
iter.next();
let arguments: Vec<&str> = iter
.as_str()
.split_whitespace()
.collect();
if arguments.is_empty() {
return None;
}
let directives = self.get_directives();
directives.perform(self, &arguments)
}
fn get_cur_language_state(&mut self) -> &mut Box<dyn ProgrammingLanguageInterface> {
//TODO this is obviously not complete
&mut self.language_states[0]
}
fn handle_input(&mut self, input: &str) -> Vec<ReplResponse> {
let debug_requests: HashSet<_> = self.options.debug_asks.iter().cloned().collect();
let request = ComputationRequest { source: input, debug_requests };
let language_state = self.get_cur_language_state();
let response = language_state.run_computation(request);
response::handle_computation_response(response, &self.options)
}
fn get_directives(&mut self) -> CommandTree {
let language_state = self.get_cur_language_state();
let pass_names = match language_state.request_meta(LangMetaRequest::StageNames) {
LangMetaResponse::StageNames(names) => names,
_ => vec![],
};
directives_from_pass_names(&pass_names)
}
}
struct TabCompleteHandler {
sigil: char,
top_level_commands: CommandTree,
}
use linefeed::complete::{Completion, Completer};
use linefeed::terminal::Terminal;
impl TabCompleteHandler {
fn new(sigil: char, top_level_commands: CommandTree) -> TabCompleteHandler {
TabCompleteHandler {
top_level_commands,
sigil,
}
}
}
impl<T: Terminal> Completer<T> for TabCompleteHandler {
fn complete(&self, word: &str, prompter: &::linefeed::prompter::Prompter<T>, start: usize, _end: usize) -> Option<Vec<Completion>> {
let line = prompter.buffer();
if !line.starts_with(self.sigil) {
return None;
}
let mut words = line[1..(if start == 0 { 1 } else { start })].split_whitespace();
let mut completions = Vec::new();
let mut command_tree: Option<&CommandTree> = Some(&self.top_level_commands);
loop {
match words.next() {
None => {
let top = matches!(command_tree, Some(CommandTree::Top(_)));
let word = if top { word.get(1..).unwrap() } else { word };
for cmd in command_tree.map(|x| x.get_subcommands()).unwrap_or(vec![]).into_iter() {
if cmd.starts_with(word) {
completions.push(Completion {
completion: format!("{}{}", if top { ":" } else { "" }, cmd),
display: Some(cmd.to_string()),
suffix: ::linefeed::complete::Suffix::Some(' ')
})
}
}
break;
},
Some(s) => {
let new_ptr: Option<&CommandTree> = command_tree.and_then(|cm| cm.get_children().iter().find(|c| c.get_cmd() == s));
command_tree = new_ptr;
}
}
}
Some(completions)
}
}
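The `complete` implementation above walks the command tree one level per fully-typed word, then prefix-matches the word under the cursor against the children of wherever the walk ended. A condensed sketch of that walk with a simplified tree type (illustrative only; the real `CommandTree` has terminal/non-terminal variants, help strings, and actions):

```rust
// `Cmd` is a simplified stand-in for the crate's CommandTree.
struct Cmd {
    name: &'static str,
    children: Vec<Cmd>,
}

fn demo_tree() -> Cmd {
    Cmd {
        name: "",
        children: vec![Cmd {
            name: "debug",
            children: vec![
                Cmd { name: "show", children: vec![] },
                Cmd { name: "hide", children: vec![] },
            ],
        }],
    }
}

fn complete<'a>(root: &'a Cmd, typed: &[&str], partial: &str) -> Vec<&'a str> {
    // Descend one level per word already typed; a miss ends the walk.
    let mut node = Some(root);
    for word in typed {
        node = node.and_then(|n| n.children.iter().find(|c| c.name == *word));
    }
    // Prefix-match the word currently being completed.
    node.map(|n| {
        n.children
            .iter()
            .map(|c| c.name)
            .filter(|name| name.starts_with(partial))
            .collect()
    })
    .unwrap_or_default()
}

fn main() {
    let tree = demo_tree();
    assert_eq!(complete(&tree, &[], "d"), vec!["debug"]);
    assert_eq!(complete(&tree, &["debug"], "s"), vec!["show"]);
}
```

The real handler additionally strips the leading sigil at the top level and attaches a trailing-space suffix to each completion; the descent-then-filter structure is the same.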


@ -0,0 +1,47 @@
use crate::language::DebugAsk;
use std::io::{Read, Write};
use std::collections::HashSet;
use std::fs::File;
#[derive(Serialize, Deserialize)]
pub struct ReplOptions {
pub debug_asks: HashSet<DebugAsk>,
pub show_total_time: bool,
pub show_stage_times: bool,
}
impl ReplOptions {
pub fn new() -> ReplOptions {
ReplOptions {
debug_asks: HashSet::new(),
show_total_time: true,
show_stage_times: false,
}
}
pub fn save_to_file(&self, filename: &str) {
let res = File::create(filename)
.and_then(|mut file| {
let buf = crate::serde_json::to_string(self).unwrap();
file.write_all(buf.as_bytes())
});
if let Err(err) = res {
eprintln!("Error saving {}: {}", filename, err);
}
}
pub fn load_from_file(filename: &str) -> Result<ReplOptions, ()> {
File::open(filename)
.and_then(|mut file| {
let mut contents = String::new();
file.read_to_string(&mut contents)?;
Ok(contents)
})
.and_then(|contents| {
let output: ReplOptions = crate::serde_json::from_str(&contents)?;
Ok(output)
})
.map_err(|_| ())
}
}
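`load_from_file` above funnels both I/O and deserialization failures into a single `Err(())` via an `and_then`/`map_err` chain (serde_json errors convert into `io::Error`, which is why `?` works inside the closures). A std-only sketch of the same chaining, with the JSON step swapped for a trivial `bool` parse so it runs without serde; the file name and helper are illustrative:

```rust
use std::fs::File;
use std::io::{Read, Write};

// Same shape as load_from_file: read the file, then parse, collapsing
// every failure mode into Err(()).
fn load_flag(filename: &str) -> Result<bool, ()> {
    File::open(filename)
        .and_then(|mut file| {
            let mut contents = String::new();
            file.read_to_string(&mut contents)?;
            Ok(contents)
        })
        .map_err(|_| ())
        .and_then(|contents| contents.trim().parse::<bool>().map_err(|_| ()))
}

fn main() {
    let path = std::env::temp_dir().join("schala_flag_demo");
    let path = path.to_str().unwrap().to_string();
    File::create(&path).unwrap().write_all(b"true").unwrap();
    assert_eq!(load_flag(&path), Ok(true));
    assert_eq!(load_flag("/nonexistent/definitely-missing"), Err(()));
}
```

Collapsing the error type to `()` is deliberate here: a missing or corrupt options file is not an error the caller acts on, so `load_options` can just fall back to defaults.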


@ -0,0 +1,67 @@
use colored::*;
use std::fmt;
use std::fmt::Write;
use super::ReplOptions;
use crate::language::{ DebugAsk, ComputationResponse};
pub struct ReplResponse {
label: Option<String>,
text: String,
color: Option<Color>
}
impl fmt::Display for ReplResponse {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let mut buf = String::new();
if let Some(ref label) = self.label {
write!(buf, "({})", label).unwrap();
}
write!(buf, "=> {}", self.text).unwrap();
write!(f, "{}", match self.color {
Some(c) => buf.color(c),
None => buf.normal()
})
}
}
pub fn handle_computation_response(response: ComputationResponse, options: &ReplOptions) -> Vec<ReplResponse> {
let mut responses = vec![];
if options.show_total_time {
responses.push(ReplResponse {
label: Some("Total time".to_string()),
text: format!("{:?}", response.global_output_stats.total_duration),
color: None,
});
}
if options.show_stage_times {
responses.push(ReplResponse {
label: Some("Stage times".to_string()),
text: format!("{:?}", response.global_output_stats.stage_durations),
color: None,
});
}
for debug_resp in response.debug_responses {
let stage_name = match debug_resp.ask {
DebugAsk::ByStage { stage_name, .. } => stage_name,
_ => continue,
};
responses.push(ReplResponse {
label: Some(stage_name.to_string()),
text: debug_resp.value,
color: Some(Color::Red),
});
}
responses.push(match response.main_output {
Ok(s) => ReplResponse { label: None, text: s, color: None },
Err(e) => ReplResponse { label: Some("Error".to_string()), text: e, color: Some(Color::Red) },
});
responses
}
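Stripped of the `colored` styling, the `Display` impl above renders each response as `(label)=> text`, or just `=> text` when there is no label. A plain-string sketch of that format (hypothetical helper, not part of the crate):

```rust
// Mirrors ReplResponse's Display output, minus terminal coloring:
// an optional "(label)" prefix followed by "=> text".
fn render(label: Option<&str>, text: &str) -> String {
    match label {
        Some(l) => format!("({})=> {}", l, text),
        None => format!("=> {}", text),
    }
}

fn main() {
    assert_eq!(render(Some("Total time"), "5ms"), "(Total time)=> 5ms");
    assert_eq!(render(None, "10"), "=> 10");
}
```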


@ -0,0 +1,11 @@
fn outer() {
fn inner(a) {
a + 10
}
inner(20) + 8.3
}
outer()


@ -0,0 +1,21 @@
fn hella(a, b) {
a + b
}
fn paha(x, y, z) {
x * y * z
}
a = 1
c = if a {
10
} else {
20
}
q = 4
q = q + 2
q + 1 + c


@ -0,0 +1,8 @@
if 20 {
a = 20
b = 30
c = 40
a + b + c
} else {
Null
}


@ -0,0 +1,5 @@
(fn(q) { q * 2 }(25))
a = fn(x) { x + 5 }
a(2)

17
source_files/main.maaru Normal file

@ -0,0 +1,17 @@
fn add(a, b) {
a + b
}
fn subtract(a, b) {
a - b
}
fn main() {
first_value = add(20, 20)
second_value = subtract(700, 650)
first_value + second_value
}
main()


@ -0,0 +1,24 @@
fn hella(x) {
print("hey")
if x == 3 {
Null
} else {
hella(x + 1)
}
}
hella(0)
fn fib(x) {
if x < 3 {
1
} else {
fib(x - 1) + fib(x - 2)
}
}
fib(10)


@ -0,0 +1,11 @@
let c = 10
fn add(a, b) {
let c = a + b
c
}
let mut b = 20
println(add(1,2))
println(c + b)


@ -0,0 +1,17 @@
fn main() {
let a = 10
let b = 20
a + b
}
//this is a one-line comment
/* this is
a multiline
comment
*/
print(main())


@ -0,0 +1,12 @@
for n <- 1..=100 {
if n % 15 == 0 {
print("FizzBuzz")
} else if n % 5 == 0 {
print("Buzz")
} else if n % 3 == 0 {
print("Fizz")
} else {
print(n.to_string())
}
}


@ -0,0 +1,114 @@
fn main() {
//comments are C-style
/* nested comments /* are cool */ */
}
@annotations are with @-
// variable expressions
var a: I32 = 20
const b: String = 20
there(); can(); be(); multiple(); statements(); per_line();
//string interpolation
const yolo = "I have ${a + b} people in my house"
// let expressions ??? not sure if I want this
let a = 10, b = 20, c = 30 in a + b + c
//list literal
const q = [1,2,3,4]
//lambda literal
q.map({|item| item * 100 })
fn yolo(a: MyType, b: YourType): ReturnType<Param1, Param2> {
if a == 20 {
return "early"
}
var sex = 20
sex
}
/* for/while loop topics */
//infinite loop
while {
if x() { break }
...
}
//conditional loop
while conditionHolds() {
...
}
//iteration over a variable
for i <- [1..1000] {
} //return type is return type of block
//monadic decomposition
for {
a <- maybeInt();
s <- foo()
} return {
a + s
} //return type is Monad<return type of block>
/* end of for loops */
/* conditionals/pattern matching */
// "is" operator for "does this pattern match"
x is Some(t) // type bool
if x {
is Some(t) => {
},
is None => {
}
}
//syntax is, I guess, for <expr> <brace-block>, where <expr> is a bool, or a <arrow-expr>
// type-level aliases
typealias <name> = <other type> #maybe this should be 'alias'?
/*
what if type A = B meant that you had to create A's with A(B), but when you used A's the interface was exactly like B's?
maybe introduce a 'newtype' keyword for this
*/
//declaring types of all stripes
type MyData = { a: i32, b: String }
type MyType = MyType
type Option<a> = None | Some(a)
type Signal = Absence | SimplePresence(i32) | ComplexPresence {a: i32, b: MyCustomData}
//traits
trait Bashable { }
trait Luggable {
fn lug(self, a: Option<Self>)
}
}
// lambdas
// ruby-style not rust-style
const a: X -> Y -> Z = {|x,y| }


@ -0,0 +1,17 @@
println(sua(4))
fn sua(x): Int {
x + 10
}
//let a = getline()
/*
if a == "true" {
println("You typed true")
} else {
println("You typed something else")
}
*/

12
source_files/test.maaru Normal file

@ -0,0 +1,12 @@
fn a(x) {
x + 20
}
fn x(x) {
x + a(9384)
}
a(0)
x(1)

3
source_files/test.rukka Normal file

@ -0,0 +1,3 @@
(display (+ 1 2))
(display "Hello")


@ -0,0 +1,8 @@
fn めんどくさい(a) {
a + 20
}
print(めんどくさい(394))

7
source_files/while.maaru Normal file

@ -0,0 +1,7 @@
a = 0
while a < 100000
print("hello", a)
a = a + 1
end


@ -1,3 +1,15 @@
extern crate schala_repl;
//extern crate maaru_lang;
//extern crate rukka_lang;
//extern crate robo_lang;
extern crate schala_lang;
use schala_repl::{ProgrammingLanguageInterface, start_repl};
extern { }
fn main() {
println!("Hello, world!");
let langs: Vec<Box<dyn ProgrammingLanguageInterface>> = vec![Box::new(schala_lang::Schala::new())];
start_repl(langs);
}

17
static/index.html Normal file

@ -0,0 +1,17 @@
<!DOCTYPE html>
<html>
<head>
<title>Schala Metainterpreter Web Evaluator</title>
<style>
.CodeArea {
display: flex;
flex-direction: row;
}
</style>
</head>
<body>
<div id="main">
</div>
<script src="bundle.js"></script>
</body>
</html>

64
static/main.jsx Normal file

@ -0,0 +1,64 @@
const React = require("react");
const ReactDOM = require("react-dom");
const superagent = require("superagent");
const serverAddress = "http://localhost:8000";
class CodeArea extends React.Component {
constructor(props) {
super(props);
this.state = {value: "", lastOutput: null};
this.handleChange = this.handleChange.bind(this);
this.submit = this.submit.bind(this);
}
handleChange(event) {
this.setState({value: event.target.value});
}
submit(event) {
console.log("Event", this.state.value);
const source = this.state.value;
superagent.post(`${serverAddress}/input`)
.send({ source })
.set("accept", "json")
.end((error, response) => {
if (response) {
console.log("Resp", response);
this.setState({lastOutput: response.body.text})
} else {
console.error("Error: ", error);
}
});
}
renderOutput() {
if (!this.state.lastOutput) {
return null;
}
return <textarea readOnly value={ this.state.lastOutput } />;
}
render() {
return (<div className="CodeArea">
<div className="input">
<textarea value={ this.state.value } onChange={this.handleChange}>
</textarea>
<button onClick={ this.submit }>Run!</button>
</div>
<div className="output">
{ this.renderOutput() }
</div>
</div>);
}
}
const main = (<div>
<h1>Schala web input</h1>
<p>Write your source code here</p>
<CodeArea/>
</div>);
const rootDom = document.getElementById("main");
ReactDOM.render(main, rootDom);

27
static/package.json Normal file

@ -0,0 +1,27 @@
{
"name": "static",
"version": "1.0.0",
"main": "index.js",
"license": "MIT",
"dependencies": {
"babel": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"babel-preset-react": "^6.24.1",
"babelify": "^7.3.0",
"browserify": "^14.4.0",
"react": "^15.6.1",
"react-dom": "^15.6.1",
"superagent": "^3.6.3",
"uglify-js": "^3.1.1"
},
"babel": {
"presets": [
"babel-preset-react",
"babel-preset-es2015"
]
},
"scripts": {
"build": "browserify main.jsx -t babelify -o bundle.js",
"build-minify": "browserify main.jsx -t babelify | uglifyjs > bundle.js"
}
}

1728
static/yarn.lock Normal file

File diff suppressed because it is too large Load Diff