Robert Butler 2020-07-07 19:12:53 Here is my first 2-minute week video for uCISC, my CISC instruction set designed for homebrew computers. It's a little strange because it covers a bit more than a week of work, since it's my first one, but hopefully it gets things started and I can post every week now with smaller increments of progress. https://youtu.be/1Z3fTCMWzpc
Kartik Agaram 2020-07-07 19:30:33 FYI you should always feel free to talk about more than just what happened last week. Often we need to give context for what we did, because viewers may not have seen all past videos (or even any past videos).
Ray Imber 2020-07-07 19:30:50 I didn't see a url in your video. Do you have that fancy documentation hosted anywhere yet?
Ray Imber 2020-07-07 19:47:51 I'm definitely the target audience for this kind of project lol.
I'm the target audience for Kartik Agaram's Mu as well. I have some similar history: I learned the very basics of hardware in college, I have followed Ben Eater off and on since he started, and a few years ago I became really invested in the (now semi-famous?) Nand2Tetris course, which I still love. I just never had the time or energy to take those ideas to the next level and actually do my own project. It's inspiring for me to see other people going for it!
Ray Imber 2020-07-08 03:38:20 Robert Butler Your architecture seems very GPU-like: a bunch of tiny cores with local memory, and a way to access larger common memory (also kind of Cray-like).
You talk about wanting to hit 1080p at 60Hz and supporting modern displays. How are you thinking about physically getting the video signal out? Are you planning on implementing your own HDMI controller?
I'm curious because I looked at the HDMI spec once and got very overwhelmed lol.
Robert Butler 2020-07-08 03:50:05 Most likely I'll end up with an HDMI controller chip. I'm not particularly interested in implementing the signal elements of a computer as much as I am how they are exposed to the processor itself.
Robert Butler 2020-07-08 03:53:07 I've also had the thought that this feels a lot like GPU cores, though I'm no expert in them. As far as I know, they tend to be Turing-complete cores these days, but I don't know the architecture that well. At this point, I'm aiming to iteratively level up the homebrew and see where it ends up. My first implementation will be a 4x40 character LCD screen, not even VGA.
Ray Imber 2020-07-08 04:13:31 I feel like HDMI is a huge rabbit hole either way. Even if you get a controller, you are going to end up spending precious logic gates to massage it into whatever weird format and timings the controller wants. From what I've seen, VGA is crazy, but tractable at least. HDMI is frustratingly opaque...
I think your idea to iteratively level up is a really good one! push that nonsense off as long as you can 🙂
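As an aside on why VGA counts as tractable: the whole signal reduces to a handful of published timing constants. Here is a short Python sketch using the standard 640x480 @ ~60 Hz numbers (these are the widely published industry values, not anything specific to uCISC):

```python
# Standard 640x480 @ ~60 Hz VGA timing: a fixed pixel clock plus
# front porch / sync / back porch intervals on each axis.
PIXEL_CLOCK_HZ = 25_175_000

H_VISIBLE, H_FRONT, H_SYNC, H_BACK = 640, 16, 96, 48
V_VISIBLE, V_FRONT, V_SYNC, V_BACK = 480, 10, 2, 33

h_total = H_VISIBLE + H_FRONT + H_SYNC + H_BACK   # 800 pixel clocks per line
v_total = V_VISIBLE + V_FRONT + V_SYNC + V_BACK   # 525 lines per frame

line_rate_hz = PIXEL_CLOCK_HZ / h_total           # horizontal scan rate
refresh_hz = line_rate_hz / v_total               # vertical refresh rate

print(f"{refresh_hz:.2f} Hz")  # → 59.94 Hz
```

The entire protocol is just counting pixel clocks against those constants and toggling the sync lines at the right counts, which is why it fits comfortably in a homebrew design.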
Robert Butler 2020-07-08 04:16:05 Ray Imber To be clear: I've read the HDMI spec too, and oh boy... One of the problems with HDMI is that I think it tries to do too much and be too much, with so many different formats. Anyway, yeah... hopefully my delay tactic will pay off.
Robert Butler 2020-07-08 04:55:05 Added that to my to-read list for when I'm a bit fresher. Thanks!
nicolas decoster 2020-07-08 07:27:17 I really like the idea of building everything from scratch, from the hardware to the software, to let the user grasp what is going on. Good luck and have fun with that project!! 🙂
Steve Peak 2020-07-08 18:08:40 Here is Storyscript’s first 2 minute demo — This demo showcases one of many aspects of our editing experience.
Topics: NLP, ML, no-code.
Goal: Dialog-based development by understanding intent into a program and understanding ambiguity through holes.
📹 Click here 📹
Quote from Program Synthesis by the Microsoft Research team: "automatically finding a program ...that satisfies the user intent expressed in the form of some specification. Since the inception of AI in the 1950s, this problem has been considered the holy grail of Computer Science."
❓ Questions/comments/feedback is all very welcome. Feel free to comment in thread or DM me.
Edward de Jong 2020-07-08 20:25:36 Nice demo, this product should succeed in filling a nice fat niche. Natural language is filled with ambiguity and status signaling, so there is a limit to how far this can go, but for simple things this product is indeed a dream.
Steve Peak 2020-07-08 20:27:06 Thank you Edward de Jong — We have big plans. What you see today is the equivalent to “Google Search” — It’s just the tip of the iceberg.
Chris Knott 2020-07-09 11:38:21 Great demo. The glamour here is the "IDE" and UX, but I think the value is the big library of "pre-chewed" APIs behind the scenes.
I am interested in why you didn't choose to have something like string literals for "Who is online?". To me this phrase could be interpreted like the second half of the program: as a command to fetch online users.
How do I write the program where I shout out usernames, and Alexa responds with either "online" or "offline"?
Steve Peak 2020-07-09 13:36:20 Chris Knott Thank you for your comment. It's very important to keep in mind that it's not a programming language; we are not constrained or bound to parsing plain text with a compiler, so we can represent the resulting application in many ways. Adding quotations around a string is something we will user-test after more of the product is complete, since we have free rein to experiment with the way the program is represented.
Your question about how to write an alexa skill for “Is {person} online?” would be something along the lines of
when Alexa hears "Is {name} online?"
  Slack lookup user by name
  if user is online
    reply with "Yes"
  else
    reply with "No"
Note that I added string literals here to demonstrate that we can, at any point. It's only a CSS decoration.
Note there are no variables; we keep things in scope and the user can reference traits of things in scope without directly linking them. This may sound odd; it’s a novel approach — during the dialog-driven interaction the user will confirm trait relationships therefore not requiring: user = … user.is_active
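One way to picture "no variables, resolve by trait from scope" is a scope stack searched by trait rather than by name. This is purely a hypothetical sketch to illustrate the idea, not Storyscript's actual implementation; the class and trait names here are invented:

```python
# Hypothetical model: each step's output is pushed into a scope stack,
# and a later reference like "user is online" resolves by searching the
# stack for the most recent value exposing that trait -- no `user = ...`
# binding required.
class Scope:
    def __init__(self):
        self.stack = []  # outputs of prior steps, most recent last

    def push(self, value):
        self.stack.append(value)

    def resolve(self, trait):
        # Walk backwards from the most recent output until something
        # in scope carries the requested trait.
        for value in reversed(self.stack):
            if trait in value:
                return value[trait]
        raise LookupError(f"nothing in scope has trait {trait!r}")

scope = Scope()
scope.push({"name": "chris", "is_online": True})  # e.g. a Slack lookup result
print(scope.resolve("is_online"))  # → True
```

In the dialog-driven flow Steve describes, the interesting part is that ambiguous resolutions (two things in scope with the same trait) become questions back to the user rather than compile errors.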
Steve Peak 2020-07-09 13:39:01 In full transparency, developers tend to find the product confusing because it does not fit into their world very well; they tend to question how things are possible, or try to change the design to match their trained knowledge. When you present this to non-developers, they have a stronger connection to it and understanding of it, because they aren't anchored to traditional PLs. But really… how different is that from visual programming 🤷, which also changes the paradigm, only more so, to the point where the "language knowledge" is almost entirely unnecessary.
Garth Goldwater 2020-07-09 14:46:33
Note there are no variables; we keep things in scope and the user can reference traits of things in scope without directly linking them. This may sound odd; it’s a novel approach — during the dialog-driven interaction the user will confirm trait relationships therefore not requiring: user = … user.is_active
i’ve said this before and i’ll say it again: this is the coolest part about storyscript to me: an enormous part of the cognitive load of programming is tracing dependencies back up the chain via identifiers. how much of the “compiler in your head” is devoted to a vtable lookup?
Garth Goldwater 2020-07-09 14:47:35 incidentally, IMO: node-and-wire visual programming simply turns this lookup into a visual artifact: a big squiggly line. what if we just eliminated that whole line?
Garth Goldwater 2020-07-09 14:51:08 a lot of the abstraction or functional programming is eliminating mutation by making that line longer—what would be “mutate the object but keep the reference name the same” becomes x''''
over the course of your program
which is why pipelining and currying/point-free feels so nice.
but what if that’s just sublimating the change-over-time issue?
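As a concrete illustration of the contrast Garth is describing, here is a toy Python sketch (the function names are invented for the example) of named intermediate results versus a point-free-style pipeline:

```python
from functools import reduce

# Named intermediates: each step introduces an identifier the reader
# must trace back up the chain (the x'''' problem in Garth's framing).
def normalize_named(text):
    stripped = text.strip()
    lowered = stripped.lower()
    words = lowered.split()
    return "-".join(words)

# Point-free-ish pipeline: the dataflow *is* the program, and there are
# no intermediate names left to look up.
def pipe(value, *fns):
    return reduce(lambda acc, f: f(acc), fns, value)

def normalize_piped(text):
    return pipe(text, str.strip, str.lower, str.split, "-".join)

print(normalize_piped("  Hello World  "))  # → hello-world
```

Both compute the same thing; the pipeline just removes the identifiers, which is exactly the cognitive load the thread is talking about eliminating.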
Steve Peak 2020-07-09 15:04:00 Thank you Garth Goldwater! Well stated observations. We are excited to share more concepts of how we manage scope, references, and traits of outputs. It’s novel, no doubt, and it’s our job to convey this in the UX in the most easy way possible.
Steve Peak 2020-07-09 16:45:52 Chris Knott This screenshot shows another theme that is more for our internal engineering purposes. As you can see, the concepts at play are not plain text but HTML blocks that are decorated with text and design.
Chris Knott 2020-07-09 16:48:02 Yes, I see. Is this taxonomy (ActivatorStart, Service, etc.) going to be known to the users, or just an implementation detail?
Chris Maughan 2020-07-10 18:30:42 My favourite coffee shop is open again, here in York, so I'm getting back into my routine of an hour or so of progress over morning coffee. That said, I really just integrated the synthesizer and graphics this week. It is nice to have the graphics code up and running again, even if I will be changing it when I get a chance.
https://youtu.be/mLplFS5WsLg
Chris Maughan 2020-07-10 18:32:59 If it isn't clear, one of my goals for this project is to provide a one-stop shop for live coding. There are no external processes; this is a single executable. I want a great 'out of the box' solution for shader and audio coding. The challenge that this brings, though, is ensuring that all the threads of work inside the single binary are kept busy and performant.
Ivan Reese 2020-07-10 18:38:57 I heard a click at around 2:01. Are you still hunting down those issues, or might that have just been a hiccup in the video capture?
Chris Maughan 2020-07-10 18:46:40 Mostly I only hear these when recording these days; I think they are slowly going as I find and fix memory allocations or other stalls on the audio thread; I know of a few places where things aren't properly fixed yet - one in particular to do with the audio processing causing a lock.
I found one major cause which was to do with the mixer in the audio path doing the wrong math; that helped a lot with odd sound transitions I was hearing.
If you notice, this video was recorded at 500 fps for the graphics, with my webcam video and OBS running on the same machine on a 4K display. On top of that, my machine is 'only' a 4-core i7.
I don't mention it in the video, but I also have this app mostly working on Mac. It will be interesting to compare, since the MBP is a top-spec machine.
Ultimately, performance work is ongoing and will be more focused as I move towards open sourcing everything.....
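The usual pattern behind click fixes like the ones Chris describes is keeping the audio callback free of locks and allocations: the audio thread only reads from a preallocated buffer and emits silence on underrun rather than blocking. A minimal Python sketch of that single-producer/single-consumer ring-buffer idea (illustrative only, not Chris's actual code):

```python
# Sketch of the lock-free SPSC ring buffer pattern used to feed audio
# callbacks: the producer fills samples ahead of time; the audio thread
# pops without locking or allocating, and never waits.
class RingBuffer:
    def __init__(self, capacity):
        self.buf = [0.0] * capacity   # preallocated once, up front
        self.capacity = capacity
        self.read = 0    # advanced only by the consumer (audio thread)
        self.write = 0   # advanced only by the producer thread

    def push(self, sample):
        nxt = (self.write + 1) % self.capacity
        if nxt == self.read:
            return False              # full: drop rather than block
        self.buf[self.write] = sample
        self.write = nxt
        return True

    def pop(self, default=0.0):
        if self.read == self.write:
            return default            # underrun: emit silence, never wait
        sample = self.buf[self.read]
        self.read = (self.read + 1) % self.capacity
        return sample
```

One slot is deliberately left unused so that `read == write` unambiguously means "empty"; in a real engine the indices would be atomic and the payload would be blocks of samples rather than single floats.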
Chris Maughan 2020-07-11 12:31:08 Thanks all for the encouraging feedback!
Jack Rusher If I'm honest, I'm just following my nose and having fun. This is a side project for me, intended to be released as an open-source app and a single-install executable on Mac/Windows/Linux (though perhaps not on all platforms initially). If you're asking what I would like it to be:
• A shader development IDE, featuring GL, Vulkan, DX12. I want it to be easy to, say, test a hardware ray tracing shader, or develop a new effect. Currently I support GL, but there are substantial portions of Vulkan/DX12 written. I work in graphics, so there is overlap with things I've done before here.
• A music language research and development IDE. Initially supporting something like Ixi lang. I have considered Extempore-like functional scripting too.
• A live coding/performance tool. Single Install. Code a performance.
Longer term, I'd love to play a bit with VR. I've long been thinking about the idea of building a performance in a 3D space. Everything you see in the tool is rendered using the graphics API, including the text editor (which, though I haven't gone into it much, is designed to support inline graphics to represent code elements). That stuff is pie in the sky though. I have a full-time job, and this project fits around it most of the time; that generally means I have bursts of productivity and then rest, depending on work volume....
Chris Maughan 2020-07-11 12:36:23 Regarding Extempore in particular, it's really cool. I'd love to support something similar. My recent experience with the Janet language has encouraged me in that regard. I'm already itching to build some kind of scripting into the tool so I generate some actual music!
Chris Maughan 2020-07-11 13:06:07 ... and since I've looked at it again today: I had no idea that Extempore has a built-in synth too, in Scheme; that's really cool.