You are viewing archived messages.

Unknown User 2020-07-05 18:58:27

MSG NOT FOUND

Maikel van de Lisdonk 2020-07-06 07:22:52

This sounds like a big step and lots of work this week! When can we start beta testing?😀

Chris Maughan 2020-07-09 08:05:06

Well, this is only part of the puzzle; the Visual stuff is part of the app, and I’m actively working on integrating the two (while realising all the work that is left to do!). My loose deadline is Christmas for shipping ‘something’ 😉.

Ivan Reese 2020-07-09 23:27:56

I'm excited to see what this looks like by Christmas. At the rate you're going, it'll surely be quite rich and polished.

Chris Maughan 2020-07-10 15:45:04

A few full time days to work on it would be good, but thanks for the encouragement 😉 I'll get there eventually; it's a marathon not a sprint....

Robert Butler 2020-07-07 19:12:53

Here is my first 2-minute week video for uCISC, my CISC instruction set designed for homebrew computers. It's a little unusual because it covers a bit more than a week of work, since it's my first one, but hopefully it gets things started and I can post every week now with smaller increments of progress. https://youtu.be/1Z3fTCMWzpc

Kartik Agaram 2020-07-07 19:30:33

FYI you should always feel free to talk about more than just what happened last week. Often we need to give context for what we did, because viewers may not have seen all past videos (or even any past videos).

Ray Imber 2020-07-07 19:30:50

I didn't see a url in your video. Do you have that fancy documentation hosted anywhere yet?

Kartik Agaram 2020-07-07 19:32:00

Robert Butler See?! 🙂

Ray Imber https://github.com/grokthis/ucisc has the docs, and https://github.com/grokthis/ucisc-ruby has the emulated hardware.

Robert Butler 2020-07-07 19:33:20

Noted 🙂 ... thanks for posting the links too.

Ray Imber 2020-07-07 19:47:51

I'm definitely the target audience for this kind of project lol. Kartik Agaram's Mu as well. I have some similar history. I learned the very basics of hardware in college, I have followed Ben Eater off and on since he started, and a few years ago I became really invested in the (now semi-famous?) Nand2Tetris course, which I still love. I just never had the time or energy to take those ideas to the next level and actually do my own project. It's inspiring for me to see other people going for it!

Kartik Agaram 2020-07-07 19:49:17

I'm certainly planning to contribute to it.

Kartik Agaram 2020-07-07 19:51:20

Though the Ruby really gets my goat 😄

Robert Butler 2020-07-07 22:21:35

Ray Imber I'm glad to hear you are in the target audience. If you are that into it, I would love a subscribe over on my YouTube channel. I'm aiming for a video every MWF marching along towards a homebrew computer built on this. https://www.youtube.com/channel/UCh4OpfF7T7UtezGejRTLxCw

Ray Imber 2020-07-08 03:38:20

Robert Butler Your architecture seems very GPU like: A bunch of tiny cores with local memory, and a way to access larger common memory (Also kind of Cray like). You talk about wanting to hit 1080p at 60Hz and supporting modern displays. How are you thinking about physically getting the video signal out? Are you planning on implementing your own HDMI controller? I'm curious because I looked at the HDMI spec once and got very overwhelmed lol.

Robert Butler 2020-07-08 03:50:05

Most likely I'll end up with an HDMI controller chip. I'm not particularly interested in implementing the signalling elements of a computer as much as in how they are exposed to the processor itself.

Robert Butler 2020-07-08 03:53:07

I've also had the thought that this feels a lot like GPU cores, though I'm no expert in them. As far as I know, they tend to be Turing-complete cores these days but I don't know the architecture that much. At this point, I'm aiming for iteratively leveling up the homebrew and seeing where it ends up. My first implementation will be a 4x40 character LCD screen, not even VGA.

Ray Imber 2020-07-08 04:13:31

I feel like HDMI is a huge rabbit hole either way. Even if you get a controller, you are going to end up spending precious logic gates massaging it into whatever weird format and timings the controller wants. From what I've seen, VGA is crazy, but at least tractable. HDMI is frustratingly opaque...

I think your idea to iteratively level up is a really good one! push that nonsense off as long as you can 🙂

Robert Butler 2020-07-08 04:16:05

Ray Imber To be clear: I've read the HDMI spec too, and ooh boy.... one of the problems with HDMI is that I think it tries to do too much and be too much, with so many different formats. Anyway, yeah... hopefully my delay tactic will pay off.

Ray Imber 2020-07-08 04:21:04

The Wikipedia article on CUDA thread blocks is actually a pretty good description of modern GPU architecture if you are interested: https://en.wikipedia.org/wiki/Thread_block_(CUDA_programming)

You are purposely ignoring pre-emption and you obviously aren't doing any SIMD, but if you squint, it looks similar. Especially at the Streaming multiprocessor level.
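For readers unfamiliar with the thread hierarchy being referenced, here is a minimal sketch (plain Python, not CUDA, and not Robert's design) of how CUDA-style grids of blocks of threads map onto a flat data index. All names are illustrative:

```python
# Sketch of CUDA-style thread indexing: a "grid" of blocks, each block a
# group of threads that would share fast local memory on one streaming
# multiprocessor. Here we just simulate it sequentially.

def global_thread_index(block_idx, block_dim, thread_idx):
    """Flat global index, analogous to CUDA's
    blockIdx.x * blockDim.x + threadIdx.x."""
    return block_idx * block_dim + thread_idx

def run_grid(data, grid_dim=4, block_dim=8):
    """Simulate a 4-block x 8-thread grid doubling each element."""
    out = list(data)
    for block_idx in range(grid_dim):
        for thread_idx in range(block_dim):
            i = global_thread_index(block_idx, block_dim, thread_idx)
            if i < len(out):        # bounds check, as real kernels do
                out[i] = out[i] * 2
    return out
```

The analogy to the architecture discussed above: each block is like one of the small cores with local memory, and the grid is the common pool of work spread across them.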

Robert Butler 2020-07-08 04:55:05

Added that to my to-read list for when I'm a bit fresher. Thanks!

nicolas decoster 2020-07-08 07:27:17

I really like the idea of building everything from scratch, from the hardware to the software, to let the user grasp what is going on. Good luck and have fun with that project!! 🙂

Robert Butler 2020-07-08 13:46:07

Thanks Nicolas!

Steve Peak 2020-07-08 18:08:40

Here is Storyscript’s first 2 minute demo — This demo showcases one of many aspects of our editing experience. Topics: NLP, ML, no-code. Goal: Dialog-based development by understanding intent into a program and understanding ambiguity through holes.

📹 Click here 📹

A quote from the Microsoft Research program synthesis team: "...automatically finding a program that satisfies the user intent expressed in the form of some specification. Since the inception of AI in the 1950s, this problem has been considered the holy grail of Computer Science." ❓ Questions/comments/feedback are all very welcome. Feel free to comment in the thread or DM me.

Garth Goldwater 2020-07-08 18:39:40

this is looking so good! very exciting

Robert Butler 2020-07-08 19:33:14

Nice!

Jean-Louis Villecroze 2020-07-08 20:16:44

Cool Steve! 😎

Edward de Jong 2020-07-08 20:25:36

Nice demo, this product should succeed in filling a nice fat niche. Natural language is filled with ambiguity and status signaling, so there is a limit to how far this can go, but for simple things this product is indeed a dream.

Steve Peak 2020-07-08 20:27:06

Thank you Edward de Jong — We have big plans. What you see today is the equivalent to “Google Search” — It’s just the tip of the iceberg.

Chris Knott 2020-07-09 11:38:21

Great demo. The glamour here is the "IDE" and UX, but I think the value is the big library of "pre-chewed" APIs behind the scenes.

I'm interested in why you chose not to have something like string literals for "Who is online?". To me this phrase could be interpreted like the second half of the program: as a command to fetch online users.

How do I write the program where I shout out usernames, and Alexa responds with either "online" or "offline"?

Steve Peak 2020-07-09 13:36:20

Chris Knott Thank you for your comment. It's very important to keep in mind that it's not a programming language; we are not constrained or bound to parsing plain text with a compiler, so we can represent the resulting application in many ways. Adding quotation marks around a string is something we will user-test once more of the product is complete, since we have completely free rein to experiment with the way the program is represented.

Your question about how to write an Alexa skill for "Is {person} online?" would be something along the lines of:

    when Alexa hears "Is {name} online?"
      Slack lookup user by name
      if user is online
        reply with "Yes"
      else
        reply with "No"

Note that I added string literals here to demonstrate that we can, at any point; it's only a CSS decoration. Note there are no variables; we keep things in scope and the user can reference traits of things in scope without directly linking them. This may sound odd; it's a novel approach. During the dialog-driven interaction the user will confirm trait relationships, therefore not requiring: user = … user.is_active
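For contrast with the no-variables approach Steve describes, here is a hedged sketch of the same logic in a conventional language, where the explicit binding (`user = ...`) he mentions is unavoidable. The function and parameter names are hypothetical, not Storyscript or Slack API names:

```python
# Conventional equivalent of the "Is {name} online?" skill: the lookup
# result must be bound to a variable before its trait can be read,
# which is exactly the step Storyscript's scoped-traits design removes.
def handle_utterance(name, slack_lookup):
    user = slack_lookup(name)              # explicit binding: user = ...
    return "Yes" if user["is_active"] else "No"   # explicit trait access
```

A fake lookup function stands in for the real Slack call when trying this out.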

Steve Peak 2020-07-09 13:39:01

In full transparency, developers tend to find the product confusing because it does not fit into their world very well; they tend to question how things are possible, or try to change the design to match their trained knowledge. When you present this to non-developers, they build a stronger relationship with and understanding of it, precisely because they lack knowledge of traditional PLs. But really... how different is that from visual programming 🤷, which also changes the paradigm, only further, to where the "language knowledge" is almost entirely unnecessary.

Garth Goldwater 2020-07-09 14:46:33

Note there are no variables; we keep things in scope and the user can reference traits of things in scope without directly linking them. This may sound odd; it’s a novel approach — during the dialog-driven interaction the user will confirm trait relationships therefore not requiring: user = … user.is_active

i’ve said this before and i’ll say it again: this is the coolest part about storyscript to me: an enormous part of the cognitive load of programming is tracing dependencies back up the chain via identifiers. how much of the “compiler in your head” is devoted to a vtable lookup?

Garth Goldwater 2020-07-09 14:47:35

incidentally, IMO: node-and-wire visual programming simply turns this lookup into a visual artifact: a big squiggly line. what if we just eliminated that whole line?

Garth Goldwater 2020-07-09 14:51:08

a lot of the abstraction in functional programming is about eliminating mutation by making that line longer: what would be "mutate the object but keep the reference name the same" becomes x'''' over the course of your program

which is why pipelining and currying/point-free feels so nice.

but what if that’s just sublimating the change-over-time issue?
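Garth's point about rebinding versus point-free style can be made concrete with a toy sketch (Python, all names hypothetical): the first function names every intermediate version of the value, like x', x'', x'''; the second composes the steps so no intermediate name exists at all.

```python
from functools import reduce

# Explicit rebinding: each step produces a freshly named version,
# the x', x'', x''' pattern described above.
def normalize_rebind(s):
    s1 = s.strip()
    s2 = s1.lower()
    s3 = s2.replace(" ", "-")
    return s3

# Point-free / pipeline style: compose the same steps left to right,
# threading the value through without naming it.
def pipeline(*fns):
    return lambda x: reduce(lambda acc, f: f(acc), fns, x)

normalize = pipeline(str.strip, str.lower, lambda s: s.replace(" ", "-"))
```

Both produce the same result; the difference is purely in how much bookkeeping of intermediate names the reader carries, which is the "squiggly line" being discussed.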

Steve Peak 2020-07-09 15:04:00

Thank you Garth Goldwater! Well-stated observations. We are excited to share more about how we manage scope, references, and traits of outputs. It's novel, no doubt, and it's our job to convey this in the UX in the easiest way possible.

Steve Peak 2020-07-09 16:45:52

Chris Knott This screenshot shows another theme that is more for our internal engineering purposes. As you can see, the concepts at play are not plain-text but html blocks that are decorated with text and design.

Chris Knott 2020-07-09 16:48:02

Yes, I see. Is this taxonomy (ActivatorStart, Service etc) going to be known to the users or just an implementation detail?

Steve Peak 2020-07-09 16:48:25

No, the taxonomy is internal only

Chris Maughan 2020-07-10 18:30:42

My favourite coffee shop is open again, here in York, so I'm getting back into my routine of an hour or so of progress over morning coffee. That said, I really just integrated the synthesizer and graphics this week. It is nice to have the graphics code up and running again, even if I will be changing it when I get chance. https://youtu.be/mLplFS5WsLg

Chris Maughan 2020-07-10 18:32:59

If it isn't clear, one of my goals of this project is to provide a one-stop-shop for live coding here. There are no external processes. This is a single executable. I want a great 'out of the box' solution to shader and audio coding. The challenge that this brings though is ensuring that all the threads of work inside the single binary are kept busy and performant.

Chris Maughan 2020-07-10 18:34:06

This is probably best watched in HD on YouTube.

Ivan Reese 2020-07-10 18:38:57

I heard a click at around 2:01. Are you still hunting down those issues, or might that have just been a hiccup in the video capture?

Chris Maughan 2020-07-10 18:46:40

Mostly I only hear these when recording these days; I think they are slowly going away as I find and fix memory allocations or other stalls on the audio thread. I know of a few places where things aren't properly fixed yet, one in particular to do with the audio processing causing a lock. I found one major cause, which was the mixer in the audio path doing the wrong math; fixing that helped a lot with the odd sound transitions I was hearing. Note that this video was recorded at 500 fps for the graphics, with my webcam video and OBS running on the same machine on a 4K display. To add to that, my machine is 'only' a 4-core i7. I don't mention it in the video, but I also have this app mostly working on Mac. It will be interesting to compare, since the MBP is a top-spec machine. Ultimately, performance work is ongoing and will be more focused as I move towards open-sourcing everything.....
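The allocation-and-lock problem Chris describes has a standard shape of fix: preallocate everything and hand data to the audio callback through a wait-free single-producer/single-consumer ring buffer, so the callback never allocates or blocks. A minimal sketch of that idea (plain Python for illustration, not Chris's code, which is C++):

```python
# Single-producer/single-consumer ring buffer: all memory is allocated
# up front, and neither side ever waits, so an audio callback using pop()
# cannot stall on the producer.
class RingBuffer:
    def __init__(self, capacity):
        self.buf = [0.0] * capacity   # allocated once, never resized
        self.capacity = capacity
        self.read = 0
        self.write = 0

    def push(self, x):
        """Producer side (e.g. synth thread). Drops on overflow
        rather than blocking."""
        nxt = (self.write + 1) % self.capacity
        if nxt == self.read:
            return False              # full: caller decides what to drop
        self.buf[self.write] = x
        self.write = nxt
        return True

    def pop(self):
        """Consumer side (audio callback). Emits silence on underrun
        rather than waiting, which surfaces as a click, not a freeze."""
        if self.read == self.write:
            return 0.0
        x = self.buf[self.read]
        self.read = (self.read + 1) % self.capacity
        return x
```

In a real engine the indices would be atomics and the buffer a fixed array of samples, but the invariant is the same: no allocation and no locks on the audio thread.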

Jean-Louis Villecroze 2020-07-10 20:17:27

Wow, this is awesome!

Maikel van de Lisdonk 2020-07-11 05:47:31

This is really, really cool! Totally awesome!!

Mariano Guerra 2020-07-11 09:22:54

really great progress, congrats!

Jack Rusher 2020-07-11 11:20:59

Fun project! :) If you have the time, I'd be interested in reading some comparisons (in terms of your goals) with Extempore: https://en.wikipedia.org/wiki/Extempore_(software)

Chris Maughan 2020-07-11 12:31:08

Thanks all for the encouraging feedback! Jack Rusher If I'm honest, I'm just following my nose and having fun. This is a side project for me, intended to be released as an open-source app, and a single install executable on Mac/Windows/Linux (though perhaps not initially on all platforms). If you're asking what I would like it to be:
• A shader development IDE, featuring GL, Vulkan, DX12. I want it to be easy to, say, test a hardware ray tracing shader, or develop a new effect. Currently I support GL, but there are substantial portions of Vulkan/DX12 written. I work in graphics, so there is overlap with things I've done before here.
• A music language research and development IDE. Initially supporting something like Ixi lang. I have considered Extempore-like functional scripting too.
• A live coding/performance tool. Single install. Code a performance.
Longer term, I'd love to play a bit with VR. I've long been thinking about the idea of building a performance in a 3D space. Everything you see in the tool is rendered using the graphics API, including the text editor (which, though I haven't gone into it much, is designed to support inline graphics to represent code elements). That stuff is pie in the sky though. I have a full-time job, and this project fits around it most of the time; that generally means I have bursts of productivity and then rest, depending on work volume....

Chris Maughan 2020-07-11 12:36:23

Regarding Extempore in particular: it's really cool, and I'd love to support something similar. My recent experience with the Janet language has encouraged me in that regard. I'm already itching to build some kind of scripting into the tool so I can generate some actual music!

Jack Rusher 2020-07-11 12:45:28

I look forward to watching this grow 🙂

Chris Maughan 2020-07-11 12:53:49

Speaking of Extempore ^ 🤯

Chris Maughan 2020-07-11 13:06:07

... and since I've looked at it again today: I had no idea that Extempore has a built-in synth too, in Scheme; that's really cool.