Howard asks: has anyone managed to programmatically connect to the #npu on their Surface Laptop?
Note that most of the examples online of people claiming to do this in fact *fail* to connect and actually use the fallback mechanisms.
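For anyone attempting this, here's a minimal sketch of one way to make the failure visible rather than silent. It assumes a Snapdragon-based (Copilot+) Surface Laptop, the Microsoft.ML.OnnxRuntime.QNN package, and a model at "model.onnx" (all assumptions on my part, not a verified recipe): ask for the QNN (Hexagon NPU) execution provider and disable CPU fallback, so session creation throws instead of quietly running on the CPU.

```csharp
using System;
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

// Sketch only: try to load a model on the NPU via the QNN execution provider
// and refuse to fall back to the CPU provider. The package name, provider option
// and config key are taken from the ONNX Runtime docs; adjust for your device.
using var options = new SessionOptions();

// Without this, unsupported operators are silently assigned to the CPU provider,
// which is exactly the "it worked, but not on the NPU" trap.
options.AddSessionConfigEntry("session.disable_cpu_ep_fallback", "1");

// "backend_path" selects the Hexagon HTP (NPU) backend shipped with the QNN package.
options.AppendExecutionProvider("QNN", new Dictionary<string, string>
{
    ["backend_path"] = "QnnHtp.dll",
});

try
{
    using var session = new InferenceSession("model.onnx", options);
    Console.WriteLine("Session created on the QNN (NPU) execution provider.");
}
catch (OnnxRuntimeException ex)
{
    // With fallback disabled, anything the NPU can't run surfaces here instead
    // of being quietly executed elsewhere.
    Console.WriteLine($"Could not place the model on the NPU: {ex.Message}");
}
```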
@sotolf "remembering how to do Floyd-Warshall"
What do you think is the most common mistake when users engage with System.Text.Json? You're probably wrong.
A new blog post
https://blog.json-everything.net/posts/a-common-problem/
Please remember to tip your maintainer.
https://github.com/sponsors/gregsdennis
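No spoilers for the post, but one pitfall that comes up constantly (whether or not it's the one Greg has in mind) is newing up a JsonSerializerOptions for every call. The options instance caches per-type serialization metadata, so recreating it each time throws that cache away; the usual fix is to create it once and reuse it, roughly like this:

```csharp
using System.Text.Json;

public static class Json
{
    // Create the options once and reuse them everywhere: JsonSerializerOptions
    // caches per-type metadata, and a fresh instance per call discards that cache.
    private static readonly JsonSerializerOptions Options = new(JsonSerializerDefaults.Web);

    public static string Serialize<T>(T value) =>
        JsonSerializer.Serialize(value, Options);

    public static T? Deserialize<T>(string json) =>
        JsonSerializer.Deserialize<T>(json, Options);
}
```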
@poppastring Maybe not financially, but it has clearly damaged his ego; it reflects an unpalatable version of Bezos back at himself. The damage limitation statements are not to shore up WaPo, but to try to limit damage to Bezos's own reputation.
@dazfuller @badcel Thanks, but this isn't a "general case" problem. I'm specifically talking about the dotnet/runtime build (as in mscorlib etc.)
Anyone in #dotnet land know how to persuade the dotnet/runtime build to emit code coverage?
Changing directory to the target test project (in this case System.Text.Json.Tests) and running
dotnet build /t:Test /p:Coverage=true /p:BuildTargetFramework=net10.0
did not seem to work; it emitted a coverage report, but there's no data in it (and it doesn't include any entries at all for the JsonMarshal.* methods, which are in the assembly but not in the STJ namespace)
I asked Stephen Toub to look at the binlog loading perf, and he found a few nice improvements:
Loading time for a huge binlog improved from 2:34 to 2:06, and we could get to 1:42 if we convert the viewer to .NET 8
@KirillOsenkov This is another good moment to point out that folk who are still on .NET 4.x for reasons of inertia should seriously consider migrating to .NET 8.x just for the perf wins (especially if you rent compute).
The biggest pain points will be legacy dependencies - especially on things like commercial control libraries. There's usually a supported path, but you may need to hop through a few interim .NET versions to get from where you are to their V.current.
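If you want to put numbers on that before committing, a BenchmarkDotNet harness that runs the same code on each runtime makes the comparison concrete. This is only an illustrative shape with a stand-in LINQ workload (the class and method names are mine); multi-target the benchmark project, e.g. <TargetFrameworks>net48;net8.0</TargetFrameworks>, and BenchmarkDotNet spawns a process per runtime job.

```csharp
using System;
using System.Linq;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Jobs;
using BenchmarkDotNet.Running;

[MemoryDiagnoser]
[SimpleJob(RuntimeMoniker.Net48)] // baseline: .NET Framework 4.8
[SimpleJob(RuntimeMoniker.Net80)] // candidate: .NET 8
public class RuntimeComparison
{
    private int[] data = Array.Empty<int>();

    [GlobalSetup]
    public void Setup()
    {
        var rng = new Random(42);
        data = Enumerable.Range(0, 100_000).Select(_ => rng.Next()).ToArray();
    }

    // Stand-in workload; swap in a hot path from your own code.
    [Benchmark]
    public int SortAndSum() => data.OrderBy(x => x).Take(1_000).Sum();
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<RuntimeComparison>();
}
```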
There's a Lego Ideas submission of Clippy... It needs 3800 or so more votes to get a chance of being a Lego set. #microsoftclippy
https://ideas.lego.com/projects/5d1d2c7d-ac10-457b-b0ec-e4740f61db39
Don't forget that most of that aid is in the form of either existing physical equipment or loan guarantees.
That, of course, is a reason it should not be used as an excuse for cutting other social provision - it isn't the same sort of spending at all.
But we can't compare the two directly.
Curses. I was going to submit a session on the perf improvements we get from .NET 9 for #dotnet conf, but the CFP closed 10 hours ago!
If anyone would be interested in such a session, maybe I'll organise a short online seminar or something?
Ping me a DM, reply here, or email matthew.adams@endjin.com if you think that would be interesting.
We've just shipped Preview 10 of Corvus.JsonSchema, our low-allocation, high performance code generator and validator for #json and #jsonschema on #csharp and #dotnet.
This preview is preparing for the .NET 9.0 wave, and takes a dependency on the RC1 preview System.Text.Json libraries. Don't worry, it still supports netstandard2.0 and net8.0!
If you run on .NET 8, you get a 13% perf improvement on our "typical usage" benchmark.
On .NET 9 there's a whopping 32% perf uplift.
https://www.nuget.org/packages/Corvus.Json.JsonSchema.TypeGeneratorTool#versions-body-tab
@marquisdegeek and the disappointing thing about that is that it will probably kill a whole lot of great work that isn't really AI (e.g. agentic workflows, semantic graphs, etc.), not to mention prompting a withdrawal of investment from genuinely useful projects that labelled themselves AI to get funded in the first place.
I am concerned for product teams at e.g. Microsoft that will get the blame for the lack of ROI in AI when they were forced into it by senior management (and their products aren't AI at all!)
@marquisdegeek o1 won't burst the bubble because they've already got out the "it's superficially impressive but on extended use you will find it isn't quite so good" message. They've also tacked on "but just you wait for GPT5".
So the shills are out with their "it's amazing! It can count the Rs in Strawberry! It can auto-generate an app that does a particle-tracking animation slightly more reliably than the last one I said was amazing at doing that too!"
GPT5 will presage the AI winter.
📰 Performance Improvements in .NET 9
Take an in-depth tour through hundreds of performance improvements in .NET 9.
https://devblogs.microsoft.com/dotnet/performance-improvements-in-net-9/ #dotnet
@pzmyers In a statement today, Darth Vader has said that "the Emperor has gone a bit too far, and I will be voting for Mon Mothma this time around."
I'm desperately seeking information about this chip.
It is a custom #Psion RS232 IC used in the Series 3c, codenamed CONDOR.
If you were involved in the development of the 3c or knew someone who worked for Psion around 1994-1997 who might have been involved, please get in touch.
We want to emulate this chip in #MAME so that it can be added to the existing Psion SIBO hardware #emulation.
This could also help with 3mx serial emulation.
@merseymal solid work! 💜
As I'm back in Paris this week - was anyone else at the #commodore #amiga DevCon in #Paris in Feb 1990?