I learn new tools and platforms all the time, but editing needs muscle memory and I don't want to abandon that every five years for the new hotness. IDEA is just starting to show some longevity for Scala and Kotlin.
https://en.wikipedia.org/wiki/Charlie_Ayers won a cook-off and made the Googleplex well known for food. Employees brought their kids for dinner. People from other companies angled for invites to lunch. I once realized I was in line behind Vint Cerf (Vice President of Inventing the Internet). Ayers moved on, but I think Building 43 mid-campus still hosts the enormous "Charlie's Café."
For a while, another building was notorious for serving sushi but admitting only Android developers, because Andy Rubin was paying for it himself.
The only final classes I can remember are things like java.lang.String, which needed to be immutable (and not subclassable) so a SecurityManager could safely consume them for policy decisions.
Why AI water use? It looks like Zambia's problem is mostly having water that isn't potable. If it's too dirty for heat exchangers, wouldn't they use closed-loop heat pumps, maybe with some exotic working fluid? I even see a few mentions elsewhere of distillation using waste heat (though you probably end up with sludge to scrub out and dump).
Syntax-aware tools always have issues when a team extends the base language to fit their problem. Rust has macros. People started using "go generate" for stuff like early generics. Does Mergiraf take EBNF or plugins, or does a team fork it to teach it their syntax?
Yeah, at the moment it just supports whatever the tree-sitter parser accepts, period. A bring-your-own-grammar version could be interesting; I don't see why it couldn't work. Do you have any Rust crates to recommend for parsing against a grammar supplied by the user at run time? It's likely to be slower, but maybe not prohibitively so…
Another approach would be for the tool to accept doing structured merging even if there are error nodes in the parsed tree. If those errors span the parts of the file where the extended language is used, then the tool could still help with merging the other parts, treating the errors as atomic blocks. I'd be a bit reluctant to do that, because there could be errors for all sorts of other reasons.
Since tree-sitter parsers compile to a C library, you could dynamically load it.
The Rust bindings themselves are a thin FFI wrapper.
If you wanted to make it a little smoother than compiling the tree-sitter grammar yourself, you could compile and bundle grammars as WASM, so it's sandboxed and cross-platform.
There's some pressure toward matching: nondiscrimination tests impose penalties if only "key" or "highly compensated" employees are getting a lot matched or choose to contribute a lot to the plan.