> Putin and Ukraine are in a stalemate. That takes Russia off the table as a near peer to the U.S.
Ukraine, with currently the most capable and experienced military in Europe, supported by Western countries, is losing. Slowly, and while making Russia pay, but losing nonetheless. And if you consider demographics, it kinda lost already. Most people who escaped west won't come back, and many men who were forced to stay will leave soon after they are allowed to.
For the last few decades, US victories have been even less clear, and they were won against countries like Iraq and Afghanistan.
> NATO would have wiped the floor with the Russian military
Considering my interests and those of my country, I would like to believe that, but reality does not provide much support for such hopes.
> and it's surprising considering what a juggernaut everyone claimed the Russian military was pre-war.
It is true, but they have improved immensely during three years of intense conflict (the same goes for Ukraine). On the other hand, most of NATO's experience comes from bombing people in Africa and the Middle East.
A war with Russia wouldn't be the same as the battle of Timbuktu.
Honest take: yes, it's not ready. When I tried it, the generated binary crashed.
For what it's worth, I am able to generate a non-small (>1 GiB) binary with 1.11 that runs on other people's machines. Not shipping in production, but it could be if you're willing to put in the effort. So in a sense, PackageCompiler.jl is all you need. ;)
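For anyone curious, here is a minimal sketch of that workflow, assuming a hypothetical project called MyApp that defines a julia_main entry point (names and paths are made up, and keyword options vary between PackageCompiler.jl versions):

    # Build a relocatable app from the (hypothetical) MyApp project.
    # MyApp/src/MyApp.jl is expected to define julia_main()::Cint as its entry point.
    using PackageCompiler

    create_app("MyApp", "MyAppCompiled"; force = true)

    # The output directory is large (hence the >1 GiB figure above), but
    # MyAppCompiled/bin/MyApp should run on machines without Julia installed.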
Python has a useful and rich ecosystem that grows every day. Julia is mostly a pile of broken promises (it neither reads like Python nor runs like C, at least not without the significant effort required to produce curated benchmarks) and desperate hype generators.
Since you have a rosy picture of Python, I assume you're young. Python was mostly a fringe/toy language for two decades, until around 2010, when a Python fad started, not too different from the Rust fad of today; at some point Google started using it seriously and thought they could fix Python, but gave up eventually. The fad lived on, kept evolving, and somehow found some popularity with SciPy and then ML. I used it a little in the 90s, and I found the language bad for anything other than replacing simple bash scripts, simple desktop applications, or a desktop calculator, and I still think it is (but sure, there are people who disagree and think it is a good language). It was slow and didn't have a type system, you didn't know whether your code would crash until you ran that line of code, and the correctness of your program depended on invisible characters.
"Ecosystem" is not a part of the language, and in any case, the Python ecosystem is not written in Python, because Python is not a suitable language for scientific computing, which is unsurprising because that's not what it was designed for.
It is ironic that you bring up hype to criticize Julia while praising Python, which found popularity thanks to hype rather than technical merit.
What promise are you referring to? Who promised you what? It's a programming language.
> Ecosystem" is not a part of the language, and in any case, the Python ecosystem is not written in Python, because Python is not a suitable language for scientific computing
Doesn't matter. Languages do not matter, ecosystems do, for they determine what is practically achievable.
And it doesn't matter that the Python ecosystem relies on huge amounts of C/C++ code. Python people made the effort to wrap that code, document it, and maintain those wrappers. Other people use such code through Python APIs. Yes, every language with an FFI could do the same. For some reason, none has achieved that.
Even people using Julia use PythonCall.jl; that's how "unsuitable" Python is.
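For illustration, this is roughly all it takes; a sketch assuming PythonCall.jl is installed and that numpy is available to the Python environment it manages:

    # Calling into the Python ecosystem from Julia via PythonCall.jl.
    using PythonCall

    np = pyimport("numpy")          # import a Python module
    x = np.linspace(0, 2pi, 100)    # call its functions directly
    y = np.sin(x)

    # Convert back to a native Julia vector when one is needed.
    yj = pyconvert(Vector{Float64}, y)
    println(sum(yj))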
> What promise are you referring to? Who promised you what? It's a programming language.
Acting dumb is a poor rhetorical strategy, and it ignores such useful rhetorical advice as the principle of charity: it is quite obvious that I didn't mean that the programming language itself made any promise. Making a promise is something only people can do. And Julia's creators and the people promoting it have made quite bombastic claims over the years that turned out not to have much support in reality.
I leave your assumptions about my age or other properties to you.
Ecosystems matter, but runtimes do as well. Take Java, for instance. It didn't have to wrap C/C++ libraries, yet it became synonymous with anything data-intensive, from Apache Hadoop to Flink, from Kafka to Pulsar. Sure, this is mostly ETL, streaming, and databases rather than numeric or scientific computing, but it shows that a language plus a strong ecosystem can drive a movement.
This is why I see Julia as the Java for technical computing. It’s tackling a domain that’s more numeric and math-heavy, not your average data pipeline, and while it hasn’t yet reached the same breadth as Python, the potential is there. Hopefully, over time, its ecosystem will blossom in the same way.
If what determines the value of a language is its libraries (which makes no sense to me at all, but let's play your game), then it is one more argument against Python.
You don't need an FFI to use a Fortran library from Fortran, and I (and many physicists) have found Fortran better suited to HPC than Python since... the day Python came into existence. And no, many other scripting languages have wrappers, and no, scientific computing is not restricted to ML, which is the only area where Python can be argued to have the most wrapper libraries for external code.
Language matters, and the two-language problem is a real problem; you can't make it go away by covering your ears and chanting "doesn't matter! doesn't matter!"
Julia is a real step toward solving this problem, and it allows you to interact with libraries/packages in ways that are not possible in Python + Fortran + C/C++ + others. You are free to keep pretending that the problem doesn't exist.
You are making disparaging and hyperbolic claims about hyperbolic claims without proper attribution, and when asked for a source, you cry foul and sadly try to appear smart by saying "you're acting dumb". You should take your own advice and, instead of "acting dumb", explicitly cite what "promises" or "bombastic claims" you are referring to. This is what I asked you to do, but instead of doing it, you are doing what you are doing, which is interesting.
> If what determines the value of a language is its libraries (which makes no sense to me at all, but let's play your game), then it is one more argument against Python
The fact that you can use those nice numerical and scientific libraries from a language that also has a tremendous amount of nice libraries from other domains, wide and good IDE support, excellent documentation, and countless tutorials and books available... is an argument against that language? Because you can easily use Fortran code from Fortran?
Nice.
> You don't need FFI to use a Fortran library from Fortran
Wow. Didn't know that.
> And no, many other scripting languages have wrappers,
Always less complete, less documented, with fewer teaching materials available, etc.
But sure, many other languages have wrappers. Julia, for example, wraps the Python API.
> and no, scientific computing is not restricted to ML
Never said it is. I don't do ML, by the way.
> You are making disparaging and hyperbolic claims about hyperbolic claims without proper attribution, and when asked for source, you cry foul
Yeah, yeah. My claims about marketing like "Julia writes like Python, runs like C" are hyperbolic and require explicit citation, even though everyone who has had any exposure to this language knows this and similar catchphrases.
Look, you like Julia, good for you. Have fun with it.
In the early aughts, educators loved the shit out of Python because "it forced kids to organize their code with indentation". This was about a decade before formatting linters became low-key required for languages.
There is a quite obvious reason that Israel does what it does: committing an equivalent of mass child sacrifice is quite good at uniting Israelis, even if a lot of them protest against it.
If you are hated by everyone outside your tribe, you will stick with your tribe, because you have lost other options.
As always in such threads, I see people commenting that Windows is dead and Linux is all you need, heck, is straight-up better.
Of course HN is a bubble, like every other place like this, but sadly, I would argue that this is a mindset that is pretty common among Linux users and that holds Linux-native alternatives back. For those saying that Linux is already better than Windows at everything, there is no incentive to work towards actually making it better. When emacs is just as good as VS for those people, when Linux gaming that still depends on Windows APIs is considered a great success, when FreeCAD or OpenSCAD in their eyes lack nothing compared with professional CAD software, etc., then you know you are seeing a bubble that will burst sooner or later.
I suspect that in 2025 even a project like FreeCAD would not happen, because today people for some reason believe it is fine to move away from an OS that does not respect user agency and privacy, and to use web-based apps that do... exactly the same, but they are not from MS, so that's fine, I guess. For some reason, Windows requiring an internet connection is a bad thing, but running Linux and relying on a bunch of web apps that also require an internet connection is good.
Celebrating WINE, Proton, and SteamOS as victories still baffles me, because the fact that the FOSS and Linux world couldn't create real alternatives and instead had to become good at pretending to be Windows is simply a failure.
I guess the downvotes come from people who believe vim + grep + printf debugging is the peak development environment. Quite amazing that they even go for something as advanced as vim instead of sticking with ed, for I believe there exists some Linux user claiming that ed doesn't lack anything VS has.
No, you're just completely ignorant. You can trivially set breakpoints, use conditional breakpoints, watch variables, and step over, through, and into in exactly the same way.
Hell, even raw-dogging lldb directly on the CLI is incredibly user-friendly, fast, and has a ton of features you wish were more exposed by common IDEs. Don't feel like debugging right now? Take a heap snapshot and do it later! You don't even need to launch the process.
Visual Studio is ridiculously overrated, and this is coming from someone who works at Microsoft and is forced to use it every day. What really kills me are the insanely complicated and unmodifiable shortcut keys for common tasks. Killing the process is some finger-breaking ctrl+alt+function-key nonsense? Seriously, wtf? Oh, and debugging multiple binaries simultaneously in the same solution requires launching multiple instances of the entire IDE? Why??
Can you share what the experience is like debugging with gdb directly?
I'm new enough that my first debugger experience was Visual Studio, and I currently use IntelliJ IDEs, which provide a similar experience. That experience consists of: setting breakpoints in the gutter of the text editor, visually stepping through my source files line by line when a breakpoint is hit, with a special-purpose pane in the IDE visible showing the call stack and the state of all local variables (and being able to poke at that state at any point higher up in the stack by clicking around the debug pane), and being able to execute small snippets of code manually to make evaluations/calculations based on the current program state.
I'm not so naive to believe that effective debugging tools didn't exist before GUIs became commonplace, but I have a hard time seeing how anything TUI-based can be anywhere near as information-dense and let you poke around at the running program like I do with my GUI-based IDEs.
(Pasting this comment under a few others because I genuinely want to hear how this works in the real world!)
I much prefer lldb over gdb, but why don’t you just try it and see for yourself?
Of course setting a gutter breakpoint is easier in an IDE, and that's irrelevant to my point. OP made this about vim/emacs versus Visual Studio as if the former doesn't have gutter-clicking capabilities. Which is ridiculous.
The keyboard layout change is not on my version (dogfood) for some reason, maybe because I have to use Remote Desktop and it doesn’t detect a physical keyboard. But fine, I’ll take that back. I even asked AssPilot for help and it was predictably useless.
And c'mon, modify the registry to debug multiple processes? People work together in teams and share common tooling that ideally tries to minimize the friction required to get work done. Think about that while contrasting the steps required in that article with the alternative of "launch the app a couple more times, then…"
* "Sometimes, you might need to debug the startup code for an app that is launched by another process. Examples include services and custom setup actions"
Starting multiple copies of the IDE wouldn't handle these scenarios either
Can you share what the experience is like debugging with gdb directly?
I'm new enough that my first debugger experience was Visual Studio, and I currently use IntelliJ IDEs, which provide a similar experience. That experience consists of: setting breakpoints in the gutter of the text editor, visually stepping through my source files line by line when a breakpoint is hit, with a special-purpose pane in the IDE visible showing the call stack and the state of all local variables (and being able to poke at that state at any point higher up in the stack by clicking around the debug pane), and being able to execute small snippets of code manually to make evaluations/calculations based on the current program state.
I'm not so naive to believe that effective debugging tools didn't exist before GUIs became commonplace, but I have a hard time seeing how anything TUI-based can be anywhere near as information-dense and let you poke around at the running program like I do with my GUI-based IDEs.
(Pasting this comment under a few others because I genuinely want to hear how this works in the real world!)
Some emacs fans really like emacs and will invent any justification for its shortcomings. You are 100% right that it has a subpar debugging experience. There were better debuggers 20 years ago than emacs has now.
Stallman himself wrote it, so it lies at the intersection of that camp and the lisp cultists (though I guess they are mostly extinct post-LLM), but they used to have a really strong belief that lisp was the path to AI because of its homoiconicity.
What should be said in its favor is that, due to its architecture, it is crazy extensible and hackable. And the fact that the line between configuration and code is very blurry really encourages you to dive into that.
The choice of lisp also helps ensure user freedom, as it's quite a simple language, which ensures that compilers and interpreters are a commodity. If you don't like one, pick another. Contrast that with, say, Rust, where if you don't like the official Rust you are shit out of luck. It's also a rolling-release deal, so you can't even easily stay on an old version.
Maybe it's personal preference? I like gdb directly better than VC.
I’ve tried to debug with VC, but I felt slow working with it. After several tries, I gave up.
Most of my colleagues never use a debugger even though they use vscode. I (the weird emacs user) actually had to show them how to use one, but they still don't.
Are they actually programmers? Or just people who pretend to know how to code? How can you be a professional programmer and not use a debugger? Also, not sure what VS Code has to do with it; it's not Visual Studio proper.
I know plenty of professional programmers (their job title says so) who not only do not use a debugger, but many don't even know how to install or use one, or even know the very concept of "execute step by step". Plenty of Python users don't know what pdb is (as a matter of fact, I have never met one who does!). The same goes for plenty of embedded developers writing C or C++ or Java.
They go around all the time adding hundreds of print(f) or log_* function calls. Often they don't care to remove them after the fact when I ask them to; the usual answer is "they can/will be useful to detect future bugs".
I'm in the automotive industry, which is known to be a disaster when it comes to software, but I think it is also common in other industries. I have seen it in telco already.
While I agree that knowing a debugger is important, and as a leader I won't hire somebody who doesn't use one, it is a fact that many people don't use one and are doing OK.
Last but not least, it must be said that sometimes you have to fall back to prints: in fact, yesterday I had to, as I was debugging a library with sockets, which would time out pretty quickly. I used dprintf in gdb, but the advantage over simple prints was not huge.
> Last but not least, it must be said that sometimes you have to fall back to prints
Well yes, obviously: it's an indispensable tool in any arsenal. I just cannot fathom (as a low-level C++ engineer) how someone can be a professional programmer, paid for their job, and not know how to use a debugger even just to do a basic pause-and-step-through flow. But then again, I don't work with any Python programmers, so maybe that's why.
They managed to grow a career out of a minimal set of skills; printf was enough, I guess. They also leverage stupid IT shops where the squeaky wheel gets the grease... being efficient at debugging would almost prove harmful in their world.