Part of @r_channels @reddit2telegram Broadcasts messages from the Julia programming language subreddit: https://www.reddit.com/r/Julia/ For discussions on Julia language, join @julialang_org
Binet Function | Complex Fibonacci
https://redd.it/1t4g9vl
@r_Julia
Sublime for Julia?
Hey, I just started learning Julia and already know Python. I have always preferred Sublime over VS Code. What would you all recommend? Thanks!
https://redd.it/1t3jjpu
@r_Julia
Need some help regarding variable scopes
M = construct(A, b, c, z0) # Construct the tableau
feasiblesolutions = zeros(1, size(M,2) - 2)
counter = 0
pivotlocation = [1 size(M,2)-(size(A,1)+1); # initial pivot location: column 1 is the row, column 2 is the column. NEEDS IMPROVEMENT
                 2 size(M,2)-size(A,1)]
while sum(choose(M)) != 0 # choose(M) is set up to return (0,0) when no more viable entering variables are present
    for i = 1:size(A,1) # update pivotlocation each iteration
        if choose(M)[1] == pivotlocation[i,1]
            pivotlocation[i,2] = choose(M)[2]
        end
    end
    global M = pivot(M, choose(M)) # repeat the pivot operation until there are no more valid pivots
    for i = 1:size(A,1)
        feasiblesolutions[1, pivotlocation[i,2]] = M[pivotlocation[i,1], end] # update feasible solutions
    end
    global counter += 1
    println(pivotlocation)
    println(counter, " ", feasiblesolutions)
end
I wrote some code like the above (it's terrible), and I would like the vector feasiblesolutions to "reset" after every iteration of the while loop, i.e. return to the zero vector it was assigned globally before the loop. I think it has something to do with variable scopes, but I can't wrap my head around it even after reading some documentation.
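For reference, a minimal sketch of the scoping rule in question (hypothetical names, not the tableau code): at the top level of a script, rebinding a global inside a `while` body requires `global`, and "resetting" the vector is just a rebind (or `fill!`) at the top of the body:

```julia
# Minimal sketch of global scope inside a top-level while loop:
# rebinding a global needs `global`; mutating an existing array does not.
feasiblesolutions = zeros(1, 3)   # global binding
counter = 0

while counter < 2
    global counter += 1
    # Reset the vector at the start of every iteration by rebinding it
    # (fill!(feasiblesolutions, 0) would also work, without `global`):
    global feasiblesolutions = zeros(1, 3)
    feasiblesolutions[1, counter] = counter   # mutation: no `global` needed
end

println(feasiblesolutions)   # only the last iteration's entry survives the reset
```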
https://redd.it/1sx9ubr
@r_Julia
[Learning Julia] Short code review for best practice and idiomatic Julia
I'm starting to learn Julia coming from a background in R and a small amount of C++. Is anybody willing to do a short code review so I make sure I'm on the right track with best practice and idiomatic Julia code?
I wrote a [very short module](https://github.com/lawlerem/learning_julia/blob/main/Interval.jl) defining intervals and basic operations such as intersections, unions, and set differences. I don't need a review of the actual function logic, just style. One specific thing that I know could be done better is how I'm dealing with EmptyIntervals. My goal is to learn Julia, so if any of my questions are asking the wrong thing, please let me know!
* Is a call to Interval() that returns an EmptyInterval instead of an Interval ok?
* In the definition of e.g. Base.intersect(I::AbstractInterval...) I check if any of the inputs are an EmptyInterval, and if so I return early. I feel like the more idiomatic way to do this is dispatching a different method if there is an EmptyInterval. Is this possible? I have a feeling traits could do this but would be overkill.
* union and setdiff are both somewhat type-unstable because they return Vector{Union{Interval, EmptyInterval}}. It seems like the Union type here is unavoidable due to the nature of the problem. Is this an acceptable place to give up strict type-stability?
I have mostly avoided looking at the pre-existing Intervals.jl because I wanted to struggle through it myself. I did look at how they define empty intervals where they check if the right endpoint is less than the left endpoint. I don't want to use this solution because in my mind empty intervals don't have left or right endpoints. Thanks for your time!
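On the second question above, dispatching on an EmptyInterval type is indeed possible; here is a hedged sketch (type names mirror the post, not the linked module, and only the pairwise case is shown — the varargs method would still need a check or a recursive splat):

```julia
# Sketch: let dispatch pick a specific method when any argument is empty,
# instead of checking for emptiness inside the method body.
abstract type AbstractInterval end

struct Interval <: AbstractInterval
    lo::Float64
    hi::Float64
end

struct EmptyInterval <: AbstractInterval end

# The more specific methods win whenever an EmptyInterval is involved;
# the (Empty, Empty) method resolves the would-be ambiguity.
Base.intersect(::EmptyInterval, ::AbstractInterval) = EmptyInterval()
Base.intersect(::AbstractInterval, ::EmptyInterval) = EmptyInterval()
Base.intersect(::EmptyInterval, ::EmptyInterval)    = EmptyInterval()

function Base.intersect(a::Interval, b::Interval)
    lo, hi = max(a.lo, b.lo), min(a.hi, b.hi)
    lo <= hi ? Interval(lo, hi) : EmptyInterval()
end

println(intersect(Interval(0.0, 2.0), Interval(1.0, 3.0)))  # Interval(1.0, 2.0)
println(intersect(Interval(0.0, 2.0), EmptyInterval()))     # EmptyInterval()
```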
https://redd.it/1stnycb
@r_Julia
Numerical computing in my pocket
https://redd.it/1spm4g6
@r_Julia
Julia, VS Code, and notebook environment recommendations?
Hi, I am coming from using Python + Jupyter Notebook on VS Code.
I've tried Pluto, but I didn't like having to open a separate browser tab on localhost to run it.
Is anyone using Jupyter notebooks with a Julia kernel inside VS Code, the same way you would write Python in a Jupyter notebook there?
If so, are there any limitations I should be aware of compared to using Pluto?
I guess I am also open to switching editors if there is one that supports both Jupyter and Pluto in-house.
Thank you!
https://redd.it/1sltbli
@r_Julia
KeemenaPreprocessing.jl v0.1.2 (now with subword tokenization)
I just released KeemenaPreprocessing.jl v0.1.2.
`https://github.com/mantzaris/KeemenaPreprocessing.jl`
It is a Julia package for NLP preprocessing and corpus preparation, and now it has first-party subword support through KeemenaSubwords.jl.
What it can do now:
\- standard text preprocessing and corpus preparation
\- word-level and generic tokenization workflows
\- first-party subword tokenization from the same package entry point
\- tokenizer-native subword ids or bundle-reindexed subword vocabularies
\- streaming preprocessing for larger corpora
\- access to subword offsets, masks, token type ids, and metadata
\- streaming support, so subword preprocessing is not limited to tiny in-memory examples and fits into real corpus preparation workflows
That means it can now serve as a more complete Julia NLP preprocessing pipeline for modern model workflows.
High-level use cases:
\- preparing text corpora for LLM training
\- subword tokenization from the same package entry point
\- streaming preprocessing for larger datasets
\- preserving tokenizer-native ids or rebuilding a corpus vocabulary
\- exposing offsets, masks, and tokenizer metadata
\- still allowing explicit low-level control through `KeemenaSubwords.jl`
https://redd.it/1sityji
@r_Julia
Calculations in Julia in my wikibook
Hi, I'm having fun posting calculations in Julia in my wikibook: https://en.wikibooks.org/wiki/Scientific_Calculations_with_Julia . What other calculations could I post?
https://redd.it/1s6qar3
@r_Julia
JulIDE - an IDE for the Julia programming language
Hi all!
I'd like to share julIDE, a modern IDE for Julia built with Tauri, React, and Rust.
Features:
\- Monaco editor with LSP based on LanguageServer.jl
\- an integrated debugger
\- Revise.jl integration
\- Pluto.jl support
\- Git integration
It's in beta, so expect bugs, but feedback is very welcome!
GitHub: https://github.com/sinisterMage/JulIde
https://redd.it/1s1b0ev
@r_Julia
Julia native compilation is here?
So how is it going with native compilation: are we already there, or is it still missing?
https://redd.it/1rysf8j
@r_Julia
BenchmarkTools and JIT Compilation
Hello,
I'm new to Julia, and I'm currently trying to use it to measure an algorithm's performance (along with a few other languages). I want to use `@benchmark` from BenchmarkTools and then get the mean and/or median and any other data I want. I was wondering whether BenchmarkTools automatically includes a warmup run for JIT compilation? For example, I believe MATLAB's timeit() documentation specifically mentions that first-time compilation costs are taken into account.
I didn't find anything in the BenchmarkTools documentation explicitly mentioning JIT compilation cost and whether `@benchmark` automatically does a warmup to exclude the first-time cost, so I was wondering if anyone here knows?
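For what it's worth: as far as I understand BenchmarkTools, `@benchmark` runs a tuning phase that executes the expression before any sample is collected, so first-call JIT compilation is paid up front and excluded from the reported statistics. A minimal sketch:

```julia
using BenchmarkTools, Statistics

# @benchmark first tunes evaluation counts by running the expression,
# which triggers compilation before any timed sample is recorded.
x = rand(1000)
trial = @benchmark sum($x)   # $ interpolates x to avoid global-access overhead

println(minimum(trial))      # TrialEstimate; times are in nanoseconds
println(median(trial))
println(mean(trial))
```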
https://redd.it/1rumjol
@r_Julia
I linked with Raylib in Julia; it wasn't that hard!
I wanted to explore how to interop with C, so I tried making some graphical "applications" using the C GUI library Raylib. It was not as hard as I thought it would be, and you can actually do some cool things with it. The coolest thing I managed was running the event loop in a background thread and then changing the values of variables that control things like colors and locations interactively through the REPL.
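For anyone curious, the interop mechanism itself is tiny. A sketch using libc's `strlen` instead of Raylib, so it runs anywhere without installing the library: `@ccall` maps a C function signature onto a Julia call.

```julia
# @ccall calls a C function by name with annotated argument and return types.
# With no library prefix, the symbol is looked up in the current process (libc).
len = @ccall strlen("raylib"::Cstring)::Csize_t
println(len)  # 6

# For a shared library like Raylib you would prefix the symbol with its
# library path, e.g. @ccall "libraylib".SomeFunction(...)::ReturnType
```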
The code is located on my GitLab: https://gitlab.com/ofsaltandwater-group/ralib-in-julia/
Note that I'm just playing around, and I think there might be some memory safety issues with the code, but it seems to work, at least for me. Note that you must install Raylib to run the code, and that you might have to change the path to Raylib in RayLib.jl.
https://redd.it/1ry1z78
@r_Julia
Function Interpolation
Hey, sorry if this is the wrong place to ask. I wanted to ask whether Julia has any packages that do function interpolation with cubic splines and other methods (like linear interpolation, though the latter I can probably do manually).
Edit: I found that Interpolations.jl does not have all the methods that may be needed for certain fields, so I found another one, Dierckx.jl, which is very useful.
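A minimal sketch of the cubic-spline route via Dierckx.jl as mentioned in the edit (assumes the package is installed; `Spline1D` with the default smoothing factor passes through the data exactly):

```julia
using Dierckx  # assumes Dierckx.jl is installed

# Fit a cubic (k = 3) interpolating spline through sampled points.
x = collect(0.0:1.0:5.0)
y = sin.(x)
spl = Spline1D(x, y; k=3)

println(spl(2.0))   # reproduces the data at the knots
println(spl(2.5))   # evaluates between knots
```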
https://redd.it/1rq1p9a
@r_Julia
Beginner advice on making a package
Hello there, for all intents and purposes I'm a beginner in programming and in Julia. But I would like to try to build a package with different option pricing formulas/models, basically to learn more about Julia (and options). As a beginner I have way too high ambitions, but how can I make this package as robust as possible in terms of best practices, and what pitfalls should I try to avoid so it's less cumbersome to maintain and change in the future?
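One concrete starting point (a sketch using the standard library's Pkg; `OptionPricing` is just a placeholder name): generate the conventional package skeleton first, then grow tests and compat bounds inside it.

```julia
using Pkg

# Pkg.generate creates <name>/Project.toml (name, UUID, version) and
# <name>/src/<name>.jl. Done here in a scratch directory to keep things tidy.
dir = mktempdir()
cd(() -> Pkg.generate("OptionPricing"), dir)

println(readdir(joinpath(dir, "OptionPricing")))  # Project.toml, src

# Conventional layout to grow into:
# OptionPricing/
# ├── Project.toml        # add [compat] bounds for every dependency
# ├── src/OptionPricing.jl
# └── test/runtests.jl    # run with Pkg.test() from the activated project
```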
https://redd.it/1rprh0l
@r_Julia
Did Parquet2.jl vanish from the package registry?
https://redd.it/1rou3kj
@r_Julia
Gym.jl - Gymnasium RL Environments in Julia
Hello everyone!
I've re-implemented some of the Gymnasium RL environments in Julia. As you might expect, they're much faster than the original ones!
https://i.redd.it/80um01qa1bzg1.gif
If you want to take a look at the code, here's the GitHub repository: https://github.com/scascino4/Gym.jl
I hope you'll find them useful!
https://redd.it/1t4cqks
@r_Julia
Color code change in VSCodium with Julia extension after update?
VSCodium just updated to version 1.116.02821 on my PC, and I noticed the color code of text displayed in *.jl files is different (and more difficult to read for me) as soon as it loaded and showed the notification. Running Julia code seems to work, but I haven’t tested extensively yet. Julia is the only language I use in VSCodium, so unfortunately I’m not sure if this issue is actually specific to Julia or not.
Did the update break something about the Julia language extension? Did they just change the color code? Does anyone else have this issue?
https://redd.it/1sxmz6a
@r_Julia
Frankenstein.jl (help)
https://github.com/Jelterminator/Frankenstein.jl
https://redd.it/1su9q2e
@r_Julia
Size of IOBuffer in characters
Hi all,
I have an IOBuffer with some data; by construction the data are stored as a Vector{UInt8}. Now I would like to infer the length of the data in characters. I know that if all the characters are ASCII, I can just take buffer.size and that is the total number of characters in my buffer. However, when the data contain multi-byte UTF-8 characters, that would overestimate the actual number of characters. I could convert the data into a String and then take its length, but that would allocate memory, which I would like to avoid if possible.
Is there some trick I could use to achieve my goal?
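One possible trick, assuming the buffer holds valid UTF-8: every character starts with exactly one byte that is not a continuation byte (0b10xxxxxx), so counting those bytes gives the character count with no allocation. Caveat: `buf.data` and `buf.size` are internal IOBuffer fields, not public API, and `nchars` is a name made up for this sketch.

```julia
# Count characters in an IOBuffer's contents without allocating, by counting
# UTF-8 lead bytes (any byte whose top two bits are not 0b10).
function nchars(buf::IOBuffer)
    n = 0
    @inbounds for i in 1:buf.size
        n += (buf.data[i] & 0xc0) != 0x80
    end
    return n
end

buf = IOBuffer()
write(buf, "héllo ∀x")   # 8 characters, 11 bytes
println(nchars(buf))      # 8
```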
https://redd.it/1ssg1eq
@r_Julia
Making a spaceship fluid flow simulation game with Julia called RocketPlumber
https://redd.it/1spdnlz
@r_Julia
Want to learn julia for free
I want to learn Julia for free. I prefer reading text to watching videos, and I would prefer exercises that test my knowledge and force me to write code. Please recommend a source.
https://redd.it/1sjmo1t
@r_Julia
Julia demo to estimate throughput and power of compute engines on Apple Silicon (CPU, GPU and AMX)
I put together a small Julia demo to run the same matrix multiplication across different compute engines on Apple Silicon:
* CPU (LinearAlgebra)
* GPU (Metal.jl)
* AMX (AppleAccelerate)
What’s nice is that the code barely changes — it’s mostly the array type / backend that determines where it runs.
using LinearAlgebra, BenchmarkTools, Metal
N = 16384
A = rand(Float32, N, N); B = rand(Float32, N, N); C = similar(A)
# GPU
a = MtlArray(A); b = MtlArray(B); c = MtlArray(C)
@benchmark mul!($c, $a, $b)
Then:
# CPU
@benchmark mul!($C, $A, $B)
# AMX
using AppleAccelerate
@benchmark mul!($C, $A, $B)
I also looked at power behavior using mactop + a wall meter — which led to some interesting observations about how efficient the different compute engines are.
Full walkthrough here: [https://youtu.be/HX1B0tlODvY?si=7fZ8HzBG7Ya5LrqS](https://youtu.be/HX1B0tlODvY?si=7fZ8HzBG7Ya5LrqS)
Curious if others have experimented with Metal.jl performance vs CPU/AMX
On my Mac Studio M4 Max (40 GPU cores):
**GPU workload:**
\~183W System DC power (177W delta idle)
\~13 TFLOPS compute throughput
**CPU workload:**
\~122W System DC power.
\~1.3 TFLOPS compute throughput
**AMX workload:**
\~ 39W System DC power
\~ 3.9 TFLOPS compute throughput
https://redd.it/1s6wo6p
@r_Julia
Neovim as a main editor
Greetings, I have been working with Julia for a while and it has been a lot of fun. Neovim is my preferred editor, but once in a while I need to work with Julia in it, and that has not been great. Has anybody successfully set up an IDE-like experience for Julia in Neovim? I only need a proper LSP for formatting and suggestions. LanguageServer.jl seems a little slow and inconsistent. I should also mention that I normally use Nix flakes to set up my environment; I'm not sure whether that affects performance.
https://redd.it/1s6pa2n
@r_Julia
Timing compatible with Optim?
Hello, I'm optimizing something with Optim.optimize(), and I wanted to diagnose which parts are most time-consuming. Annotating with either vanilla `@time` or `@timeit` from TimerOutputs gives errors, so there seems to be some sort of incompatibility; I assume something prevents a gradient from being calculated, like an array mutation. Is there a specific timing package that's made to be compatible with optimization?
https://redd.it/1s4cdw2
@r_Julia
Help with Symbolics.jl power expression simplification
I have an expression
@variables x a
expr = x^a * x
and when I try to simplify it using `Symbolics.simplify`, it returns x^(a)x. The output I expect is x^(1+a).
When I do exactly the same with `expr = x^a * x^2`, the output is x^(2+a), as expected. I tried forcing it with `expr = x^a * x^1`, but that does not help.
I could not find a solution to my problem and I'm too new to write some simplifier on my own. Is there any solution to this problem, or can anyone guide me how to solve this?
https://redd.it/1rtj5ea
@r_Julia
NVIDIA Extends CUDA Tile Abstraction To Julia, Maintaining Python Parity
https://quantumzeitgeist.com/nvidia-cuda-julia-support/
https://redd.it/1rx7aub
@r_Julia
Julia Snail – An Emacs development environment for Julia like Clojure's Cider
https://github.com/gcv/julia-snail
https://redd.it/1rrubo2
@r_Julia
BifurcationKit fails to compute diagram branches
I am an assistant to a class where we rely heavily on BifurcationKit to do a lot of work. I prepared some Notebooks with code examples and sent them to the students. One of them is having issues with the code because she is getting an error that says:
Failed to compute new branch at p = [value]
MethodError: no method matching Float64(::ForwardDiff.Dual ...)
I thought it might be BifurcationKit wanting to find a branch when there was none (a fold bifurcation), but that is not the case; it fails even when there are branches to find. We may be running different Julia versions, and I just remembered that I should ask her to try other versions of the package, but I wanted to see if anyone has ideas. None of the other students are having similar issues and, as far as I know, they all installed everything at pretty much the same time.
Are concurrent writes to BitArrays ever safe?
The documentation states that concurrent writes are not thread safe, but is this always the case? Does anyone know what it is about BitArrays that makes them unsafe here?
The specifics in my case: I have a 4-dimensional BitArray, and I want to apply an in-place function to each slice along the last dimension (as I understand it, this makes each slice contiguous in memory). So roughly I want to do:
arr::BitArray{4} = create_array()
Threads.@threads for i in 1:size(arr, 4)
    a_function!(view(arr, :, :, :, i))
end
Is this always unsafe? I feel like since I'm writing to different segments of the array with each task it should be safe, but I might be wrong.
Does anyone know the best practice here?
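Not a definitive answer, but a sketch of the usual reasoning and a workaround (hypothetical sizes; `a_function!` stood in for by a broadcast). A BitArray packs 64 elements per UInt64 chunk, so two threads writing "different" bit indices can still race on the same chunk's read-modify-write. An Array{Bool} stores one byte per element, so writes to disjoint index ranges are safe across threads:

```julia
# Workaround sketch: do the threaded writes on an Array{Bool}, then convert
# back to a BitArray if the packed representation is needed afterwards.
arr = falses(2, 2, 2, 8)            # BitArray{4}: slices share 64-bit chunks
safe = Array{Bool}(arr)             # one byte per element

Threads.@threads for i in 1:size(safe, 4)
    slice = view(safe, :, :, :, i)  # disjoint memory per iteration
    slice .= true                   # stand-in for a_function!
end

arr = BitArray(safe)                # repack once the parallel writes are done
println(all(arr))                   # true
```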
https://redd.it/1rr0q0s
@r_Julia
Has anyone noticed a slowdown in compilation speeds from 1.10 to 1.12?
In my automated tests on GitHub I've noticed quite a big slowdown in compilation times. As part of my test suite, I pull in a decent number of packages to test all the edge cases and supported package extensions. Ever since 1.12.x was released, I've noticed it takes much longer to compile and run everything.
Julia 1.10
271 Seconds
https://github.com/OxygenFramework/Oxygen.jl/actions/runs/22383430371/job/65498999769
Julia 1.12.5
682 Seconds
https://github.com/OxygenFramework/Oxygen.jl/actions/runs/22383430371/job/65498999768
https://redd.it/1rjs5ny
@r_Julia