Random context generator - RaCoGen (provisional name)
What my project does:
RaCoGen is a simple program that generates a random context (a situation in which two characters are placed) using three databases: nouns, adjectives, and actions.
1. First, it selects a random noun and adjective and generates a setting from them, like "big forest" or "sandy gym".
2. Then, it selects an action, like "talking", "drawing"...
3. Finally, it generates the context using the setting and action, giving a result like "In a sandy gym, where char1 and char2 are drawing."
After all of this is ready, the program prints the result like this:
Random noun selected: beach
Random adjective selected: cultural
Random setting created: cultural beach
Random action selected: sleeping
Random context created: In a cultural beach, where char1 and char2 are sleeping.
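The three steps can be sketched like this (the word lists below are placeholders, not RaCoGen's actual databases):

```python
# Minimal sketch of the generation steps described above.
import random

NOUNS = ["forest", "gym", "beach"]
ADJECTIVES = ["big", "sandy", "cultural"]
ACTIONS = ["talking", "drawing", "sleeping"]

def generate_context(char1="char1", char2="char2"):
    noun = random.choice(NOUNS)             # step 1: random noun...
    adjective = random.choice(ADJECTIVES)   # ...and adjective
    setting = f"{adjective} {noun}"         # e.g. "cultural beach"
    action = random.choice(ACTIONS)         # step 2: random action
    # step 3: combine setting and action into the final context
    return f"In a {setting}, where {char1} and {char2} are {action}."
```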
Target audience:
This project doesn't have a target audience in mind because it's an experiment. I'm just seeing what I can or can't do. You can consider it a toy, because it's more for entertainment than anything else.
But that's just for now. I will probably expand it so it gives users more options, has more variety, etc.
For now, it's something I made to test while I learn, but maybe in the future it could turn into an app.
/r/Python
https://redd.it/1g1e7as
Tkinter based package for sending GUI alerts / notifications, named tk-alert
Hi everyone, I have been thinking about posting here for some time and decided to do so. I was hesitant because this is my first time working on a Python package and the project is far from finished.
Long story short, I have been working on a personal app using Tkinter and needed a way to send error notifications to users. I could not find anything really easy to install and use, so I started creating my own self-contained package.
1. What my project does.
Sends GUI notifications for users meant for information, warnings or errors, using Tkinter.
My design philosophy was that the package should be simple and ready to use out of the box, but should have more complex design-time features for people who want a specific look for their app (this part is a work in progress).
So far I have not had time to continue work on this for multiple reasons, but as the cold season approaches I am looking forward to getting on with some tasks from my to-do list.
2. Target audience.
Tkinter devs, not ready for production yet.
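For readers unfamiliar with in-app Tkinter alerts, here is a rough sketch of the kind of banner such a package might provide (plain Tkinter; tk-alert's actual API may differ, and the severity palette is my assumption):

```python
# Assumed severity-to-color palette; not tk-alert's actual values.
SEVERITY_COLORS = {"info": "#2962ff", "warning": "#ff8f00", "error": "#d50000"}

def alert_color(severity: str) -> str:
    """Map a severity level to a banner color, defaulting to 'info'."""
    return SEVERITY_COLORS.get(severity, SEVERITY_COLORS["info"])

def show_alert(root, message: str, severity: str = "info", ms: int = 3000):
    """Show a self-dismissing banner at the top of a Tk window."""
    import tkinter as tk  # imported here so alert_color stays GUI-free
    banner = tk.Label(root, text=message, bg=alert_color(severity), fg="white")
    banner.pack(fill="x")
    root.after(ms, banner.destroy)  # auto-dismiss after `ms` milliseconds
    return banner

if __name__ == "__main__":
    import tkinter as tk
    root = tk.Tk()
    show_alert(root, "Something went wrong", "error")
    root.mainloop()
```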
3. Comparison.
What I want this package to be set apart by is the ease of set-up and use +
/r/Python
https://redd.it/1g17jeq
How to connect MySQL database to flask app
I'm very much a beginner in Flask and MySQL in general, and I've been having trouble connecting my database to the Flask app. It's a very simple login page where each user's ID and authentication key are already inside the database, so the program has to confirm whether or not the inputted user ID and authentication key are in the database before allowing the user to access their dashboard. I've mostly been relying on YouTube but I can't seem to find the right video.
If anyone could suggest any references or suggestions that would be very much appreciated.
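In case a sketch helps: the core of the login check is a parameterized query. sqlite3 is used here only to keep the snippet self-contained; with MySQL you would use a driver such as PyMySQL (cursor-based, with `%s` placeholders), but the shape is the same. Table and column names are made up:

```python
import sqlite3

def check_credentials(conn, user_id: str, auth_key: str) -> bool:
    """Return True if the (user_id, auth_key) pair exists in the users table."""
    # Parameterized query: never interpolate user input into SQL strings.
    cur = conn.execute(
        "SELECT 1 FROM users WHERE user_id = ? AND auth_key = ?",
        (user_id, auth_key),
    )
    return cur.fetchone() is not None

# Inside a Flask view it would be called roughly like:
#   if check_credentials(conn, request.form["user_id"], request.form["auth_key"]):
#       return redirect(url_for("dashboard"))
```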
/r/flask
https://redd.it/1g14umt
Returning dynamic form elements when invalid
I have a potentially 3-level form that can have elements added to it (either forms or formsets).
If any form (or formset) is invalid during the post, I'd like to be able to return the whole form (including dynamic content) to the user for correction before resubmission.
The second level is pretty straightforward, as that is just a formset that can be rendered beneath the base form.
However, the third level is where I'm having difficulty as that is a formset belonging to a form of a formset.
The below code snippet shows the issue:
monthly_formset = MonthlyActivityDaysFormset(request.POST, prefix='m')
if monthly_formset.is_valid():
    for monthly_form in monthly_formset:
        # DO STUFF
        # Note: the original compared a string literal to None, which is always
        # True; the intent is presumably to look the key up in the POST data.
        if request.POST.get(monthly_form.prefix + '-diff_times_per_month_monthly') is not None:
            diff_times_formset = DifferentTimesFormset(request.POST, prefix=monthly_form.prefix + '-dt')
            if diff_times_formset.is_valid():
                for diff_times in diff_times_formset:
                    ...  # (snippet truncated in the original post)
/r/djangolearning
https://redd.it/1g10hqo
Deploying (Multiple) Django Apps to a Single Server with Kamal 2
https://www.coryzue.com/writing/kamal-django/
/r/django
https://redd.it/1g0i0nl
Introducing Eventum ASGI, a Python framework simplifying the creation of WebSocket-based apps
# Introduction:
I'm excited to present my first Python framework. I would appreciate any feedback you could give me; it's my first project and it's still in active development.
# What My Project Does:
The project is based on the ASGI protocol. The key idea is to simplify the use of WebSockets, which isn't a strong suit of the most popular frameworks. The framework introduces some new approaches to handling WebSockets; most of the time you'll work with a WSConnection class, which is one of the keystones of the framework.
Another significant difference from the common approach is the connection lifecycle in the app.
1. You create a handshake_route, which is only responsible for handling the initial request. It expects a handshake request to switch protocols.
2. You create an event. To make it easier to understand, you can consider it a route, just for messages sent via an established connection. It expects JSON which must contain an "event" field.
To explain how everything works behind the scenes:
1. A client sends a handshake to switch protocols and for a server to either accept or reject a connection.
2. Once accepted, the connection gets into a loop where it's constantly checking
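The lifecycle above maps onto raw ASGI roughly like this (a plain-ASGI sketch for illustration only, not Eventum's actual API; names here are mine):

```python
import asyncio
import json

async def app(scope, receive, send):
    """Bare-bones ASGI websocket app mirroring the handshake-then-events flow."""
    assert scope["type"] == "websocket"
    message = await receive()                    # 1. the client's handshake
    if message["type"] != "websocket.connect":
        return
    await send({"type": "websocket.accept"})     # accept (or reject) here
    while True:                                  # 2. the message loop
        message = await receive()
        if message["type"] == "websocket.disconnect":
            break
        payload = json.loads(message.get("text") or "{}")
        # dispatch on the "event" field, like Eventum's event routes
        reply = {"event": payload.get("event"), "ok": True}
        await send({"type": "websocket.send", "text": json.dumps(reply)})
```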
/r/Python
https://redd.it/1g0px5z
R nGPT: Normalized Transformer with Representation Learning on the Hypersphere
Paper: https://arxiv.org/pdf/2410.01131
Abstract:
>We propose a novel neural network architecture, the normalized Transformer (nGPT) with representation learning on the hypersphere. In nGPT, all vectors forming the embeddings, MLP, attention matrices and hidden states are unit norm normalized. The input stream of tokens travels on the surface of a hypersphere, with each layer contributing a displacement towards the target output predictions. These displacements are defined by the MLP and attention blocks, whose vector components also reside on the same hypersphere. Experiments show that nGPT learns much faster, reducing the number of training steps required to achieve the same accuracy by a factor of 4 to 20, depending on the sequence length.
Highlights:
>Our key contributions are as follows:
Optimization of network parameters on the hypersphere. We propose to normalize all vectors forming the embedding dimensions of network matrices to lie on a unit-norm hypersphere. This allows us to view matrix-vector multiplications as dot products representing cosine similarities bounded in [-1, 1]. The normalization renders weight decay unnecessary.
Normalized Transformer as a variable-metric optimizer on the hypersphere. The normalized Transformer itself performs a multi-step optimization (two steps per layer) on a hypersphere, where each step of the attention
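A toy illustration of the normalization claim (pure Python, not the paper's code): once vectors are unit-normalized, a plain dot product is exactly the cosine similarity, bounded in [-1, 1].

```python
import math

def normalize(v):
    """Scale a vector to unit norm."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u = normalize([3.0, 4.0])
v = normalize([4.0, 3.0])
sim = dot(u, v)  # cosine similarity, since both vectors have norm 1
```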
/r/MachineLearning
https://redd.it/1g0lnij
PEP 735 Dependency Groups is accepted
https://peps.python.org/pep-0735/
https://discuss.python.org/t/pep-735-dependency-groups-in-pyproject-toml/39233/312
> This PEP specifies a mechanism for storing package requirements in pyproject.toml files such that they are not included in any built distribution of the project.
>
> This is suitable for creating named groups of dependencies, similar to requirements.txt files, which launchers, IDEs, and other tools can find and identify by name.
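For illustration, dependency groups in pyproject.toml look like this under the PEP (the group names here are examples):

```toml
[dependency-groups]
test = ["pytest", "coverage"]
docs = ["sphinx"]
# groups can include other groups
all = [
    {include-group = "test"},
    {include-group = "docs"},
]
```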
/r/Python
https://redd.it/1g0iqfr
Nginx 404'ing all images.
I'm not sure if this should be in the nginx or Django Reddit, but I'll try here first. My blog is running on Docker. Initially, all images in the static files from the first set of articles I created while coding the blog were accessible to nginx. However, when I tried adding articles from the admin panel after deployment, the new images returned a 404 error. I tried debugging by checking my code and realized I didn't include a path for the media folder in the `settings.py` file. After adding that line and rebuilding the container... well, now the previously accessible images are returning a 404. I think my nginx server might not be configured correctly. *I've entered the container and verified that files are present*
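In case it helps, nginx usually needs explicit location blocks for both static and media files; a sketch (all paths here are assumptions, so match them to your STATIC_ROOT/MEDIA_ROOT settings and your volume mounts):

```nginx
# Assumed container paths; adjust to match settings.py and docker-compose volumes.
location /static/ {
    alias /app/staticfiles/;
}
location /media/ {
    alias /app/media/;
}
```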
Dockerfile:
# Use the official Python image from the Docker Hub
FROM python:3.11
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set the working directory
WORKDIR /app
# Copy the requirements file into the container
COPY requirements.txt /app/
# Install the dependencies
RUN pip install --upgrade pip && pip install -r requirements.txt
# Copy the entire project into the container
COPY . /app/
# Collect static files
RUN python manage.py collectstatic --noinput
EXPOSE 1617
# Run the Gunicorn server
CMD ["gunicorn", "redacted.wsgi:application", "--bind", "0.0.0.0:1617"]
docker-compose:
version: '3'
services:
web:
build: .
command: gunicorn --workers
/r/django
https://redd.it/1g0cvxr
[blog post] Hugging Face + Dask for parallel data processing and model inference
Wanted to share a blog post on using Hugging Face with Dask to process the FineWeb dataset. The example goes through:
* Reading directly from Hugging Face with Dask, e.g. `df = dask.dataframe.read_parquet("hf://...")`
* Using a Hugging Face Language Model to classify the educational level of the text.
* Filtering the highly educational web pages as a new dataset and writing in parallel directly from Dask to Hugging Face storage.
The example goes through processing a small subset of the FineWeb dataset with pandas and then scaling out to multiple GPUs with Dask.
Blog post: [https://huggingface.co/blog/dask-scaling](https://huggingface.co/blog/dask-scaling)
/r/Python
https://redd.it/1fzyh7x
ParScrape v0.4.6 Released
# What My Project Does:
Scrapes data from sites and uses AI to extract structured data from it.
# What's New:
Added more AI providers
Updated provider pricing data
Minor code cleanup and bug fixes
Better cleaning of HTML
# Key Features:
Uses Playwright / Selenium to bypass most simple bot checks.
Uses AI to extract data from a page and save it in various formats, such as CSV, XLSX, JSON, and Markdown.
Has rich console output to display data right in your terminal.
# GitHub and PyPI
PAR Scrape is under active development and getting new features all the time.
Check out the project on GitHub for full documentation, installation instructions, and to contribute: [https://github.com/paulrobello/par_scrape](https://github.com/paulrobello/par_scrape)
PyPI: https://pypi.org/project/par_scrape/
# Comparison:
I have seen many command-line and web applications for scraping, but none as simple, flexible, and fast as ParScrape.
# Target Audience
AI enthusiasts and data-hungry hobbyists
/r/Python
https://redd.it/1g06arb
Thursday Daily Thread: Python Careers, Courses, and Furthering Education!
# Weekly Thread: Professional Use, Jobs, and Education 🏢
Welcome to this week's discussion on Python in the professional world! This is your spot to talk about job hunting, career growth, and educational resources in Python. Please note, this thread is not for recruitment.
---
## How it Works:
1. Career Talk: Discuss using Python in your job, or the job market for Python roles.
2. Education Q&A: Ask or answer questions about Python courses, certifications, and educational resources.
3. Workplace Chat: Share your experiences, challenges, or success stories about using Python professionally.
---
## Guidelines:
- This thread is not for recruitment. For job postings, please see r/PythonJobs or the recruitment thread in the sidebar.
- Keep discussions relevant to Python in the professional and educational context.
---
## Example Topics:
1. Career Paths: What kinds of roles are out there for Python developers?
2. Certifications: Are Python certifications worth it?
3. Course Recommendations: Any good advanced Python courses to recommend?
4. Workplace Tools: What Python libraries are indispensable in your professional work?
5. Interview Tips: What types of Python questions are commonly asked in interviews?
---
Let's help each other grow in our careers and education. Happy discussing! 🌟
/r/Python
https://redd.it/1g05xw8
in 2024 learn flask or django?
hi everyone, i was wondering which of these frameworks is better and worth learning to make money? flask? django? or learn both?
/r/flask
https://redd.it/1fzsrct
Speeding up unit tests in CI/CD
I have a large Django project that currently takes ca. 30 minutes to run all the unit tests serially in our CI/CD pipeline, and we want to speed this up as it's blocking our releases.
I have a Ruby background and am new to Python - so I'm investigating the options available in the Python ecosystem to speed this up. So far I've found:
[pytest-xdist](https://pypi.org/project/pytest-xdist/)
pytest-split
[pytest-parallel](https://pypi.org/project/pytest-parallel/)
pytest-run-parallel
[tox](https://tox.wiki/en/latest/index.html) parallel (not exactly what I need, as I only have one environment)
CircleCI's test splitting - I've used this for Ruby, and it didn't do so well when some classes had a lot of tests in them
I'd love to hear your experiences of these tools and if you have any other suggestions.
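For reference, the basic invocations for the two most common approaches look like this (assuming the plugins are installed):

```shell
# pytest-xdist: spread tests across local CPU cores
pytest -n auto

# pytest-split: shard the suite across CI containers (run one group per node)
pytest --splits 4 --group 1
```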
/r/Python
https://redd.it/1fzreee
N Jurgen Schmidhuber on 2024 Physics Nobel Prize
The Nobel Prize in Physics 2024 for Hopfield & Hinton rewards plagiarism and incorrect attribution in computer science. It's mostly about Amari's "Hopfield network" and the "Boltzmann Machine."
1. The Lenz-Ising recurrent architecture with neuron-like elements was published in 1925. In 1972, Shun-Ichi Amari made it adaptive such that it could learn to associate input patterns with output patterns by changing its connection weights. However, Amari is only briefly cited in the "Scientific Background to the Nobel Prize in Physics 2024." Unfortunately, Amari's net was later called the "Hopfield network." Hopfield republished it 10 years later, without citing Amari, not even in later papers.
2. The related Boltzmann Machine paper by Ackley, Hinton, and Sejnowski (1985) was about learning internal representations in hidden units of neural networks (NNs). It didn't cite the first working algorithm for deep learning of internal representations by Ivakhnenko & Lapa. It didn't cite Amari's separate work (1967-68) on learning internal representations in deep NNs end-to-end through stochastic gradient descent (SGD). Neither the later surveys by the authors nor the "Scientific Background to the Nobel Prize in Physics 2024" mention these origins of deep learning. (BM also did not cite relevant prior work by Sherrington & Kirkpatrick &
/r/MachineLearning
https://redd.it/1fzw5b1
Thoughts on hosting
Hello!
I've got experience with hosting wagtail/Django on heroku, I liked how easy it is to set things up and add postgres db for example.
Do you have any recommendations based on ease of use and cost? :) thanks
/r/django
https://redd.it/1g186wy
Pyinstrument v5.0 - flamegraphs for Python!
Hi reddit! I've been hard at work on a new pyinstrument feature that I'm really excited to show off. It's a completely new HTML renderer that lets you see visually exactly what happened as the program was running.
# What it does
First, some context: Pyinstrument is a statistical profiler for Python. That means you can activate it when you're running your code, and pyinstrument will record what happens periodically, and at the end, give you a report that tells you where the time was spent.
# Target Audience
Anyone wondering if their Python program could be faster! Not only is it useful from a performance perspective, it's also a nice way to understand what's going on when a program runs.
# Comparison
If you've used profilers like cProfile before, pyinstrument aims to be a more user-friendly, intuitive alternative. It's also a statistical profiler: it only samples your program periodically, so it shouldn't slow the program down too much.
# So, what's new?
Up until now, the output has been some form of call stack. That's great for identifying the parts of code that are taking the most time. But it can leave some information missing - what's the pattern of the code execution? What order
/r/Python
https://redd.it/1g1az6i
Friday Daily Thread: r/Python Meta and Free-Talk Fridays
# Weekly Thread: Meta Discussions and Free Talk Friday 🎙️
Welcome to Free Talk Friday on /r/Python! This is the place to discuss the r/Python community (meta discussions), Python news, projects, or anything else Python-related!
## How it Works:
1. Open Mic: Share your thoughts, questions, or anything you'd like related to Python or the community.
2. Community Pulse: Discuss what you feel is working well or what could be improved in the /r/python community.
3. News & Updates: Keep up-to-date with the latest in Python and share any news you find interesting.
## Guidelines:
All topics should be related to Python or the /r/python community.
Be respectful and follow Reddit's Code of Conduct.
## Example Topics:
1. New Python Release: What do you think about the new features in Python 3.11?
2. Community Events: Any Python meetups or webinars coming up?
3. Learning Resources: Found a great Python tutorial? Share it here!
4. Job Market: How has Python impacted your career?
5. Hot Takes: Got a controversial Python opinion? Let's hear it!
6. Community Ideas: Something you'd like to see us do? tell us.
Let's keep the conversation going. Happy discussing! 🌟
/r/Python
https://redd.it/1g0ww31
In a REST API world, what do you choose? Blueprints or Flask views? Why?
/r/flask
https://redd.it/1g0sl6t
Generating nice iPython notebooks diffs with Git pre-commit hooks
I like to use iPython notebooks to store experimental code and debugging results, but it's a pain to use version control to look at them.
So I wrote some pre-commit hooks that make it easy to diff iPython notebooks in Git. They auto-generate a copy of the file with just the Python code, so that you can just inspect code changes.
I wrote a bit more about why here, along with instructions on how to use them: https://blog.moonglow.ai/diffing-ipython-notebook-code-in-git/
And the git repo for the hooks (MIT-licensed) is here: https://github.com/moonglow-ai/pre-commit-hooks
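The core idea, stripping a notebook down to its code cells, can be sketched with the stdlib alone (a simplification for illustration, not the hooks' actual code):

```python
import json

def notebook_code(nb_json: str) -> str:
    """Extract only the code cells from a .ipynb file's JSON content."""
    nb = json.loads(nb_json)
    cells = [
        "".join(cell["source"])              # source is a list of lines
        for cell in nb.get("cells", [])
        if cell["cell_type"] == "code"        # skip markdown/raw cells
    ]
    return "\n\n".join(cells)
```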
/r/IPython
https://redd.it/1g0rjpo
PSA: If you're starting a new project, try astral/uv!
It's really amazing: complex dependencies are resolved in mere milliseconds, it manages interpreters for you, and it handles dev-dependencies and tools as well as, if not better than, Poetry. You are missing out on a lot of convenience if you don't try it. Check it out here.
Not affiliated or involved in any way btw, just been using it for a few months and am still blown away by how amazing uv and ruff are.
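For anyone curious what that looks like day to day, a typical session goes roughly like this (assuming uv is already installed; project and file names are examples):

```shell
uv init myproject        # scaffold a pyproject.toml-based project
cd myproject
uv add requests          # add and lock a dependency
uv add --dev pytest      # add a dev-dependency
uv run python main.py    # run inside the managed environment
```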
/r/Python
https://redd.it/1g0imjf
What I Learned from Making the Python Back End for My New Webapp
I learned a lot from making this, and think a lot of it would be interesting to others making web apps in Python:
https://youtubetranscriptoptimizer.com/blog/02_what_i_learned_making_the_python_backend_for_yto
/r/Python
https://redd.it/1g0jybv
How do I make a number column that automatically increases, but only within a group of three columns?
The model:
class Bunny(models.Model):
    lear = models.CharField(max_length=10)
    rear = models.CharField(max_length=10)
    sex = models.CharField(max_length=1, choices=[('M', 'Male'), ('F', 'Female')])
    tattooedon = models.DateTimeField(null=True)
    CID = models.ForeignKey(Club, on_delete=models.PROTECT)
    UIDowner = models.ForeignKey("authservice.User", on_delete=models.PROTECT)
    TID = models.ForeignKey("coverslip.Throw", on_delete=models.PROTECT, null=True, blank=True)
    RID = models.ForeignKey("bunnies.Race", on_delete=models.PROTECT)
    COLID = models.ForeignKey("bunnies.Color", on_delete=models.PROTECT, null=True)
    UIDtattoomaster = models.ForeignKey("authservice.User", on_delete=models.PROTECT, related_name='tattooedbunnies')
    serialnumber = models.PositiveIntegerField(null=True, blank=True)

    def __str__(self):
        return self.lear + " / " + self.rear

    class Meta:
        pass
/r/django
https://redd.it/1g0gjp8
Considering moving from Flask-Sqlalchemy to Flask and plain Sqlalchemy: not sure how to start, or if useful
Hi all,
I wrote a free language-learning tool called Lute. I'm happy with how the project's been going, I and a bunch of other people use it.
I wrote Lute using Flask, and overall it's been very good. Recently I've been wondering if I should have tried to avoid Flask-Sqlalchemy -- I was over my head when I started, and did the best I could.
My reasons for wondering:
- when I'm running some service- or domain-level tests, e.g., I'm connecting to the db, but I'm not using Flask. It's just Python code creating objects, calling methods, etc. The tests all need an app context to execute, but that doesn't seem like it's adding anything.
- simple data crunching scripts have to call the app initializer, and again push an app context, when really all I need is the service layer and domain objects. Unfortunately a lot of the code has stuff like "from lute.db import db" and "db.session.query() ...", etc, so the db usage is scattered around the code.
Today I hacked at changing it to plain sql alchemy, but it ended up spiralling out of my control, so I put that on ice to think a bit more.
These
/r/flask
https://redd.it/1g0ajo0
What to use instead of callbacks?
I have a lot of experience with Python, but I've also worked with JavaScript and Go, and in some cases it just makes sense to allow the caller to pass a callback (or, more likely, a closure). For example, to notify the caller of an event, or to allow it to make a decision. I'm considering this in the context of creating library code.
Python lambdas are limited, and writing named functions is clumsier than anonymous functions from other languages. Is there something - less clumsy, more Pythonic?
In my example, there's a long-ish multi-stage process, and I'd like to give the caller an opportunity to validate or modify the result of each step, in a simple way. I've considered class inheritance and mixins, but that seems like too much setup for just a callback. Is there some Python pattern I'm missing?
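One lightweight pattern that stays Pythonic: take an optional hook argument that defaults to None, so simple callers pay nothing and advanced callers pass any callable (function, lambda, bound method). A sketch, with made-up stage names:

```python
def clean(data):
    return data.strip()

def enrich(data):
    return data.upper()

def run_pipeline(data, on_step=None):
    """Run the stages; after each, let the caller inspect or modify the result."""
    for step in (clean, enrich):
        data = step(data)
        if on_step is not None:
            data = on_step(step.__name__, data)  # caller may validate/replace
    return data
```

functools.partial, or a small class with `__call__`, also works anywhere a callback is expected, and often reads better than cramming logic into a lambda.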
/r/Python
https://redd.it/1g02dtg
D Why is there so little statistical analyses in ML research?
Why is it so common in ML research not to do any statistical test to verify that the results are actually significant? Most of the time, a single outcome is presented instead of doing multiple runs and performing something like a t-test or Mann-Whitney U test. Drawing conclusions based on a single sample would be impossible in other disciplines, like psychology or medicine, so why is this not considered a problem in ML research?
Also, can someone recommend a book for exactly this, statistical tests in the context of ml?
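As a concrete starting point, even the basic two-sample comparison is only a few lines; a sketch of Welch's t statistic in pure Python (in practice you would use scipy.stats.ttest_ind, which also gives the p-value):

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    m1, m2 = statistics.mean(a), statistics.mean(b)
    v1, v2 = statistics.variance(a), statistics.variance(b)  # sample variances
    return (m1 - m2) / math.sqrt(v1 / len(a) + v2 / len(b))
```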
/r/MachineLearning
https://redd.it/1fznaa9
Static files serving - S3 bucket alternative
Hello guys. I wanted to build an app with Angular on the frontend and Django on the backend. The client must be able to click on a link and download a PDF, but the user must log in to enter the app. How can I serve these PDFs? A friend told me about an S3 bucket, but is there any open-source alternative for this? Is there any better solution? How do I best integrate this with my Django authentication?
/r/django
https://redd.it/1fzuosv
How should I get started with Django?
I recently started to work with Django, but I'm completely and utterly humbled and devastated whenever I try to add a new function with an API call into my React project. I really don't understand the magic behind it and usually need help from colleagues.
The Django documents are (I'm sorry) terrible. The more I read into it the more questions arise.
Are there any sources that can give me a better insight on how to work with API in Django and maybe API in general?
I appreciate any sources given (except Django docs)
/r/djangolearning
https://redd.it/1g02p1b
PEP 760 – No More Bare Excepts
This PEP proposes disallowing bare `except:` clauses in Python's exception-handling syntax.
- https://peps.python.org/pep-0760/
- https://discuss.python.org/t/pep-760-no-more-bare-excepts/
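For context, here is the construct in question next to the explicit spelling the PEP would require:

```python
def risky():
    raise ValueError("boom")

caught = []

try:
    risky()
except:                # bare except: catches everything, including
    caught.append("bare")   # SystemExit and KeyboardInterrupt

try:
    risky()
except Exception:      # the explicit form PEP 760 steers code toward
    caught.append("explicit")
```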
/r/Python
https://redd.it/1fzxwj3
What personal challenges have you solved using Python? Any interesting projects or automations?
Hey everyone! I'm curious—what have you used Python for in your daily life? Are there any small, repetitive tasks you've automated that made things easier or saved you time? I'd love to hear about it!
I stumbled upon an old article on this topic a while ago, and I think it's worth revisiting it again.
/r/Python
https://redd.it/1fzupwm