
Commit


Update readme.md

dg committed Jun 18, 2024
1 parent 7c45a88 commit e4221c9
Showing 1 changed file with 8 additions and 2 deletions.
10 changes: 8 additions & 2 deletions readme.md
@@ -1,5 +1,4 @@
RobotLoader: comfortable autoloading
====================================
[![RobotLoader](https://github.com/nette/utils/assets/194960/c33fdb74-0652-4cad-ac6e-c1ce0d29e32a)](https://doc.nette.org/en/robot-loader)

[![Downloads this Month](https://img.shields.io/packagist/dm/nette/robot-loader.svg)](https://packagist.org/packages/nette/robot-loader)
[![Tests](https://github.com/nette/robot-loader/workflows/Tests/badge.svg?branch=master)](https://github.com/nette/robot-loader/actions)
@@ -28,6 +27,8 @@ require_once 'Utils/Paginator.php';
...
```

 <!---->

[Support Me](https://github.com/sponsors/dg)
--------------------------------------------

@@ -37,6 +38,7 @@ Do you like RobotLoader? Are you looking forward to the new features?

Thank you!

 <!---->

Installation
------------
@@ -58,6 +60,7 @@ composer require nette/robot-loader

It requires PHP version 8.0 and supports PHP up to 8.3.

 <!---->

Usage
-----
@@ -84,6 +87,7 @@ If you want RobotLoader to skip certain directories, use `$loader->excludeDirect

By default, RobotLoader reports errors in PHP files by throwing a `ParseError` exception. This can be suppressed using `$loader->reportParseErrors(false)`.
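
Since the hunk only shows fragments of the Usage section, here is a minimal sketch of how these calls fit together, based on the documented RobotLoader API; the directory paths are placeholder assumptions:

```php
use Nette\Loaders\RobotLoader;

$loader = new RobotLoader;

// index classes found in this directory (and its subdirectories) -- placeholder path
$loader->addDirectory(__DIR__ . '/app');

// skip a directory you don't want scanned -- placeholder path
$loader->excludeDirectory(__DIR__ . '/app/legacy');

// don't throw ParseError on files that fail to parse
$loader->reportParseErrors(false);

// cache directory for the class index -- placeholder path
$loader->setTempDirectory(__DIR__ . '/temp');

// activate RobotLoader as an autoloader
$loader->register();
```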

 <!---->

PHP Files Analyzer
------------------
@@ -117,6 +121,7 @@ $loader->refresh();
$res = $loader->getIndexedClasses();
```
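
The hunk above shows only the tail of the analyzer example. A fuller sketch of using RobotLoader purely as a class indexer, without registering it as an autoloader (the scanned path is a placeholder assumption):

```php
use Nette\Loaders\RobotLoader;

$loader = new RobotLoader;
$loader->addDirectory(__DIR__ . '/project');

// scan the directories and build the index from scratch
$loader->rebuild();

// or, with a temp directory set, refresh() updates the cached index
// $loader->setTempDirectory(__DIR__ . '/temp');
// $loader->refresh();

// array of class => file pairs found by the scan
$res = $loader->getIndexedClasses();
```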

 <!---->

Caching
-------
@@ -131,6 +136,7 @@ The initial file scanning, when the cache doesn't exist yet, can naturally take
This is a situation where a large number of concurrent requests on a production server would trigger RobotLoader, and since the cache doesn't exist yet, they would all start scanning files, which would overload the server.
Fortunately, RobotLoader works in such a way that only the first thread indexes the files, creates the cache, and the rest wait and then use the cache.
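
As a hedged illustration of the cache setup this paragraph refers to (paths are placeholder assumptions):

```php
use Nette\Loaders\RobotLoader;

$loader = new RobotLoader;
$loader->addDirectory(__DIR__ . '/app');

// the class index is cached here and reused by subsequent requests
$loader->setTempDirectory(__DIR__ . '/temp');

// optionally disable automatic re-scanning when an unknown class is requested
$loader->setAutoRefresh(false);

$loader->register();
```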

 <!---->

PSR-4
-----