[geeklog-devel] No articles on geeklog.net when not logged in?

Tom websitemaster at cogeco.net
Tue Sep 1 12:31:50 EDT 2015


If that happens, I would think the overwritten cache file should contain the proper text (it did create the file in the first place…). I am not too familiar with this part of the caching system, though.

The cache files are written by the template class, found in system/classes/template.class.php.
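
Just as a sketch (this is not the actual code from template.class.php, only an illustration of a failure mode that would fit the symptom), a cache write that goes straight to the final path is non-atomic:

    <?php
    // Sketch only -- not Geeklog's actual template class code.
    // fopen() with mode 'w' truncates the file immediately, so any
    // request that reads the cache between fopen() and fclose()
    // sees an empty or half-written file.
    function cache_write($path, $content)
    {
        $fp = fopen($path, 'w');  // cache file is truncated right here
        fwrite($fp, $content);    // readers can observe a partial file
        fclose($fp);
    }

If an anonymous request reads the cache while another request is regenerating it like this, it could be served an empty page, which would look exactly like "no articles when not logged in".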

Tom

From: geeklog-devel [mailto:geeklog-devel-bounces at lists.geeklog.net] On Behalf Of Dan Stoner
Sent: September-01-15 10:54 AM
To: Geeklog Development
Subject: Re: [geeklog-devel] No articles on geeklog.net when not logged in?

> We could have a race condition in the cache refresh logic, for example. Could 2 requests trigger it at the same time, overwriting each other? Or something like that?

What triggers the cache (re)generation?

Also, if someone can point me to the code file that implements the caching mechanism (the one that actually writes the files), I'd be curious to take a look.

thanks,

- Dan Stoner

On Tue, Sep 1, 2015 at 10:46 AM, Dirk Haun <dirk at haun-online.de> wrote:

Tom wrote:

> Clear the Cache again.

Done. Things should be back to normal.


> I have montastic.com checking Geeklog.net every hour for the past year or
> so. Once every couple of months I may get a warning that the site is down.
> Last week I got 13 warnings (3 on Sunday), but the site is always back up
> within a few minutes (I have never found it down, but I have never had a
> chance to check the minute I got the email). The week before I got 3
> warnings, and the week before that 2 warnings that Geeklog.net was down.

Probably just overloaded. I've seen this happen on occasion.


> I am assuming that the server is overloaded, either by bots swarming our
> website or by someone else's account. I haven't had a chance to look into
> this yet. (Dirk, if you have the time???)

The site is under almost constant attack - spam attempts, attempts to create fake accounts, more or less systematic probes for SQL injections or other known vulnerabilities. Sometimes one of those overloads the site. The sheer number of non-hostile bots indexing the site doesn't exactly help (why does every company have to run its own web crawler these days?).

Anyway, that has been the reality of the web for the last couple of years, and Geeklog shouldn't just stumble over something like that so easily.

We could have a race condition in the cache refresh logic, for example. Could 2 requests trigger it at the same time, overwriting each other? Or something like that?
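
If it turns out to be that, the usual fix (just a sketch, with made-up names, assuming the cache is plain files on a POSIX filesystem) is to write to a temporary file and rename() it into place, since rename() within one filesystem is atomic:

    <?php
    // Sketch of an atomic cache write -- illustrative names, not Geeklog's API.
    function cache_write_atomic($path, $content)
    {
        // Create the temp file in the target directory so that rename()
        // stays on one filesystem; a cross-filesystem rename isn't atomic.
        $tmp = tempnam(dirname($path), 'tpl');
        file_put_contents($tmp, $content);
        chmod($tmp, 0644);   // tempnam() creates files with mode 0600
        rename($tmp, $path); // readers see the old or the new file, never a mix
    }

Two overlapping regenerations would then simply replace each other's output with a complete file, and no reader ever sees a half-written one.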


Dirk


--
https://www.themobilepresenter.com/

_______________________________________________
geeklog-devel mailing list
geeklog-devel at lists.geeklog.net
https://pairlist8.pair.net/mailman/listinfo/geeklog-devel

 
