PCLZIP_ERR_BAD_FORMAT (-10) : Unable to find End of Central Dir Record signature

If you are trying to upgrade, and suddenly start getting errors like:

Incompatible Archive. PCLZIP_ERR_BAD_FORMAT (-10) : Unable to find End of Central Dir Record signature

Here are some things to check.

  • This basically means that WordPress is not able to unzip the downloaded packages to install the upgrades.
  • Check the available space on your disk. If your disk is full, you should try upgrading your hosting package or freeing up some space on the disk. One easy way might be to delete old image thumbnails in sizes you no longer use.
  • If you have enough space on your disk, a specific downloaded package may be problematic. Try to install a plugin you don’t have, and if that works, check to see if there is a mysterious folder called “Upgrades” [DO NOT CONFUSE WITH UPLOADS]. If this folder exists, it is worth deleting it to see if that lets you do the upgrade (see the sketch after this list).
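
A minimal sketch of those checks from the shell, assuming a standard WordPress layout (the upgrade folder path is an assumption – verify before deleting, and do NOT touch uploads):

df -h                        # is the disk full?
cd /path/to/wordpress/wp-content
ls -la                       # look for the mysterious upgrade folder (name may vary)
rm -rf upgrade               # remove stale upgrade packages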

Free space in your WordPress install by deleting old image sizes

If you change your theme often, your uploads folder will accumulate thumbnails of images in many sizes that you no longer use. This consumes disk space unnecessarily. I wish someone would code a plugin for this, but failing that, a handy way to do it via SSH is:

find . -name "*-250x250.*" | xargs rm -f

Where 250×250 is the image size you want to delete. You could also try something like:

find . -name "*-250x*.*" | xargs rm -f

if you have multiple thumbnail sizes like 250×250, 250×300, etc.
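
If you have several known sizes, a small loop saves retyping (a sketch – adjust the size list to match your thumbnails; find's -delete also avoids xargs tripping over spaces in filenames):

for size in 250x250 250x300; do
    find . -name "*-${size}.*" -delete
done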

What I do is list images in the folders to see the unwanted sizes there, and run this delete a few times with various sizes. A more ruthless person could try something like:

find . -name "*-*x*.*" | xargs rm -f

I do not recommend this, as it can match files that you may not want to delete. For example, a file with a hyphenated name and the letter x in the last hyphenated word, like “wish-merry-xmas.jpg”, wouldn’t be a resized image but an original – or worse, it could be another file and not an image at all, like “here-are-exact-directions-to-our-property.html”.

But if you have a lot of thumbnail sizes, you may feel tempted anyway. Two suggested precautions. First, change directory to your uploads folder (you should do this in any case):
cd /path/to/wordpress/wp-content/uploads
find . -name "*-*x*.*" | xargs rm -f

The other precaution is to specify the extension:
find . -name "*-*x*.jpg" | xargs rm -f
find . -name "*-*x*.png" | xargs rm -f

This will give you some protection from inadvertently deleting non-resize uploads like “entertaining-extras.pdf”.

Of course, if you are a patient soul (or don’t have too many files uploaded), you could run the find before deleting, to see if any other files are getting selected along with the resizes:

find . -name "*-*x*.*"
and if all is well:
find . -name "*-*x*.*" | xargs rm -f
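
On GNU/Linux, a middle path is to let find match only numeric WIDTHxHEIGHT suffixes, which skips files like “wish-merry-xmas.jpg” entirely (a sketch – run the bare find first to inspect the list):

find . -regextype posix-extended -regex '.*-[0-9]+x[0-9]+\.(jpe?g|png|gif)$'
find . -regextype posix-extended -regex '.*-[0-9]+x[0-9]+\.(jpe?g|png|gif)$' -delete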

Do you have a better method?


How to autopost your blogs and Twitter feed to Google Plus pages

HEADS UP!!! Hootsuite is offering a very cool 50% off on a pro membership (which allows you unlimited autoposts). Use Coupon Code: HOOTXA90

Please note that this does not work with Google Plus profiles, only pages – like this Vidyut page. Google has recently released an API to select partners, which allows posting to Google Plus pages. Of these, Hootsuite is the only one that appears to be free. You can also use it as your social media manager, as it supports scheduling posts and viewing and posting to several streams simultaneously.

Many of us are not yet active on Google Plus, but would like to distribute our content there. Currently, the free account on Hootsuite allows you five social media profiles and two RSS streams, which can post to one social media profile in an automated manner. It is a simple matter to add your Google Plus page and plug your blog feed into it. However, plugging in your Twitter updates is not so simple.

You will need a server where you can serve PHP files. I have no idea how secure (or not) the script is, so it is best to isolate it from any production site you have – on a subdomain, for example. If you don’t have your own website, you could easily sign up for some free webhosting; such accounts are usually limited, but should be adequate for this purpose as long as they support curl. Security will not be such a big risk either, considering that there will be nothing beyond the script to get at. The host’s specifications will mention if curl is available, or you can simply upload the script and see if it works – it will work if its needs are met. If not, it is a free account anyway; try another place. You could also buy hosting, if you like spending money.

I am using Tweetledee, a fairly primitive application for grabbing feeds off Twitter. You can download it here. The website has Tweetledee Usage Examples for setting it up, but if you are somewhat skilled, here is what you have to do.

Create a Twitter application from your Twitter developer site (you will have to log in with your Twitter account). You only need read access, so the default settings should be fine. For the application URL, plug in the URL of wherever you are planning to host the script.

Download the Tweetledee application and unpack it onto your computer. You will only be uploading the tweetledee folder inside, but first, add your application details from Twitter into the file tweetledee_keys.php, which can be found in tweetledee/tldlib/keys/.

Surf to http://your.script.host/tweetledee-folder-if-not-root/userrss.php – if everything is working, you will see an RSS feed of your Twitter posts here that you can plug into Hootsuite for automated posting. By default you get 25 posts, but you can increase that up to 200 by adding a parameter “c”:
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?c=200
You can get another user’s timeline by adding their handle as a parameter (mine, in the example):
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?c=200&user=vidyut
You can exclude the replies:
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?xrp=1

However… there are several things to fix.

The Tweetledee script is fairly primitive in the sense that it simply gives you your timeline as RSS. It does not currently cache the results, so simply refreshing the page a few times may push you past your API limit. This should not be a problem if the script is called only when needed, but if the page is available on the internet, there will be spiders and bots and possibly other traffic that may max you out and make your feed unavailable for Hootsuite. A simple fix is to check your logs after Hootsuite accesses the script for the first time, and configure your server to serve only that user-agent, returning a 403 for all other requests. This works brilliantly. Not to mention it prevents others from misusing your Tweetledee tools for their own purposes (the scripts do several things, like pull search results or timelines of other users – read the documentation).
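
On Nginx, that user-agent gate could look something like the sketch below (the location path and the exact user-agent string are assumptions – pull the real string from your access log; Apache users can do the same in .htaccess):

# inside your server { } block
location /tweetledee/ {
    if ($http_user_agent !~* "Hootsuite") {
        return 403;
    }
}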

The Tweetledee application gives you your timeline as seen on your profile page, and directly plugging it into Hootsuite will post a lot of unnecessary updates (not desirable, since you can only make 5 updates an hour with a free account). Besides, if you are a social media addict to the point of seeking a fix like this, you probably make way more than five tweets an hour. So we hack this further.

Now the method bifurcates a bit. If you are paying for a Hootsuite account (HootSuite Pro Free Trial), then you will likely not have to worry about the five-tweets-an-hour limit. (Please note that you should still limit access to the application by user-agent, or you will likely keep running out of API calls.)

We can refine our tweet stream with Yahoo Pipes

However, if you are only using a free account, all is not lost. You can still get your best tweets on your Google Plus page, even if not all. To do this, instead of using your profile feed, you use the search feed by “popular” for your handle, which will have tweets you make as well as replies and mentions.

http://your.script.host/tweetledee-folder-if-not-root/searchrss.php?q=vidyut&rt=popular

The best part is that they will be organized so that the “top” tweets – which are your best-viewed tweets – show first. You will be doing a considerable amount of weeding of this RSS, so be sure to grab the maximum number of tweets the API allows, which is 200.

Plug this RSS into your Yahoo Pipes, and filter out tweets not made by you as well as any tweets made by you that begin with “@” (which means they are replies to someone). You are left with pure gold. You can additionally sort these tweets by time of publishing to get something that resembles a timeline of your best tweets, or you can use it as it is, so that your best tweets are posted on priority regardless of when you tweeted them. This will largely depend on your tweeting habits. If you tweet like conversation, you may want them in sequence. If you tweet individual tweets, you may want the best leading.

You can also experiment with other ideas like marking tweets you want to share on your Google Plus page as favorites, and using the favorites feed:
http://your.script.host/tweetledee-folder-if-not-root/favoriterss.php?c=200&user=vidyut

Now, when you output this pipe as RSS and plug it into Hootsuite, your Google Plus page will keep updating with your best tweets (or all of them, if you are paying to remove limits and not selecting only the best). While it will only post five tweets an hour, it will not repeat items it has already posted, so if you have made fewer than five good tweets in that hour, your backlog will slowly get cleared – as it also will, every hour, while you are offline.

Hope this helps.

If you have a better idea, tell me. Also let me know if anything does not work.

Fix: Realtek SD card reader not working with Ubuntu

I have an RTS5209 Realtek SD card reader on my laptop, which was not reading any card inserted. The following instructions are likely to work for other versions like the RTS5229 or RTS5289 as well.

Turns out I needed to install the driver.

Realtek offers drivers on its website for Ubuntu/Linux. Download the driver and unpack into a temporary directory.

Open a terminal and cd into the directory where you unpacked the file.
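
Something like this, assuming the archive unpacks into an rts_pstor folder as it did for me (the archive name is an assumption – use whatever you downloaded):

mkdir -p ~/tmp && cd ~/tmp
tar xvf rts_pstor.tar.bz2    # archive name assumed
cd rts_pstor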

Then enter one after the other:

make

sudo make install

sudo depmod

You will have to reboot your computer for the driver to load, or you can enter this command:
sudo modprobe rts_pstor

This is for the RTS5209. I believe for the RTS5229 it is (hint: read the folder name):
sudo modprobe rts5229

However, this only works on older kernels and on 3.9+ you get errors like:

vidyut@vidyut-Compaq-435-Notebook-PC:~/tmp/rts_pstor$ make
sed "s/RTSX_MK_TIME/`date +%y.%m.%d.%H.%M`/" timestamp.in > timestamp.h
cp -f ./define.release ./define.h
make -C /lib/modules/3.8.0-26-generic/build/ SUBDIRS=/home/vidyut/tmp/rts_pstor modules
make[1]: Entering directory `/usr/src/linux-headers-3.8.0-26-generic'
CC [M] /home/vidyut/tmp/rts_pstor/rtsx.o
/home/vidyut/tmp/rts_pstor/rtsx.c:916:22: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘rtsx_probe’
/home/vidyut/tmp/rts_pstor/rtsx.c:1080:23: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘rtsx_remove’
/home/vidyut/tmp/rts_pstor/rtsx.c:1106:11: error: ‘rtsx_probe’ undeclared here (not in a function)
/home/vidyut/tmp/rts_pstor/rtsx.c:1107:2: error: implicit declaration of function ‘__devexit_p’ [-Werror=implicit-function-declaration]
/home/vidyut/tmp/rts_pstor/rtsx.c:1107:24: error: ‘rtsx_remove’ undeclared here (not in a function)
/home/vidyut/tmp/rts_pstor/rtsx.c:485:12: warning: ‘rtsx_control_thread’ defined but not used [-Wunused-function]
/home/vidyut/tmp/rts_pstor/rtsx.c:596:12: warning: ‘rtsx_polling_thread’ defined but not used [-Wunused-function]
/home/vidyut/tmp/rts_pstor/rtsx.c:745:13: warning: ‘quiesce_and_remove_host’ defined but not used [-Wunused-function]
/home/vidyut/tmp/rts_pstor/rtsx.c:780:13: warning: ‘release_everything’ defined but not used [-Wunused-function]
/home/vidyut/tmp/rts_pstor/rtsx.c:790:12: warning: ‘rtsx_scan_thread’ defined but not used [-Wunused-function]
/home/vidyut/tmp/rts_pstor/rtsx.c:816:13: warning: ‘rtsx_init_options’ defined but not used [-Wunused-function]
cc1: some warnings being treated as errors
make[2]: *** [/home/vidyut/tmp/rts_pstor/rtsx.o] Error 1
make[1]: *** [_module_/home/vidyut/tmp/rts_pstor] Error 2
make[1]: Leaving directory `/usr/src/linux-headers-3.8.0-26-generic'
make: *** [default] Error 2
vidyut@vidyut-Compaq-435-Notebook-PC:~/tmp/rts_pstor$

The fix for this is to edit the file rtsx.c (the one throwing the errors): remove “__devinit” and “__devexit”, and replace “__devexit_p(...)” with just its argument, leaving the rest of the code (and the rest of each line they occur in) intact.

Like so:

static int __devinit rtsx_probe(struct pci_dev *pci, const struct pci_device_id *pci_id)

Becomes:

static int rtsx_probe(struct pci_dev *pci, const struct pci_device_id *pci_id)

static void __devexit rtsx_remove(struct pci_dev *pci)

Becomes:

static void rtsx_remove(struct pci_dev *pci)

.remove = __devexit_p(rtsx_remove),

Becomes:

.remove = rtsx_remove,
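
If you would rather not edit by hand, a sed one-liner along these lines should make the same three substitutions (a sketch – it keeps an rtsx.c.bak backup; check the result before building):

sed -i.bak -e 's/__devexit_p(\([^)]*\))/\1/g' -e 's/__devinit //g' -e 's/__devexit //g' rtsx.c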

Now, when you run make, the process should complete without errors.

If your SD card *still* doesn’t mount, check your dmesg like so:

sudo dmesg

If it shows something like:
[ 694.058432] systemd-hostnamed[2516]: Warning: nss-myhostname is not installed. Changing the local hostname might make it unresolveable. Please install nss-myhostname!

Try installing the library:
sudo apt-get install libnss-myhostname

If it still doesn’t work, I don’t know what to do. This SHOULD work.

Add source URL to content copied from your site

Ever wonder how the URL gets added when you copy something on some websites? There are plugins and websites that do it (for money or registration), but they also appear to track clicks, so I was a bit paranoid. It isn’t so difficult, and it is very useful.

Basically, whenever content from the page is copied to the clipboard, the URL (or other text you want) gets added to the copied content automatically. This is useful if someone copies content from your website – often for email, as it turns out from my logs – since the page it was copied from gets added automatically.

I have also found it useful when I want to tweet quotes from an article on the site. Instead of copying the quote and url separately, selecting the text I want gives me the content for the tweet.

Here is the JavaScript I am using. Feel free to edit what gets appended to the copied content.

<script type="text/javascript">
function addLink() {
    var body_element = document.getElementsByTagName('body')[0];
    var selection = window.getSelection();
    var pagelink = "<br /><br />Source: " + document.location.href; // change this if you want
    var copytext = selection + pagelink;
    // Put the selection plus the link into an off-screen div and select that instead
    var newdiv = document.createElement('div');
    newdiv.style.position = 'absolute';
    newdiv.style.left = '-99999px';
    body_element.appendChild(newdiv);
    newdiv.innerHTML = copytext;
    selection.selectAllChildren(newdiv);
    // Clean up the div once the copy has happened
    window.setTimeout(function() {
        body_element.removeChild(newdiv);
    }, 0);
}
document.oncopy = addLink;
</script>

Redirect www to non-www on Nginx

How do you redirect the www.domain.com version of your website to the domain.com version on Nginx?

Here is how. You basically have to make two server blocks. The block with your normal configuration should be the version you want to keep; the version you want to redirect away should have a simple rewrite rule alone.

For example, to redirect www.vidyut.net to vidyut.net, you set up your server blocks like this:
server {
    server_name www.vidyut.net;
    rewrite ^(.*) http://vidyut.net$1 permanent;
}
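
Equivalently, if your Nginx is reasonably recent, a return is the cleaner form for a whole-site redirect:

server {
    server_name www.vidyut.net;
    return 301 http://vidyut.net$request_uri;
}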

server {
    server_name vidyut.net;
    # Your
    # normal
    # server
    # configuration
    # goes
    # here
}
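
To check it after reloading Nginx, a quick curl should show the permanent redirect:

curl -I http://www.vidyut.net
# HTTP/1.1 301 Moved Permanently
# Location: http://vidyut.net/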

Random Tweet Assistant if you hate sickular people

After getting thoroughly fed up with highly repetitive trolls not even able to insult with any creativity, here is a writing aid to help single-digit IQs insult “sickulars” with some flair at least. I mean, if you must look like an idiot, at least look like a pretty idiot, yes?

Most of these are conversation aids, perhaps not proper insults, but at least they give you something to say beyond “you anti-national, you!”

While right wing trolls are an overwhelming area of boredom, there are others. I’ll check how this page goes before creating more insult assistants.

Feel free to add suggestions (with or without source) in the comments.

How to install ioncube loader on Ubuntu in one line of code

To install the ioncube loader on Ubuntu, paste this line AS ROOT:

cd /usr/local && wget http://downloads2.ioncube.com/loader_downloads/ioncube_loaders_lin_x86-64.tar.gz && tar xzf ioncube_loaders_lin_x86-64.tar.gz && echo "zend_extension=/usr/local/ioncube/ioncube_loader_lin_5.3.so" | sudo tee /etc/php5/conf.d/ioncube.ini

If you are using nginx, that would be:

cd /usr/local && wget http://downloads2.ioncube.com/loader_downloads/ioncube_loaders_lin_x86-64.tar.gz && tar xzf ioncube_loaders_lin_x86-64.tar.gz && echo "zend_extension=/usr/local/ioncube/ioncube_loader_lin_5.3.so" | sudo tee /etc/php5/fpm/conf.d/ioncube.ini

Please note that if you are using symlinks to maintain a single php.ini and conf.d folder instead of separate ones for php5 and php5-fpm (a good idea if you switch between apache2 and nginx), either one of the lines will work.

Restart apache2 or php5-fpm, as the case may be:

sudo service apache2 restart

or

sudo service php5-fpm restart
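
One more check: the 5.3 in the loader filename above should match your PHP version, and once everything is in place the PHP version banner mentions the loader:

php -v
# look for a line like "with the ionCube PHP Loader" in the output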

Chromium starts about:blank tabs on typing into dashboard in Ubuntu

This is a strange and inexplicable error: typing into the Ubuntu dashboard spawns new tabs in Chromium. This happens even if Chromium is not running – it simply launches and spawns one or more tabs.

Sometimes this also happened when the software updater started.

I had Chromium as the default browser earlier and made Firefox the default to fix this problem. It did not work.

What worked was logging out and logging back into my Google chat account in Online Accounts.

No idea why this is so, but it turns out that it is a Chromium bug, and what I did intuitively (after trying many other things that didn’t work) was a fix.

Nginx: upstream timed out (110: Connection timed out)

Error 110: Connection timed out while reading response header from upstream

Sometimes an Nginx web server seems to load PHP pages that do a lot of XML parsing really slowly. Often the page doesn’t load, or the connection times out. This is seen more on pages where the PHP code parses large XML files and outputs data only when all the parsing is complete. The Nginx web server times out before PHP returns output.

We have to make the Nginx web server wait longer before giving up on the upstream.

This is a typical error I get:

upstream timed out (110: Connection timed out) while reading response header from upstream

or

connect() to unix:/var/run/php5-fpm.sock failed (2: No such file or directory) while connecting to upstream

and such. The second one is inexplicable, since everything works when it is not timing out, but I have often seen these two together.

The fix that works is increasing the timeout.

In the server configuration – /etc/nginx/nginx.conf on Ubuntu/Debian – add this line inside the http {....} block (or edit the commented-out version already there):
fastcgi_read_timeout 300s;
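
Before restarting, it is worth having Nginx validate the edited configuration:

sudo nginx -t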

Restart the Nginx web server (as root or with sudo).
service nginx restart

There should be an immediate improvement when parsing large XML files. If you are still having problems, raise the number until they are resolved.
