Category Archives: Uncategorized

WordPress with MariaDB instead of MySQL

So I heard good things about MariaDB and decided to switch from MySQL to MariaDB. MariaDB is a fork of MySQL developed by MySQL's original developers, and it is intended as a drop-in replacement – meaning all your commands and databases from MySQL should continue to work seamlessly after the switch.

Tall claim, but after years of working with webservers, it isn’t too tough to know that even a routine upgrade can break things. Here, however seamlessly, the DATABASE management software itself was being replaced. Only a complete novice would believe “as advertised” to the point of not being worried.

My biggest fear was breaking my blogs. Backups are there, but… it is unpleasant to see your precious sites not working, and I was apprehensive.

So anyway, I did it.

Added the repository (these are my instructions, but MariaDB helpfully provides a repository configurator customized for your operating system, its version and the MariaDB version, which you should totally use).

sudo apt-get install software-properties-common
sudo apt-key adv --recv-keys --keyserver hkp://keyserver.ubuntu.com:80 0xcbcb082a1bb943db
sudo add-apt-repository 'deb http://mirrors.hustunique.com/mariadb/repo/5.5/ubuntu saucy main'

I’d done paranoid backups to the nth degree before, as you should too, but I won’t bore you with the details. Suffice it to say that I had 3 of each database AND a snapshot of my VPS to restore with “one click” if I got itchy AND I copied the mysql directory anyway (I really love my blogs. Really). I think this was mostly of therapeutic value after the first backup, but hey, it was good for my blood pressure.
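For the curious, the database part of those backups was nothing fancier than something along these lines – paths and the data directory location are placeholders, adjust for your setup:

```shell
# Dump all databases into one timestamped file; --single-transaction gives a
# consistent snapshot of InnoDB tables without locking them
mysqldump --all-databases --single-transaction -u root -p > all-dbs-$(date +%F).sql

# Belt and braces: copy the raw data directory too (stop MySQL first so the copy is consistent)
sudo service mysql stop
sudo cp -a /var/lib/mysql /var/lib/mysql.backup-$(date +%F)
sudo service mysql start
```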

Updated and installed MariaDB.

sudo apt-get update
sudo apt-get install mariadb-server
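Once the install finishes, it is easy to confirm that the server really came back as MariaDB and that the old databases are intact:

```shell
# The version string should now mention MariaDB instead of MySQL
mysql --version

# All your old databases should still be listed
mysql -u root -p -e "SELECT VERSION(); SHOW DATABASES;"
```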

The only pain here was that the repository I used was agonizingly slow to download from, which really did not help my anxiety levels, since I’m used to the blazing fast Ubuntu repositories. Or perhaps it was a temporary patch of bad network I hit.

Regardless, if you are superstitious, you may want to avoid this one.

After a wait that almost had me too old to care, the installation was done.

That is it. There was no noticeable difference to my site, except that it seemed slightly faster. The configuration file got replaced, but the defaults are good enough that the blogs are completely normal. I expect the performance may get even better once I get around to tweaking it, but this is good already.

The backups did not get used. A textbook “drop in”. Zero hassle.

Do it already. The only cure for your wondering is finding out.


Setting up live blogging that works with the latest version of WordPress

True to form, I want big things for my blog, but they aren’t easy. This time the idea came from a reader who wanted my commentary tweet series on current events to be done instead as a live blog, so that they could read all the ideas at once in one place and they would also remain easily accessible.

Great idea, except all the plugins I came across for live blogging were outdated and not working correctly with the latest WordPress, with the exception of 24liveblog, which I was not so keen on, as the content does not reside on my server, but theirs (though it is free and they promise never to take it down).

If you are fine with the content being hosted on another service, look no further: 24liveblog is good and free, and the resulting liveblog can be embedded on your site with the code provided.

I wanted to make the plugins work, as I wanted the content to reside on my server AND I wanted a tweet linking to the post to go out with each update.

So I have updated the live blogging plugin to fix several issues with tweets not getting posted to Twitter. You can download Live Blogging Plus here.

The plugin is capable of delivering the live blog more efficiently using meteor, so I am planning to set up a meteor server for it to use. Stay tuned.


How to autopost your blogs and Twitter feed to Google Plus pages

HEADS UP!!! Hootsuite is offering a very cool 50% off on a pro membership (which allows you unlimited autoposts). Use Coupon Code: HOOTXA90

Please note that this does not work with Google Plus profiles, only pages – like this Vidyut page. Google has recently released an API to select partners, which allows posting to Google Plus pages. Of these, Hootsuite is the only one that appears to be free. You can also use it as your social media manager, as it supports scheduling posts and viewing and posting to several streams simultaneously.

Many of us are not yet active on Google Plus, but would like to distribute our content there. Currently, the free Hootsuite account allows you to have 5 social media profiles and 2 RSS streams, each of which can post to one social media profile in an automated manner. It is a simple matter to add your Google Plus page and plug your blog feed into it. However, plugging in your Twitter updates is not so simple.

You will need a server where you can serve PHP files. I have no idea how secure (or not) the script is, so it is best if it is isolated from any production site you have – as a subdomain or something. If you don’t have your own website, you could easily sign up for free webhosting; such accounts are usually limited, but should be adequate for this purpose as long as they support cURL. Security is not such a big risk either, considering there will be nothing beyond the script to get at. The host's specifications will mention whether cURL is available, or you can simply upload the script and see if it works – if its needs are met, it will. If not, it is a free account anyway; try another place. You could also buy hosting, if you like spending money.
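If you would rather check for cURL up front than upload Tweetledee and hope, a tiny probe does the job. This is a hypothetical check.php of my own, not part of Tweetledee – upload it and open it in the browser:

```php
<?php
// check.php - prints whether the cURL extension Tweetledee needs is available
echo function_exists('curl_init') ? 'cURL is available' : 'cURL is NOT available';
```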

I am using Tweetledee, a fairly primitive application for grabbing feeds off Twitter. You can download it here. The website has Tweetledee Usage Examples for setting it up, but if you are somewhat skilled, here is what you have to do.

Create a Twitter application from your Twitter developer site (you will have to log in with your Twitter account). You only need read access, so the default settings should be fine. For the application URL, plug in the URL of wherever you are planning to host your application.

Download the Tweetledee application and unpack it onto your computer. You will only be uploading the tweetledee folder inside, but first, add your application details from Twitter into the file tweetledee_keys.php, which can be found in tweetledee/tldlib/keys/.

Surf to http://your.script.host/tweetledee-folder-if-not-root/userrss.php – if everything is working, you will see an RSS feed of your Twitter posts here that you can plug into Hootsuite for automated posting. By default, you get 25 posts, but you can increase that up to 200 by adding a parameter “c”, like:
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?c=200
You can get another user’s timeline by adding their handle as a parameter (mine, in the example):
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?c=200&user=vidyut
You can exclude the replies:
http://your.script.host/tweetledee-folder-if-not-root/userrss.php?xrp=1

However… several things, and their fixes.

The Tweetledee script is fairly primitive in the sense that it simply gives you your timeline as RSS. It does not cache the results currently, so simply refreshing the page a few times may put you out of your API limit. It should not be such a problem if the script is called only when needed, but if the page is available on the internet, there will be spiders and bots and possibly other traffic that may max you out and make your feed unavailable for Hootsuite. A simple fix for this is to check your logs after the script is accessed by Hootsuite for the first time and configure your server to only serve to their user-agent and return a 403 for all other attempts to access it. This works brilliantly. Not to mention it prevents others from misusing your tweetledee tools for their purposes (The scripts do several things, like pull search results or timelines of other users – read documentation).
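With Apache, for example, the user-agent lockdown could look something like this .htaccess fragment. Note that the "Hootsuite" token is an assumption – check your access logs for the exact user-agent string their fetcher actually sends, and match that:

```apache
# Allow only requests whose User-Agent contains "Hootsuite"; everyone else gets a 403
SetEnvIfNoCase User-Agent "Hootsuite" allowed_fetcher
<Files "userrss.php">
    Order Deny,Allow
    Deny from all
    Allow from env=allowed_fetcher
</Files>
```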

The Tweetledee application gives you your timeline as seen on your profile page, and directly plugging it into Hootsuite will post a lot of unnecessary updates (which is not desirable, since you can only make 5 updates an hour with a free account). Besides, if you are a social media addict to the point of seeking a fix like this, you probably make way more than five tweets an hour. So we hack this further.

Now the method bifurcates a bit. If you are paying for a hootsuite account (HootSuite Pro Free Trial) then you will likely not have to worry about the five tweets an hour limit. (Please note that you should limit the access to the application by user-agent or you will likely keep running out of api calls)

We can refine our tweet stream with Yahoo Pipes.

However, if you are only using a free account, all is not lost. You can still get your best tweets on your Google Plus page, even if not all. To do this, instead of using your profile feed, you use the search feed by “popular” for your handle, which will have tweets you make as well as replies and mentions.

http://your.script.host/tweetledee-folder-if-not-root/searchrss.php?q=vidyut&rt=popular

The best part is that they will be organized so that the “top” tweets – which are your best viewed tweets – show first. You will be doing a considerable amount of weeding of this RSS, so be sure to grab the maximum number of tweets the api allows, which is 200.

Plug this RSS into your Yahoo Pipes, and filter out tweets not made by you as well as any tweets made by you that begin with “@” (which means they are replies to someone). You are left with pure gold. You can additionally sort these tweets by time of publishing to get something that resembles a timeline of your best tweets, or you can use it as it is, so that your best tweets are posted on priority regardless of when you tweeted them. This will largely depend on your tweeting habits. If you tweet like conversation, you may want them in sequence. If you tweet individual tweets, you may want the best leading.

You can also experiment with other ideas like marking tweets you want to share on your Google Plus page as favorites, and using the favorites feed:
http://your.script.host/tweetledee-folder-if-not-root/favoriterss.php?c=200&user=vidyut

Now, when you output this pipe as RSS and plug it into Hootsuite, your Google Plus page will keep updating with your best tweets (or all of them, if you are paying to remove limits and not selecting only the best). While it will only update five tweets an hour, it will not repeat items it has already posted, so if you have made fewer than 5 good tweets in an hour, your backlog will slowly get cleared – as it also will, every hour, while you are offline.

Hope this helps.

If you got a better idea, tell me. Also let me know if anything does not work.

Add source url to content copied from your site

Ever wonder how the url gets added when you copy something on some websites? There are plugins and websites that do it (for money or registration), but they also appear to track clicks, so I was a bit paranoid. It isn’t so difficult and is very useful.

Basically, what happens is that whenever content from the page is copied to the clipboard, the url (or other text you want) gets added to the copied content automatically. This is useful when someone copies content from your website – often for email, as it turns out from my logs. The page it got copied from gets credited automatically.

I have also found it useful when I want to tweet quotes from an article on the site. Instead of copying the quote and url separately, selecting the text I want gives me the content for the tweet.

Here is the JavaScript I am using. Feel free to edit what gets added to the copied content (replace the square brackets with angle brackets before using – the code doesn’t get posted as content when written as real script tags).

[script type="text/javascript"]
function addLink() {
    var body_element = document.getElementsByTagName('body')[0];
    var selection = window.getSelection();
    // Text appended to whatever gets copied - change this if you want
    var pagelink = "<br /><br />Source: " + document.location.href;
    var copytext = selection + pagelink;
    // Render the combined text in an off-screen div and select it, so the
    // copy operation picks up the selection plus the source link
    var newdiv = document.createElement('div');
    newdiv.style.position = 'absolute';
    newdiv.style.left = '-99999px';
    body_element.appendChild(newdiv);
    newdiv.innerHTML = copytext;
    selection.selectAllChildren(newdiv);
    // Remove the helper div once the copy has completed
    window.setTimeout(function() {
        body_element.removeChild(newdiv);
    }, 0);
}
document.oncopy = addLink;
[/script]

How to install ioncube loader on Ubuntu in one line of code

To install ioncube loader on Ubuntu, AS ROOT paste this line:

cd /usr/local && wget http://downloads2.ioncube.com/loader_downloads/ioncube_loaders_lin_x86-64.tar.gz && tar xzf ioncube_loaders_lin_x86-64.tar.gz && echo "zend_extension=/usr/local/ioncube/ioncube_loader_lin_5.3.so" | sudo tee /etc/php5/conf.d/ioncube.ini

If you are using nginx, that would be:

cd /usr/local && wget http://downloads2.ioncube.com/loader_downloads/ioncube_loaders_lin_x86-64.tar.gz && tar xzf ioncube_loaders_lin_x86-64.tar.gz && echo "zend_extension=/usr/local/ioncube/ioncube_loader_lin_5.3.so" | sudo tee /etc/php5/fpm/conf.d/ioncube.ini

Please note that if you are using symlinks and maintaining a single php.ini and conf.d folder instead of separate ones for php5 and php5-fpm (good idea if you switch between apache2 and nginx), either one of the lines will work.

Restart apache2 or php5-fpm, as the case may be.

sudo service apache2 restart

or

sudo service php5-fpm restart
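Either way, you can confirm that the loader was actually picked up:

```shell
# The ionCube PHP Loader should show up in both outputs
php -v
php -m | grep -i ioncube
```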

How to create an admin area on fake domain

Often, you need to have areas of your site that you access through a browser, that you don’t want anyone else to access. These can be control panels or scripts with phpinfo or your apc cache monitoring script.

What I do is create a fake domain for these and give them their own site – for example, a virtual host for the domain “admin.area”, which of course is not a real domain name or registered anywhere. Remove the index file, forbid directory browsing and place your scripts there.

Plug in your server IP address with the fake domain you created (admin.area) into the hosts file of any computer that needs access.
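The hosts entry itself is a single line – the IP below is a placeholder for your server's real address (the file is /etc/hosts on Linux and Mac, C:\Windows\System32\drivers\etc\hosts on Windows):

```
203.0.113.10    admin.area
```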

If you are using varnish, you can also redirect any queries to this domain to your main website for safety and access through the backend port.

You can easily disable the virtual domain altogether when done, without interfering with your production sites in any way or risking accidental permission changes or other problems.

How to forward only .onion or .i2p urls selectively

If you are like me and don’t need to use TOR so much for anonymity as for accessing interesting hidden sites, the overall slow speed of TOR probably bothers you for normal browsing. The need to toggle proxies or start two separate browsers probably bugs you too.

Now I have found I2P, which is similar to TOR in the sense of being an anonymous, decentralized network, but it is not a general-purpose proxy at all – so configuring the browser to forward all queries via the I2P network means that regular sites won’t work at all!

I need:

  • Normal browsing directly over the internet for regular urls
  • Routing .onion urls to TOR
  • Routing .i2p urls to I2P

I dare say this fix will also work for any other such networks I may not know of.

Before we begin, an extremely important WARNING:

If you use TOR for Anonymity, then this “how to” may compromise your anonymity for the same reasons the TORbutton was discontinued. Mixing normal browsing with TOR may result in leaks of identity or inadvertent access of normal urls which you intend to use over TOR directly – which could be disastrous, particularly if your safety depends on it. YOU ARE WARNED. This “How To” ASSUMES THAT YOU DO NOT HAVE ANONYMITY AS A PRIORITY.

There are many people who would simply like to see various sites, or have TOR sites they interact with regularly – like TORMail users, for example. Or people who have set up their own sites or are reading forums or otherwise engaged in activity that they don’t think will be legally problematic.

This guide is for them. And I hope their tribe increases.

Here’s how to see normal urls directly on the internet, while using TOR and I2P for .onion and .i2p sites respectively.

Step 1: Install Privoxy. Privoxy is a transparent proxy that does a lot of other useful things too. Visit their website, read up, and if you like (and want to continue this how-to), install Privoxy as per the instructions for your OS. This assumes that you already have I2P and/or TOR installed – if not, go do that now, or you are wasting your time reading this post. I use TOR installed from the PPA and not the browser bundle, so that it can run in the background and be used as needed – or stopped independently of the browser if I’m not using TOR. This is important, as I browse all day, and having the browser shut down when I’m not using TOR would be most inconvenient. There is no need to waste bandwidth on TOR unnecessarily either.

Step 2: Configure your browser to use Privoxy, as per the instructions given on their site. Basically, this means setting your network proxy to 127.0.0.1:8118 <– this is the port for Privoxy. Note: DO NOT add the socks proxy for TOR here, or the I2P settings.

Step 3: Edit your Privoxy configuration file. On Ubuntu, installed from the PPA, it is found at /etc/privoxy/config – your mileage may vary depending on your OS and how you installed Privoxy; as a rough guide, it will be found in the root of the Privoxy folder.

Step 4: At the end of that file, add:

forward-socks5   .onion   127.0.0.1:9050 .
forward          .i2p     127.0.0.1:4444 .

Done.

That is it.

Now, your normal browsing will be unaffected by either TOR or I2P Network, while .onion and .i2p urls will get forwarded correctly and accessible seamlessly.

Enjoy!

NOTE: While this does nothing special to compromise your safety, it may be bypassing some protective feature in case someone is trying to find out your identity. I have no clue what it does on the safety front, and I highly recommend assuming that this is unsafe till someone more knowledgeable can verify it or suggest better methods.

If, while using this method, you find that you need to use anonymous features of TOR, I highly recommend starting a “TOR Browser Bundle” browser separately. This can be configured to use different ports so as to not interfere with your already installed TOR.

NOTE 2: I know I am being repetitive, but I feel I must, seeing as how a lot of people use TOR for activism or other things where getting their identity compromised could land them in a lot of trouble.

Install APC and fix “potential cache slam” problem

APC (Alternative PHP Cache) is an opcode cache that speeds up performance dramatically by caching the compiled bytecode of your PHP scripts, so they don’t get parsed and compiled afresh on every request.

How to install APC?

apt-get install php5-apc

You will have to enable it in the PHP configuration. Add the following to your php.ini file:

extension=apc.so

Alternatively, you could create a separate apc.ini file and put it in the conf.d directory.

If your error log shows a lot of messages about a “potential cache slam averted”, it is a bug. There is not much you can do about the bug itself, but you can turn slam defense off so that it doesn’t spam your logs (or cause other failures).

Add the following to your php.ini or apc.ini:

apc.write_lock = 1
apc.slam_defense = 0
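After restarting the webserver, a quick check confirms that APC is loaded and the settings took:

```shell
# "apc" should appear among the loaded modules
php -m | grep -i apc

# Both settings should echo back the values set above
php -r 'var_dump(ini_get("apc.write_lock"), ini_get("apc.slam_defense"));'
```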

Permission denied: make_sock: could not bind to address 0.0.0.0:80

Did you just type

service apache2 restart

or

/etc/init.d/apache2 restart

and get stuff like:

* Starting web server apache2 ulimit: 88: error setting limit (Operation not permitted)
(13)Permission denied: make_sock: could not bind to address 0.0.0.0:80
no listening sockets available, shutting down
Unable to open logs
Action 'start' failed.
The Apache error log may have more information.

It is unlikely to be a big crisis. It looks like you don’t have permissions – Apache needs to be started by the root user. Are you root? Try

sudo service apache2 restart

You’re welcome :p

How can tweets be archived for proof

Suhel Seth being sued by ITC and subsequently deleting some tweets raises the question of how tweets can be used as proof after deletion. The old method of screenshots raises questions of tampering and linking to the tweet is no use once it is deleted.


http://storify.com/vidyut/how-can-tweets-be-archived-for-proof