I have been really encouraged by the availability of images which have been released under Creative Commons licenses.
While there are a lot of icon sets out there, here are some of my "go to" places.
The first place I usually go for free icons is thenounproject.com. There is a growing community behind the endeavor, and its management and operations are being taken seriously.
Bush House: the BBC World Service is leaving its home after 71 years. Photo: Paul Grover via The Telegraph
There has recently been some discussion about the BBC selling its production facilities and moving from Bush House to somewhere else.[1][2][3] The BBC World Service has been a major player in radio and oral culture in Great Britain and around the world for 71 years. A lot of history has been reported by the service. And the BBC's records (including its archive) hold oral histories of a variety of world events from the last 71 years in a variety of languages (Wikipedia has a brief description of the collections at the BBC).
Christopher Middleton. 7:30 am BST 10 Jul 2012. For sale: Bush House, a landmark of BBC World Service history. The Telegraph online. http://www.telegraph.co.uk/culture/tvandradio/bbc/9386848/For-sale-Bush-House-a-landmark-of-BBC-World-Service-history.html [Link] [Accessed: 19 July 2012]
Jonathan Prynn. 11 July 2012. Buy a bit of BBC radio history… or an entire studio. London Evening Standard online. http://www.standard.co.uk/news/uk/buy-a-bit-of-bbc-radio-history-or-an-entire-studio-7935734.html [Link] [Accessed: 19 July 2012]
Paul Ridden. 12:41 pm 12 July 2012. Updated: BBC World Service equipment and memorabilia to go under the auctioneer's hammer. gizmag online. http://www.gizmag.com/bbc-world-service-bush-house-auction/23292/ [Link] [Accessed: 19 July 2012]
This post is an open draft! It might be updated at any time…
In this review I will be looking at the WordPress plugin Webonary and several associated issues. Regardless of the views expressed here, it should be stated that I have high hopes for Webonary's future. Some of the people working on Webonary are my colleagues, so I hedge my review with the understanding that this is not the final state of Webonary. I am excited that easy-to-use technology like WordPress is being used, and that minority language groups around the world have the opportunity to use free software like Webonary.
I run a website, wycliffe.me, for redirecting traffic (a URL redirector). But I need it to have a CRM sort of component to it. So I added some custom fields to Posts using Just Custom Fields. (I am using Posts, but I could just as well use a custom post type via Custom Post Type UI.) But now I want a summary of some of those fields in a special panel on the back-end. So I have collected some links to read and start hacking.
First I need to create an options page in the admin area: http://buildinternet.com/2010/01/create-custom-option-panels-with-wordpress-2-9/.
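The gist of that tutorial, as I understand it, is registering a page on the admin_menu hook with add_options_page(). A minimal sketch of the kind of panel I have in mind (the function names and page slug are placeholders of my own):

add_action('admin_menu', 'wm_register_summary_page'); // the 'wm_' names are hypothetical
function wm_register_summary_page() {
    // Adds a page under Settings; only users who can manage options see it.
    add_options_page('Contact Summary', 'Contact Summary', 'manage_options', 'wm-contact-summary', 'wm_render_summary_page');
}
function wm_render_summary_page() {
    echo '<div class="wrap">';
    echo '<h2>Contact Summary</h2>';
    // The custom-field query (next step) would print its results here.
    echo '</div>';
}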
Next I need a way to collect the data. So I looked for a plugin which can search my database and return fields… sort of like Views for Drupal. And voilà, there is such a plugin: Query Wrangler. (Query Posts might be another option, but I did not try it.) However, this plugin is not powerful enough: I cannot search all the fields created by my other plugins, only my own custom fields and content types. More power would be ideal.
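For reference, here is roughly what collecting those fields by hand looks like with WP_Query and a meta_query. The field key 'organization' is an invented example, and Just Custom Fields may store its values under different keys, so treat this as a sketch rather than a description of any plugin's actual behavior:

$contacts = new WP_Query(array(
    'post_type'      => 'post',
    'posts_per_page' => -1,
    'meta_query'     => array(
        array(
            'key'     => 'organization', // hypothetical custom field key
            'value'   => 'Wycliffe',
            'compare' => 'LIKE',
        ),
    ),
));
while ($contacts->have_posts()) {
    $contacts->the_post();
    // Print each matching post's title with the field value beside it.
    echo get_the_title() . ': ' . esc_html(get_post_meta(get_the_ID(), 'organization', true)) . '<br />';
}
wp_reset_postdata();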
I am all for OpenData and Open.NASA. But how does NASA being a government entity relate to how it "licenses" its data and software? What I mean is: shouldn't the things being "open sourced" be public domain rather than licensed content? I agree that creating a license which is not widely recognized is not useful; that is the whole point behind Creative Commons. But are there cases where NASA is "over-licensing" content that should instead be released into the public domain? Reference: CC Salon in Jan 2011, time segment 1:05:00, where Joi Ito talks about the issue. http://blip.tv/creative-commons/creative-commons-salon-mountain-view-what-does-it-mean-to-be-open-in-a-data-driven-world-4725230
What prevents NASA from putting the data and software it releases into the public domain, and what reasons are there for not doing so? Is that not more open?
Because I have been on the team doing the SIL.org redesign, I have been surveying the open-source landscape to see what is available to connect Drupal with DSpace data stores. We are planning on making DSpace the back-end repository, with another CMS running the presentation and interactive layers. I found a module, still in development, which parses DSpace's XML feeds. However, this is not the only thing that I am looking at. I am also looking at how we might deploy Omeka. Presenting the entire contents of a Digital Language and Culture Archive, and citations for its physical contents, is no small task. In addition to past content there is also future content. That is to say, archiving is not devoid of publishing, so there is also PKP (the Public Knowledge Project). (SIL also currently has a publishing house, whose content needs CSV or version control and editorial workflows, which interact with archiving and presentation functions.)
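For what it is worth, DSpace also exposes its content over OAI-PMH, which can be pulled apart from plain PHP without a dedicated module. A rough sketch of what that looks like; the endpoint URL below is a made-up example, since the actual path depends on how the repository is configured:

// Hypothetical OAI-PMH endpoint for a DSpace repository.
$endpoint = 'http://repository.example.org/oai/request?verb=ListRecords&metadataPrefix=oai_dc';
$xml = simplexml_load_file($endpoint);
if ($xml === false) {
    die('Could not fetch or parse the OAI-PMH response.');
}
foreach ($xml->ListRecords->record as $record) {
    // The Dublin Core fields sit inside their own namespaces.
    $dc = $record->metadata
                 ->children('http://www.openarchives.org/OAI/2.0/oai_dc/')
                 ->children('http://purl.org/dc/elements/1.1/');
    echo $dc->title . ' (' . $dc->identifier . ")\n";
}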
Omeka
Wally Grotophorst has a really good reflection on Omeka and DSpace. I am not sure that it is current, but it does present the problem space quite well. [1] Tom Scheinfeldt at Omeka also has a nice write-up on why Omeka exists, titled "Omeka and Its Peers". It is really important to understand Omeka's place in the ecosystem of content delivery to content consumers by qualified site administrators. [2]
@Mire talks about what DSpace could learn from Omeka. [3]
There is also a DSpace mailing list discussion of some DSpace technologies for mixing with OAI-ORE and Fedora, Omeka, and Drupal.
Wally Grotophorst. 4 March 2008. DSpace And Omeka. iNODE: The weblog of Digital Programs and Systems at George Mason University Libraries. http://timesync.gmu.edu/wordpress/?p=485 . [Accessed: 26 November 2011] [Link]
Tom Scheinfeldt. 21 September 2010. Omeka and Its Peers. http://omeka.org/blog/2010/09/21/omeka-and-peers/ [Accessed: 26 November 2011] [Link] [Also Posted on Tom's Blog]
@Mire. 20 May 2010. What DSpace could learn from Omeka. http://www.facebook.com/notes/mire/what-dspace-could-learn-from-omeka/393758568767 . [Accessed: 26 November 2011] [Link]
I set up another WordPress site and I wanted to transfer what I had written there to this site, so that all my writings would be together. This would include comments, links and attached media, and metadata about the post.
What I want a transfer plugin to do.
So I looked for a WordPress plugin to do that. I found two (and, as is always the case when I find more than one, I had to test them out and write up the results):
Xpost: Cross-post was the first plugin I found and it seemed to have a lot of really nice features.
Transfer: the main difference between the two, based on the authors' descriptions, is that this one said it also transferred images attached to the post.
So I tried Transfer first.
Transfer
However, when I installed Transfer, it said that it could not find the Zend Framework:
Warning: require_once(Zend/XmlRpc/Client.php) [function.require-once]: failed to open stream: No such file or directory in /home1/public_html/username/wordpress/wp-content/plugins/transfer/library/Aperto/XmlRpc.php on line 3
(Path values changed to protect the innocent.)
The plugin requires that one download Zend Framework Minimal (http://framework.zend.com/download/latest) and put the Zend folder under /wp-content/plugins/transfer/library/
I did this, and I would get the WordPress white screen of death. I was told that this white screen of death happened because my provider terminated a process (I had maxed out my user's memory allocation). This white screen happens on one of my installs but not on another under a different user, so I am not sure what is going on. Either way, neither WP install would transfer the post. To get around the white screen of death I had to deactivate the plugin by editing the database.
I had initially failed to read the install requirement for Zend, so I found another solution for adding Zend to WordPress.
So I knew I needed to install the Zend Framework. I am sort of surprised that DreamHost, my hosting provider, did not have Zend set up on my server in a way that WordPress would automatically detect. Oh well, is there a plugin for that? Uh, yes, there are like a gazillion! So I went with the first one: Zend Framework [or also in WP-Extend]. I loaded it and then added the helpful code found in the online WordPress forums.
Go to your wp-config.php and paste this right after the * @package WordPress part and before the // ** MySQL settings – You can get this info from your web host ** // line:
/** Zend Framework **/
function add_include_path ($path)
{
    foreach (func_get_args() as $path)
    {
        if (!file_exists($path) OR (file_exists($path) && filetype($path) !== 'dir'))
        {
            trigger_error("Include path '{$path}' does not exist", E_USER_WARNING);
            continue;
        }
        // Append the path to PHP's include path if it is not already there.
        $paths = explode(PATH_SEPARATOR, get_include_path());
        if (array_search($path, $paths) === false)
            array_push($paths, $path);
        set_include_path(implode(PATH_SEPARATOR, $paths));
    }
}
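Defining the function is only half of it; it then has to be called with the directory that contains the Zend folder. The path below is my guess at a typical layout for the Zend Framework plugin, so adjust it to wherever the Zend library actually lives on your install:

// Hypothetical path: point this at the directory that holds the Zend/ folder.
add_include_path(dirname(__FILE__) . '/wp-content/plugins/zend-framework/library');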
After I did both of these things all of my errors went away.
I did try a second plugin for installing the Zend Framework, WP-ZFF Zend Framework Full. This one said it would modify the include path, so I thought I could use it without modifying wp-config.php, but the plugin failed on import, so I deleted it.
So, in the sad case that I was not able to get Transfer to work, I moved on and decided to try Xpost.
Xpost
Xpost [on WP-Extend] was a breeze to set up, and I actually got it working for a simple post. However, I was not able to select the target category on the master WP install from the writer's WP install (the test post I used just went to the default category).
Xpost not getting categories available on the master WP install.
The box just says categories loading. This seems to be a problem reported by Nigel and by gulliver.
The test image was not transferred from the writer's WP install to the media library of the master WP install. Additionally, if the category of the post is changed on the master WP install, then the writer's WP install loses track of the post. This is only temporary: if the writer tries to cross-post the post again, the update fails and a red error message is shown. But if the writer tries a second time, then the original post on the master WP install is found and updated, including the "removed" category. However, that category was intentionally removed by the editor on the master WP install, so this creates a bit of a conflict. BTW: it would be nice to be able to select a special custom post type for imports.
It seems that Xpost was designed to broadcast out rather than to ingest.
I use MAMP for my local test environment. But I have recently moved beyond just PHP apps; I am also looking at using Tomcat, and I would like to mess around with DSpace locally and use Solr as well. I have found a couple of helps for adding these things to MAMP.
Drush: I also want Drush for working with Drupal. But this does not need to live in the MAMP folder. I just don’t know where else is safe. (I should have more on Drush later.)
One of the problems I am facing is that I really like apps like MacPorts, but I do not want to tinker with the core and default settings of my OS X machine. So I find that MAMP is a good alternative, but I cannot type a command on the command line and have all the dependencies download automatically. I recently found that I could do something like this with Homebrew… I have never used it before, but it looks to be the tool for the job. So I have collected a few tutorials, like: installing PHP 5.3, using Gmail as an SMTP server, and setting up Solr.
Jetpack is in no way new… but I have never installed it (it seems that half a million other people have, though). The only service I have used from Automattic is Akismet. Then about a month ago I installed After the Deadline as a Google Chrome plugin to help me with my spelling mistakes. It seemed to work, so I thought I would give it a go as a WordPress plugin.
What was new was that I had not integrated a sharing solution for readers of my blog. So as of now there is a "share this" option at the end of my posts.
Sharing options
Of course Sharedaddy, the sharing plugin, did not have a Google +1 sharing option, nor a del.icio.us sharing option. So I had to find some solutions. I found a fork of Sharedaddy on GitHub which had added Google+ and LinkedIn. (I am not on Google+, but I just joined LinkedIn last week as I was redoing my résumé.)
To add Delicious I followed a post by Ryan Markel to find the right share-service URLs.
Menus
The other thing I figured out this week was how to use the Menus feature under the Appearance tab. I have been using K2 since 2005 and have always thought that the menus in the default theme were sufficient; I have usually not had complex menu desires, so there was no real need to learn these new features. Now, however, I wanted to put several picture pages under the same menu. So voilà, it is done now.
New menu settings
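As an aside, for the Menus screen to drive a theme's navigation, the theme has to register a menu location and print it somewhere in a template; presumably K2 already takes care of this. For a theme that does not, the registration is only a couple of lines. The function and location names here are placeholders of my own:

add_action('after_setup_theme', 'wm_register_menus'); // the 'wm_' names are hypothetical
function wm_register_menus() {
    // Makes a "Primary Menu" location appear on the Appearance > Menus screen.
    register_nav_menus(array('primary' => 'Primary Menu'));
}
// Then, somewhere in the theme's header template:
// wp_nav_menu(array('theme_location' => 'primary'));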
Others
(Mostly RDFa and HTML5)
I also have a plugin that is adding Open Graph RDFa tags to my theme. My current version of K2 is HTML5, but it does not validate with the RDFa tags in it. So I was trying to get them to validate, but I have not been successful. I looked at this answer, which said to add something to the doctype. But then there are more answers too. Sometimes these answers are beyond me. I wish I had some structured learning in this subject area.
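For context, this is roughly the shape of what such a plugin does, plus the "add something to the doctype" fix, which (as I understand those answers) amounts to declaring the og prefix on the html element. The hooks are real WordPress, but the function names are my own, and this is a sketch, not the actual plugin's code:

add_filter('language_attributes', 'wm_add_og_prefix'); // function names are hypothetical
function wm_add_og_prefix($output) {
    // Declare the Open Graph prefix so RDFa-aware validators know what og: means.
    return $output . ' prefix="og: http://ogp.me/ns#"';
}

add_action('wp_head', 'wm_og_meta_tags');
function wm_og_meta_tags() {
    if (!is_singular()) {
        return;
    }
    // A minimal set of Open Graph properties for a single post.
    echo '<meta property="og:title" content="' . esc_attr(get_the_title()) . '" />' . "\n";
    echo '<meta property="og:type" content="article" />' . "\n";
    echo '<meta property="og:url" content="' . esc_url(get_permalink()) . '" />' . "\n";
}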
Why RDF?
And RDFa is the basis of Open Graph, the technology used to sync Facebook Likes between my site and Facebook.
I watched the State of the Word address by Matt. There are some very exciting things happening with WordPress. It is always interesting to think that WordPress and Facebook are almost the same age; they have both had a significant effect on the internet landscape.
In his State of the Word Speech, Matt mentioned that plugins which have not been updated in two years will be removed from the search results on WordPress.org/extend. My question is:
Why choose two years? Why not choose votes of “it doesn’t work” for the past two full points on the development cycle?
So if WordPress 3.2 is the newest release of WordPress, then all plugins which are not voted to have worked on at least 3.0 and above would get removed from the search results. With just two points on the development cycle it would probably be less time than two years, so what Matt is proposing is probably the more lenient strategy. But my question is not about what time depth, but rather why time depth at all. Why choose time depth rather than the dynamic of an audience saying that something works with the current version of WordPress?