Monday, June 6, 2016
WYEP's successful fundraising featured in The Current Magazine
Current's April Simpson HERE.
Sunday, April 24, 2016
913 Songs, 913 Tweets (almost) - Buffer/Zapier/Twitter: what worked, what didn't
Aka "How to Countdown 913 awesome songs and tweet about, with photos, from a spreadsheet."
Our listeners voted on the top 913 songs, which we then scheduled to play throughout the week as a countdown.
A while back I set up a twitterbot to tweet out the songs now playing on-air. WYEP DJ Joey Spehar hit me up on HipChat and asked if we could adjust the tweets to include the countdown number:
'Here's my thinking for the countdown tweets: '#374 The Flaming Lips "Race For The Prize" || #913Countdown'
Simple request...
The thing is, our twitterbot is a hacked-together mashup. A prototype, really. It pulls information from an NPR Digital Composer playlist feed; our Drupal CMS ingests the feed and posts to Twitter.
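Our real bot lives in Drupal, but the core idea fits in a few lines. Here's a rough Python sketch with a made-up feed URL and field names, posting through the tweepy library:

import time
import requests
import tweepy

# Hypothetical now-playing feed; ours really comes from NPR Digital Composer.
FEED_URL = "https://example.org/now-playing.json"

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

last = None
while True:
    song = requests.get(FEED_URL, timeout=10).json()  # {"artist": ..., "title": ...}
    key = (song["artist"], song["title"])
    if key != last:  # only tweet when the song changes
        api.update_status('Now playing: %s "%s"' % key)
        last = key
    time.sleep(60)  # poll once a minute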
Unfortunately, the place in the countdown isn't included anywhere in that pipeline. What to do?
When I got the countdown list from our Programming Director, it looked like this:
I have pretty much all the info I need. I added a quotation mark (") in column F to wrap the song title, and wrote a CONCATENATE function:
=CONCATENATE("#", E2, " ", C2, " ", F2, D2, F2, " || #913Countdown")
Which gives me the output:
#913 Grateful Dead "Casey Jones" || #913Countdown
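For anyone who'd rather script it than spreadsheet it, here's the same string-building as a quick Python sketch (the column-to-field mapping comes straight from the formula above: C is artist, D is title, E is the countdown number, and F just held the quote mark):

# The same tweet-building logic as the CONCATENATE formula above.
def countdown_tweet(number, artist, title):
    return f'#{number} {artist} "{title}" || #913Countdown'

print(countdown_tweet(913, "Grateful Dead", "Casey Jones"))
# -> #913 Grateful Dead "Casey Jones" || #913Countdown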
How do I get these tweets scheduled for Twitter from my spreadsheet?
As it turns out, there is a great tool called Zapier, which can connect a Google Spreadsheet to your Buffer account.
The gist of our Zapier setup:
The Trigger: When a row is updated in the Google Spreadsheet
The Action: Add to Buffer Schedule
The text source is the column containing the results of my CONCATENATE function above. For the schedule template, I combined the date and time columns to build a human-readable date: Wed 04/13/2016 10:01am -1h
The -1h (minus 1 hour) was necessary to get the time right... not sure why, since the timezone was set to Eastern. (Don't forget to keep a space before "-1h".)
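If you'd rather generate that schedule string in code than in a Zapier template, here's a rough Python sketch of the same thing, including the hour fudge (date and time formats assumed from the example above):

from datetime import datetime

def schedule_string(date_str, time_str):
    # date_str like "04/13/2016", time_str like "10:01am"
    dt = datetime.strptime(date_str + " " + time_str, "%m/%d/%Y %I:%M%p")
    # Keep the space before "-1h"; the template breaks without it.
    return dt.strftime("%a %m/%d/%Y %I:%M") + dt.strftime("%p").lower() + " -1h"

print(schedule_string("04/13/2016", "10:01am"))
# -> Wed 04/13/2016 10:01am -1h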
And presto... 913 songs scheduled to tweet! Only... our Buffer plan only allows 200 scheduled at a time. Also, anything over 99 triggers an email asking you to confirm that you really want to run such a large job.
But how did you get all those photos in there?
I later noticed that you can add photos based on a column in your spreadsheet. There was a quirk, though... and I don't quite remember what it was. For some reason, just adding a URL in the column wasn't working. Instead, I wound up building the URL in the Zapier template. We used: http://bit.ly/(google sheet column).
Our awesome interns then proceeded to curate photos and enter appropriate bitly codes into the spreadsheet.
This mostly worked!
What didn't work: tweets with photos larger than 3MB would fail. There was no email notification that they failed, so I didn't notice until I checked into Buffer. It would have been really great to get a warning when the tweet was scheduled, instead of waiting for it to fail.
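In hindsight, a pre-flight check would have caught those before they were ever scheduled. A minimal Python sketch, assuming the image host reports Content-Length (the bitly code below is made up):

import requests

MAX_BYTES = 3 * 1024 * 1024  # the 3MB photo limit we kept tripping over

def photo_too_big(url):
    # A HEAD request follows the bit.ly redirect to the actual image.
    resp = requests.head(url, allow_redirects=True, timeout=10)
    return int(resp.headers.get("Content-Length", 0)) > MAX_BYTES

if photo_too_big("http://bit.ly/exampleCode"):  # hypothetical bitly code
    print("Shrink this image before scheduling the tweet.")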
Also, nearing the end of our countdown we got the sad news that Prince had died. There was no way we could carry on counting down without taking a moment to celebrate the musical force that Prince was.
It was the last day of the NAB Show, and I was just getting ready to check out of our hotel. But first, I thought I would check in on the countdown, and finish up the scheduling.
Our marketing coordinator mentioned something in HipChat about preparing messaging about Prince, just in case. What? Eeek. Not Prince! I looked up the info... AP had verified that Prince had died.
I tuned into our stream, and we were playing Purple Rain. Odd... Purple Rain was indeed on our countdown, but it wasn't at this spot. Oh no... we are going wall-to-wall Prince. I mean, awesome, I love Prince. But...
Must. Stop. The. Tweet. Train.
Deleting, editing, or rescheduling a tweet in Buffer is relatively easy. Deleting 200 tweets is not. It requires a hover, click, and confirm for each. Man, would checkboxes be nice. I found a bit of a shortcut: I could tap my touchscreen where the delete button would be, and it would jump to the confirm dialog, which was always in the same spot. It was like a twisted Candy Crush.
We managed to only publish a couple of countdown songs ahead of their actual playtime, and were able to resume the countdown without too much trouble.
All in all, the tweet scheduling was a success. There was a big increase in impressions and engagement through the countdown. This was a good way to provide an additional level of digital content to WYEP listeners, allowing them to share in a sense of community and conversation celebrating music.
Tuesday, April 19, 2016
Steady Streaming on a Shoestring Budget
Background
91.3 FM WYEP, an independent AAA-format radio station based in Pittsburgh, Pennsylvania, needed to update its streaming setup.
Urgent Need
We were getting reports from listeners that the stream was cutting out, and it was sometimes not working at all.
We were running on a relatively inexpensive ($280/mo) service from a company called StreamGuys.
This provided an Icecast-based stream for up to 450 concurrent listeners.
We had access to a rudimentary log analyzer that generated relatively opaque reports. Unfortunately, we had no monitoring system in place to let us know if we were in danger of reaching our limit, or already over it.
We simply did not have the tools to monitor and react to the streaming needs of our listeners in real time.
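The watchdog we were missing is only a few lines of code. Here's a rough Python sketch of what it could look like if your provider exposes Icecast's status-json.xsl endpoint (Icecast 2.4+); the URL below is a placeholder:

import time
import requests

STATUS_URL = "http://streams.example.com/status-json.xsl"  # placeholder
LIMIT = 450  # our plan's concurrent-listener cap

def listener_count():
    stats = requests.get(STATUS_URL, timeout=10).json()["icestats"]
    sources = stats.get("source", [])
    if isinstance(sources, dict):  # a single mount comes back as an object
        sources = [sources]
    return sum(s.get("listeners", 0) for s in sources)

while True:
    n = listener_count()
    if n > LIMIT * 0.9:  # yell before we hit the cap, not after
        print("WARNING: %d listeners, nearing the %d cap" % (n, LIMIT))
    time.sleep(60)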
Fresh Eyes
I had no experience in broadcasting before I joined WYEP in December 2015.
My background is in product management, web development, and content management. I've worked in software development and for scrappy, budget-constrained startups.
So I asked: can't we serve our own stream?
I had (and have) a lot to learn. And since I work for a nonprofit, I am happy to be able to share our methods and results!
Technology Stack
Summary:
- 75mbps FiOS
- Barix Instreamer
- Ubuntu 14.04 (running in a virtual machine)
- Wowza Engine
- Wowza Cloud
- ffserver
- Peer5
The Barix Instreamer converts the audio from our mixing board to a 320kbps CBR Shoutcast stream. We already had this lying around. This could easily be swapped with a cheap computer running ffserver.
We chose Ubuntu 14.04 LTS as the host for Wowza and ffserver. It's fast, secure, and stable, and LTS stands for Long Term Support. It's also a tool I'm familiar with, and I can easily find documentation to make it do what I want. All of this could be done on Windows if you wanted.
The interesting bits
Wowza is a great piece of software. It has its quirks, but overall it does a fantastic job. The customer support is excellent, and there is seamless integration with Wowza Cloud. As long as you have your firewall set properly... but more on that in another post.
The main thing Wowza gives us is the ability to stream in HLS. Not only does HLS sound better at 128kbps than our prior stream, it also allows us to take advantage of a sexy P2P service I came across: Peer5.
Peer5 can increase our effective serving capacity by 80% or more by using P2P. It's like BitTorrent for streaming. Here's a typical dashboard snapshot for one day:
The blue bars in the top left indicate bandwidth that is being served by P2P rather than directly or via the CDN.
With all of these fancy new technologies, we still had holes in our service. During my research I discovered that the BBC went through a similar transition about a year ago. Change is disruptive, and can make consumers angry. What could we learn from the BBC? What could we improve on?
Can't hear us? We are listening.
We have decided to adopt a "No Listener Left Behind" policy. What that means in practical terms is: we will do everything we can to support our listeners' ability to tune in.
Which brings us to ffserver (part of the ffmpeg project). This open source tool has been around for years and is a Swiss Army knife for audio and video formats. It's fast, tried, and true. We are using it to serve up MP3 and Windows Media Player-compatible streams, among others. This helps us cover the bases for players not supported by Wowza.
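To give a flavor of what that looks like, here's a minimal sketch of an ffserver.conf that serves a 128kbps MP3 stream. This is not our production config, and the paths and mount names are placeholders; an ffmpeg process on the same box pushes audio into the feed at http://localhost:8090/feed1.ffm.

# Minimal ffserver.conf sketch (placeholder values throughout)
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 200
MaxClients 100

<Feed feed1.ffm>
  File /tmp/feed1.ffm
  FileMaxSize 5M
</Feed>

<Stream live.mp3>
  Feed feed1.ffm
  Format mp2             # ffserver's muxer name for MPEG audio streams
  AudioCodec libmp3lame
  AudioBitRate 128
  AudioChannels 2
  AudioSampleRate 44100
  NoVideo
</Stream>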
Now let's talk about scaling. Wowza Cloud offered to waive the $35 monthly fee for 12 months (a bargain anyway), which comes with 1 free terabyte of streaming. After that, the services offered through their partnership with Akamai provide very affordable and rock-solid scaling. Adding this to our mix was a no-brainer.
Practical Application
After making the switch, we are serving a peak of over 300 concurrent listeners (as of this writing, and growing). About 250 of those are consuming the HLS stream, and about half of those are enhanced by P2P through Peer5.
For listeners who come in through our website, we are able to scale with Wowza Cloud to whatever Akamai can handle. That's big. We are also using Wowza Cloud to serve streams to listeners behind restrictive firewalls. There are more elegant solutions, but this works for us right now, and means I can focus on our more pressing digital needs. We currently use less than half of our free 1TB per month.
Disruption
Sadly, some listeners have not been able to tune in for a few days. They are on smart TVs or internet radios: hardware that gets our stream URL from a service like TuneIn or Vcast. We are updating as many of these as we can, but the process has not gone as quickly as we would like. Still, it's a small number of listeners who are affected. Every week I get about two "your stream stopped working since the update" emails. I promptly chase down the service that has our outdated stream info and send them the latest.
Reports
We need a way to consolidate reports. We now have Wowza Cloud plus two servers in our VM environment, and Peer5 has its own dashboard. Counting Google Analytics, I have to go to five different places to get a full picture of our streaming metrics.
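Until we find a proper tool, even a small script could pull the numbers into one place. A Python sketch of the idea; every endpoint URL below is a made-up placeholder, since each service's real API differs:

import requests

ENDPOINTS = {  # all hypothetical placeholders
    "wowza_engine": "http://vm1.example.org:8087/stats",
    "ffserver": "http://vm2.example.org:8090/stats",
    "wowza_cloud": "https://cloud.example.com/stats",
    "peer5": "https://peer5.example.com/stats",
}

def snapshot():
    counts = {}
    for name, url in ENDPOINTS.items():
        try:
            counts[name] = requests.get(url, timeout=10).json().get("listeners", "?")
        except requests.RequestException:
            counts[name] = "unreachable"
    return counts

print(snapshot())  # one place to look instead of five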
Results... so far
- increased insight in real time
- greater flexibility
- higher quality audio
- ability to instantly scale on demand
- reliability
- reduced costs