[Deprecated] tobbenb's Docker Template Repository - WebGrab+Plus



Hi Monkeyair,

 

Do you have 'one' of the Dual/Quad TBS cards? I see they are circa £100 plus...

 

I wonder if anyone has successfully used more than one identical device in the same set-up?

 

I am ideally looking for a four-tuner (quad) set-up to prevent the occasional blocking due to schedule conflicts. Having HD would probably be a bonus, but I'm not sure how the Raspberry Pis or my relatively basic PC set-up would handle this, so it's not a deal breaker...

 

I'm so impressed with the functionality available; it's just a matter of tweaking it to deal with a few rough edges in my install...

 

All the best

 

Nigel

 

At the minute I use 2 x TBS6205, giving 8 tuners.  They work well now after some teething problems (Thanks to Saarg and CHBMB for the help).  I'm looking to change one of them in the future for a TBS DVB-S one to give all of the UK 'free' channels.

 

My setup uses several Raspi 2's and they stream the TV and recordings with no probs at all.


EDIT:  NM...found the conf file and copied it to the appropriate folder.

 

Tried to read most of this but didn't see what I was looking for...

 

How do you create the conf file for tv_grab_na_dd?  I was able to do it within the container but have no idea where it wrote the file.

 

John


I take that back.  I thought I had things working as it is pulling the channels but no data:

 

Jan 31 12:10:48 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: grab /usr/local/bin/tv_grab_na_dd
Jan 31 12:10:48 unRAID tvheadend[19]: spawn: Executing "/usr/local/bin/tv_grab_na_dd"
Jan 31 12:10:48 unRAID tvheadend[19]: spawn: using config filename /nonexistent/.xmltv/tv_grab_na_dd.conf
Jan 31 12:10:50 unRAID tvheadend[19]: htsp: Got connection from 192.168.1.60
Jan 31 12:10:50 unRAID tvheadend[19]: htsp: 192.168.1.60: Welcomed client software: Kodi Media Center (HTSPv18)
Jan 31 12:10:50 unRAID tvheadend[19]: htsp: 192.168.1.60 [ Kodi Media Center ]: Identified as user kodi
Jan 31 12:10:50 unRAID tvheadend[19]: htsp: 192.168.1.60 [ kodi | Kodi Media Center ]: Privileges raised
Jan 31 12:10:56 unRAID tvheadend[19]: spawn: Fetching from Schedules Direct Fetched 8116 k/bytes in 7 seconds
Jan 31 12:10:56 unRAID tvheadend[19]: spawn: loading data
Jan 31 12:11:18 unRAID tvheadend[19]: spawn: NOTE: Your subscription will expire: 2016-03-21T10:46:27Z
Jan 31 12:11:18 unRAID tvheadend[19]: spawn: Writing schedule
Jan 31 12:11:34 unRAID tvheadend[19]: spawn: Downloaded 16683 programs in 46 seconds
Jan 31 12:11:34 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: grab took 46 seconds
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: parse took 1 seconds
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: channels tot= 72 new= 0 mod= 0
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: brands tot= 0 new= 0 mod= 0
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: seasons tot= 0 new= 0 mod= 0
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: episodes tot= 0 new= 0 mod= 0
Jan 31 12:11:35 unRAID tvheadend[19]: /usr/local/bin/tv_grab_na_dd: broadcasts tot= 0 new= 0 mod= 0

 

Maybe I created the conf file incorrectly?  I exec'd into the container and ran "tv_grab_na_dd --configure".  The resulting file was written to /root/.xmltv (in the container).  I copied that file to /config/.xmltv.

 

Did I miss a step or misconfigure the conf file?

 

John


Thanks Saarg and Monkeyair for the quick responses.

 

I'll try disconnecting each of the USB devices in turn to see if it is connecting to both or just one. I have managed to use both simultaneously with a Windows app on the same machine previously, but was attracted to the unRAID / Docker setup. I have been using unRAID as a NAS for 5 years plus and have another machine dedicated for that. This is more of a pre-production set-up to get the PVR function stable enough to get SWMBO approval... and then build on functionality with a few other Dockers...

 

I did download the plugin MediaTreeCheck and it shows:

Media Tree: DETECTED

DVB Adapter: DETECTED

 

I have attached the Syslog which I hope is useful. I did look through it and saw at least one of the receivers being detected, but it's not clear to me if both are seen... A disadvantage of two identical tuners?

 

My ambition is to add a third USB tuner if possible so I can access all my local muxes simultaneously (No blocking of recordings of any channel across all DVB-T, not bothered about DVB-T2 just yet). I had considered the HDHomerun as it seems more robust than USB but they are not readily available in the UK and cost is always an issue...

 

Thanks again, I'll do some more testing of the H/W and update later today.

 

Best Regards

I only see one USB device recognised by Unraid. Have you tried to put them each on their own USB bus?

If that doesn't help, could you boot unRAID, wait for it to finish booting and rest a little, then insert one of the USB tuners and save the syslog, then insert the second and save the syslog again?

 


I take that back.  I thought I had things working as it is pulling the channels but no data: [...]

I think you have to map the EPG source to the channels before the data is loaded. Do that and restart the container. Be sure to close the GUI, as it doesn't load the EPG if it's open.


I take that back.  I thought I had things working as it is pulling the channels but no data: [...]

I think you have to map the EPG source to the channels before the data is loaded. Do that and restart the container. Be sure to close the GUI, as it doesn't load the EPG if it's open.

 

I found the issue.  In moving from WG+ to XMLTV to whatever, I ended up with 3 entries for every channel in the EPG source pick list.  I was just picking the wrong one.  :S

 

I just burned everything to the ground and started over.

 

John


OK...next question...

 

Does anyone else other than myself have multiple lineups in their Schedules Direct account?  I have 2 because 2 of my channels are not included in my regular SD lineup (19380) even though I can reach them with my antenna.  So I added a second lineup (08360) that has data for these channels (62.1 and 62.2).

 

Apparently, tv_grab_na_dd has multiple lineup support but I'm not exactly sure how it could be incorporated.  I found the info here:  http://manpages.ubuntu.com/manpages/lucid/man1/tv_grab_na_dd.1p.html#contenttoc9

 

Handling Multiple Lineups

      tv_grab_na_dd only outputs a single lineup. If your Schedules Direct
      account has multiple lineups, they are all downloaded even though only
      one is output.

      To process multiple lineups, use separate --config-file.  Separate
      config files are also handy if you need different channel sets for a
      lineup (common with MythTV). To prevent re-downloading the data on
      subsequent passes, the "--reprocess" option is recommended.

      Here's an example: (the = sign is optional, but helps readability)

        tv_grab_na_dd --config-file=lineup1.dat --output=lineup1.xml --dd-data=dd.xml
        tv_grab_na_dd --config-file=lineup2.dat --output=lineup2.xml --dd-data=dd.xml --reprocess
        tv_grab_na_dd --config-file=lineup3.dat --output=lineup3.xml --dd-data=dd.xml --reprocess

      Each config file specifies the desired lineup and channel list.

      If you want to merge the lineups into a single file, you can use tv_cat

        tv_cat lineup1.xml lineup2.xml lineup3.xml >guide.xml


I'm going to throw this out there and see where it goes.  :)

 

Many thanks to saarg for including the tv_grab_na_dd module.  It does exactly what it is designed to do and does it FAST (where WG+ is brutally slow...almost unusable).  The only issue is that it only works with SD and SD requires a paid subscription.  BOO!

 

Enter zap2xml (http://zap2xml.awardspace.info/).

 

I know a few of you are familiar with mc2xml.  zap2xml is basically the same thing except it scrapes data from the zap2it.com website (the same exact place where SD gets their data).  Sounds great, right?  Here is the issue:  The only files available for download are a Windows EXE and a Perl script.

 

Naturally, the Windows EXE does us absolutely no good here if your intention is to have a completely self-contained tvheadend system that can grab EPG data on its own.  So, we are left with the Perl script.  I had read a few articles where users were able to use it without issue on the tvheadend box, but this is not the case on saarg's container.  When I exec into the container and attempt to run the script, it yells at me about missing dependencies or Perl modules or something.

 

So, question to saarg:  Any interest in investing some time to see if we can get the perl script to work?

 

If not, no worries.  I am currently running the EXE version from my desktop twice a day via Task Scheduler.  I actually have it scrape data for 2 different zip codes (see my question in the post previous to this one) and then merge the 2 outputs together and dump it to \\unraid\Docker\tvheadend\data\guide.xml.  All of this is done about 15 minutes before the normal tvh cron job to import new EPG data (twice a day).
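For anyone wanting to script that merge step on a Linux host instead, here is a minimal sketch. The filenames and the destination share path are assumptions (adjust to your setup); tv_cat is the merge tool from the XMLTV package mentioned in the man-page excerpt above.

```shell
# Sketch only: filenames and destination path below are assumptions.
LINEUP1=lineup1.xml
LINEUP2=lineup2.xml
MERGED=guide.xml
DEST=/mnt/user/Docker/tvheadend/data   # hypothetical unRAID share path

# tv_cat (from the XMLTV tools) concatenates guide files into one XMLTV
# document; guarded so the sketch is a no-op where the tools aren't installed.
if command -v tv_cat >/dev/null 2>&1; then
  tv_cat "$LINEUP1" "$LINEUP2" > "$MERGED"
  cp "$MERGED" "$DEST/guide.xml"
fi
```

Schedule it with cron about 15 minutes before the tvheadend EPG import, just like the Task Scheduler job described above.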

 

While the above works, it sure would be nice to have a more elegant solution.  :)

 

John


I think I actually did it!  :)

 

I have 2 containers for zap2xml perl script:

 

One is simply called zap2xml and is for users who need to scrape only one zipcode on zap2it.com.

 

The other is called zap2xml-2lineups.  This is for users (like me) who need to scrape guide data for 2 different zipcodes to cover all of their channels.  This requires 2 unique accounts/configurations on zap2it.com.  The resulting XMLs that are generated are merged into a single one at the end (guide.xml).

 

Docker Repository Templates:  https://github.com/johnodon/unraid-templates

 

I tried to put as much instruction in the template as possible.  Be warned that I have not yet added the cron job to run the scrape every 12 or 24 hours.  In the meantime, just restart the zap2xml container every so often and your guide data will be refreshed.

 

Major props to Chris Pollard and saarg for helping me along!  With any luck saarg will be able to incorporate this directly into his TVH container.  :)

 

Please feel free to provide feedback or ask questions.

 

John


Ok, I don't normally ask this, but can someone here hold my hand on getting EPG working with this (specifically Schedules Direct)?

 

I'm coming from the MythTV Docker, which is all TOO easy to set up for S.D.; however, I would like to use some of the new PVR features with Kodi Jarvis.

 

I'm using the unstable version, have my SD listings setup already.

I initially setup WG+, and it works fine, dumps data into the guide.xml file.

However I can't get TVH unstable to read it.

Also, even though I have the OTA EPG disabled everywhere I can find the option, it polls my 5 tuners at startup for quite a while pulling down guide data!

 

Anyhow, it sounds like tv_grab_na_dd would work well, and others here use it.

So I pick through the info here and figure out how to attach to the docker.

I run the tv_grab_na_dd --configure and it doesn't seem to do anything.

So then I just type tv_grab_na_dd and I get the following....

2016-02-04 18:18:53.000 [   INFO] /usr/local/bin/tv_grab_na_dd: grab /usr/local/bin/tv_grab_na_dd
2016-02-04 18:18:53.000 [   INFO] spawn: Executing "/usr/local/bin/tv_grab_na_dd"
Feb  4 18:18:52 Server tvheadend[19]: /usr/local/bin/tv_grab_na_dd: grab /usr/local/bin/tv_grab_na_dd
Feb  4 18:18:52 Server tvheadend[19]: spawn: Executing "/usr/local/bin/tv_grab_na_dd"
2016-02-04 18:18:53.194 [  ERROR] spawn: using config filename /nonexistent/.xmltv/tv_grab_na_dd.conf
2016-02-04 18:18:53.194 [  ERROR] spawn: *ERROR* Username not specified. Please run --configure
Feb  4 18:18:53 Server tvheadend[19]: spawn: using config filename /nonexistent/.xmltv/tv_grab_na_dd.conf
Feb  4 18:18:53 Server tvheadend[19]: spawn: *ERROR* Username not specified. Please run --configure
2016-02-04 18:18:53.197 [  ERROR] /usr/local/bin/tv_grab_na_dd: no output detected
2016-02-04 18:18:53.197 [WARNING] /usr/local/bin/tv_grab_na_dd: grab returned no data
Feb  4 18:18:53 Server tvheadend[19]: /usr/local/bin/tv_grab_na_dd: no output detected
Feb  4 18:18:53 Server tvheadend[19]: /usr/local/bin/tv_grab_na_dd: grab returned no data

 

I think I had this all working in the stable version with WG+, however maybe it was just pulling it from OTA also, not sure.

 

I normally don't ask to be spoon fed, but god damn does this seem confusing for no good reason!

Thanks for any help.  :-\


Ok, I don't normally ask this, but can someone here hold my hand on getting EPG working with this (specifically Schedules Direct)?

 

 

You are in the same boat I was in.  You need to create the tv_grab_na_dd.conf file.  The way I did that was to docker exec into your TVH container and run:  tv_grab_na_dd --configure

 

It will walk you through by asking questions.  The resulting .conf file will be created in /config/.xmltv (symlinked to /nonexistent/.xmltv).  You then need to tell TVH to use the XMLTV: North America (Data Direct) internal grabber.  As long as the .conf file is in the right place, the grabber will use it.
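Put together, the sequence from the unRAID console looks roughly like this. The container name is an assumption (use your own name or ID from `docker ps`), and this is a sketch rather than a turnkey script: because `docker exec` enters the container as root, the conf file may land in /root/.xmltv rather than /config/.xmltv, hence the copy step.

```shell
CONTAINER=tvheadend   # assumption: your TVHeadend container name or ID

# Guarded so the sketch only runs where docker and the container exist.
if command -v docker >/dev/null 2>&1 && docker inspect "$CONTAINER" >/dev/null 2>&1; then
  # Interactive configure; as root, this writes /root/.xmltv/tv_grab_na_dd.conf
  # inside the container...
  docker exec -it "$CONTAINER" tv_grab_na_dd --configure

  # ...so copy it to /config/.xmltv, where the grabber actually looks:
  docker exec "$CONTAINER" cp /root/.xmltv/tv_grab_na_dd.conf /config/.xmltv/
fi
```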

 

However, as I experienced, TVH will not appear to pull any guide data in until you actually add the EPG source for each channel.

 

Or...you can use the zap2it container I just created and stop paying SD.  :)

 

John


You are in the same boat I was in.  You need to create the tv_grab_na_dd.conf file. [...]

 

Ok, still WTF!...

Thanks BTW..  ;)

 

I'm good now, I took a break, grabbed a beer, and read over your instruction (wouldn't quite say you "held my hand" but it was the extra push I needed).

 

So (for others who have never done this), I ran docker exec for my container ID (a72b17f5c851):

docker exec -i a72b17f5c851 tv_grab_na_dd --configure

 

Answered the questions, finished, back to bash prompt..

No file..  >:(

Looked again, ran it again.. No file..  ???

So, ran this

docker exec -i a72b17f5c851 cp /root/.xmltv/tv_grab_na_dd.conf /config

Now I have the tv_grab_na_dd.conf file in the root directory...

I should be able to figure out the rest.


I think I actually did it!  :)  I have 2 containers for zap2xml perl script: [...]

Good work  ;D


You are in the same boat I was in.  You need to create the tv_grab_na_dd.conf file. [...]

 

So (for others who have never done this), ran docker exec for my container ID [...]

docker exec -i a72b17f5c851 cp /root/.xmltv/tv_grab_na_dd.conf /config

Now I have the tv_grab_na_dd.conf file in the root directory... [...]

Unfortunately when you exec into the container you become root and not nobody, so the config file ends up in the wrong place.

I was supposed to write a guide on how to configure xmltv, but I guess it has slipped my mind  ::)


I think I actually did it!  :)  I have 2 containers for zap2xml perl script: [...]

 

UPDATE:  The same changes were pushed to the master zap2xml and a new image has been built.  Both zap2xml and zap2xml-2lineups have been fully updated and built with the changes below.

 

Some updates made to the zap2xml-2lineups container and unraid template:

 

- Added cron job (editable in /config/) to run @ 11:30 and 23:30 daily

- Miscellaneous files cleanup

- unRAID template cleanup
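For reference, an 11:30/23:30 daily schedule like that corresponds to a single crontab entry along these lines (the script path is a hypothetical placeholder; the container's actual cron file lives in /config/ as noted above):

```
# min  hour   dom mon dow  command
30     11,23  *   *   *    /config/zap2xml.sh
```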

 

Working on the master zap2xml now in dev and will update later.

 

Be sure you also refresh your unraid docker template repositories to get the updated templates.

 

John


Unfortunately when you exec into the container you become root and not nobody, so the config file ends up in the wrong place.

I was supposed to write a guide on how to configure xmltv, but I guess it has slipped my mind  ::)

 

No worries, but thank you for the confirmation.

I didn't actually need to edit that file at any time after, however I thought I did. That is, unless I did actually need it, and part of the reason it is working well now!  ;D

I copied it into the .xmltv directory just in case.

For the curious side of me, where is the "wrong place" you mention that it ends up at?

I assume this is in the base image, and not in the config directory?

 

 

Anyhow, I had never known anything about exec'ing into a Docker container, as I never had to do this before. When I first tried it, instead of exec, I was using attach.

This was not working as I needed, so I wasted plenty of time for what should have taken all of 5 minutes (once I knew what I was doing).

 

Since there are a LOT of options for EPG data, and many users in many different locations, it is difficult to write a guide that would help everyone.

 

 

Ok, still WTF!...

Thanks BTW..  ;)

 

 

Yeah...I was working totally from memory while lying in bed with my iPad.  :)

 

It was appreciated regardless, and I'm good now.

 

Also, as to the zap2xml.. I still have my paid subscription to SD until July, so I may take it for a run/switch to it at that point.


 

For the curious side of me, where is the "wrong place" you mention that it ends up at?

I assume this is in the base image, and not in the config directory?

 

When inside the container, it ends up in /root/.xmltv, which is not the folder that is symlinked to /config/.xmltv.

 

 

 

 

Also, as to the zap2xml.. I still have my paid subscription to SD until July, so I may take it for a run/switch to it at that point.

 

My sub to SD is done in March so this was good timing for me.  Just be warned that zap2it.com has a nasty habit of changing their data format.  Luckily, the author of zap2xml is pretty good about keeping up.  Worst case scenario, zap2xml also works with tvguide.com, but I have yet to test it.


When inside the container, it ends up in /root/.xmltv, which is not the folder that is symlinked to /config/.xmltv.

Makes sense, got it!

 

My sub to SD is done in March so this was good timing for me.  Just be warned that zap2it.com has a nasty habit of changing their data format.  Luckily, the author of zap2xml is pretty good about keeping up.  Worst case scenario, zap2xml also works with tvguide.com, but I have yet to test it.

Good to know, and I think in a couple of months I'll take a stab at this.

Thanks for the free alternative!


I think I actually did it!  :)  I have 2 containers for zap2xml perl script: [...]

It's Friday and I'm lazy ;D So... why do we need two different versions for more than one lineup?

