[Support] Linuxserver.io - Unifi-Controller


927 posts in this topic

My sincerest apologies if I'm overlooking an obvious bug report, but I upgraded to 6.8.0 RC1 and have been unable to access the management GUI ever since. I assume this is just a webserver issue, because the Docker container responds to pings just fine. Has anyone else had this issue?

Link to post
2 minutes ago, joecoolman said:

My sincerest apologies if I'm overlooking an obvious bug report, but I upgraded to 6.8.0 RC1 and have been unable to access the management GUI ever since. I assume this is just a webserver issue, because the Docker container responds to pings just fine. Has anyone else had this issue?

I'll answer so you know it is not a general issue. I am running 6.8.0-rc1 and just updated the UniFi container to the latest, which is version 5.11.50, and I have no issues accessing the WebGUI of this or any other Docker container. All are functioning normally.

Link to post
8 minutes ago, joecoolman said:

My sincerest apologies if I'm overlooking an obvious bug report, but I upgraded to 6.8.0 RC1 and have been unable to access the management GUI ever since. I assume this is just a webserver issue, because the Docker container responds to pings just fine. Has anyone else had this issue?

Are you running the Docker container with a custom IP? I had to change it to host to connect. There seems to be a problem with custom IPs on br0 in 6.8.

Link to post
4 minutes ago, joecoolman said:

Yeah, I'm running it on a separate br0. That's probably what it is. Thanks!

Although I am not running the UniFi docker with its own IP address, I am running several other docker containers with their own IP addresses on a VLAN (br0.3) without issue. Perhaps the problem is limited to br0, which I am not using.

Link to post

Hello,

 

I have used the Unifi Controller on my laptop until now and would like to migrate all of this to an Unraid Docker container.

 

But there are two points I don't understand in the Readme on Dockerhub:

 

DHCP and USG

 

Quote

Common problems

 

When using a Security Gateway (router) it could be that network-connected devices are unable to obtain an IP address. This can be fixed by setting "DHCP Gateway IP", under Settings > Networks > network_name, to a correct (and accessible) IP address.

 

I don't understand the actual problem, nor the proposed solution.

 

  • What is meant by "network connected devices"? Aren't all devices that try to get an IP from the DHCP Server "network connected devices"?
  • Is the solution to set the IP address of the USG in this field?

 

Device Adoption

 

Quote

Application Setup

[…]

 

For Unifi to adopt other devices, e.g. an Access Point, it is required to change the inform IP address. Because Unifi runs inside Docker, by default it uses an IP address not accessible by other devices. To change this go to Settings > Controller > Controller Settings and set the Controller Hostname/IP to an IP address accessible by other devices.

 

What is the IP address I should add here? Is it the IP address of the host, meaning the IP address of my Unraid server?

 

Corollary Question

 

Would I avoid all these problems by using a host network?
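To make my question concrete, here is what I think the two setups would look like; all addresses are hypothetical, assuming the Unraid server sits at 192.168.1.10 and an AP at 192.168.1.20:

```shell
# Option 1: host networking, so the controller answers on the
# Unraid server's own LAN address (no inform-IP override needed)
docker run -d --name=unifi-controller --net=host \
  -v /mnt/user/appdata/unifi-controller:/config \
  linuxserver/unifi-controller

# Option 2: keep bridge networking, but point each device at the
# controller by overriding its inform URL over SSH
ssh ubnt@192.168.1.20        # factory default credentials are ubnt/ubnt
set-inform http://192.168.1.10:8080/inform
```

If I read the Readme right, the "Controller Hostname/IP" field would then also be set to 192.168.1.10, i.e. the Unraid host's address.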

Edited by caillou
Link to post
  • 2 weeks later...

Somehow I cannot access the WebUI after applying the latest update. I have tried changing the network settings between Host and br0, neither of which worked.

Does anyone know if this is a known bug and whether or not it will be fixed? Currently I cannot access my configuration, which is really unfortunate.

Link to post
1 hour ago, DavyV97 said:

Somehow I cannot access the WebUI after applying the latest update. I have tried changing the network settings between Host and br0, neither of which worked.

Does anyone know if this is a known bug and whether or not it will be fixed? Currently I cannot access my configuration, which is really unfortunate.

Without any logs we can't help.

Post the container log and check in the appdata folder if there are any logs.
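From the Unraid terminal that would be something like this (container name and appdata path per the default template; adjust to your setup):

```shell
# container log (what the Unraid "Logs" button shows)
docker logs unifi-controller

# the controller's own logs live inside the /config mount
tail -n 50 /mnt/user/appdata/unifi-controller/logs/server.log
tail -n 50 /mnt/user/appdata/unifi-controller/logs/mongod.log
```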

Link to post

It would appear to be some conflict with Wireguard, at least for me... if I turn off Wireguard, I can access the Unifi GUI.

 

I have my Unifi Controller on a separate IP address, configured with br0 - somehow, even though I'm "at home" and connected locally and not coming through Wireguard, it seems to mess up access.

 

There's some discussion on the Wireguard thread, but it's beyond me.

Link to post
2 hours ago, bdillahu said:

I have my Unifi Controller on a separate IP address, configured with br0 - somehow, even though I'm "at home" and connected locally and not coming through Wireguard, it seems to mess up access.

Yeah, that should not matter locally.

 

It definitely will not work over a remote WG connection.  br0 IP addresses are not accessible remotely through WireGuard.  You would need a VLAN (br0.X) for containers with custom IP addresses to be available over a remote WG connection.

Link to post
On 10/17/2019 at 11:33 PM, joecoolman said:

My sincerest apologies if I'm overlooking an obvious bug report, but I upgraded to 6.8.0 RC1, and I've been unable to access the management gui ever since. I assume this is just a webserver issue because the docker responds to pings just fine. Anyone else had this issue?

 

On 10/29/2019 at 2:43 PM, DavyV97 said:

Somehow I cannot access the WebUI after applying the latest update. I have tried changing the network settings between Host and br0, neither of which worked.

Does anyone know if this is a known bug and whether or not it will be fixed? Currently I cannot access my configuration, which is really unfortunate.

I had the same issue. Lost access after the 6.8.0 RC1 update I did on Oct 12, and finally noticed it today. I had this error in my Unifi log file, and nothing else, for the past 18 days:

[2019-10-12 13:46:15,023] <db-server> ERROR system - [exec] error, rc=134

Container logs showed nothing interesting. I was using the "Bridge" network mode.

 

I ended up fixing it, but I'm not sure exactly what did it. I did a few things:

  1. Since the log was showing a db error, I did a MongoDB database repair with these instructions. This seemed to have no effect.
  2. Changed the network type from Bridge to Host; didn't work.
  3. Lots of container restarts and force updates.
  4. Following something I read here when investigating the big "sqlite corruption" issue a while back, I changed the config folder mapping from /mnt/user/appdata/unifi-controller to /mnt/cache/appdata/unifi-controller. I think this may have done it, but I don't want to break it again to check.
  5. I also changed from Host to br0, then back to Bridge. I don't think this did anything, but I did it at the same time as the previous step.
  6. I don't have Wireguard set up, so it's probably not that.
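For reference, the db repair in step 1 amounted to something like the following; paths assume the default template, and the controller should be stopped first:

```shell
# stop the controller, then run a throwaway mongo container
# against the unifi database files to repair them in place
docker stop unifi-controller
docker run --rm \
  -v /mnt/user/appdata/unifi-controller/data/db:/data/db \
  mongo:3.4 mongod --repair --dbpath /data/db
docker start unifi-controller
```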
Link to post
3 hours ago, JibbsIsMe said:

I ended up fixing it, but I'm not sure exactly what did it. I did a few things:

  1. Since the log was showing a db error, I did a MongoDB database repair with these instructions. This seemed to have no effect.
  2. Changed the network type from Bridge to Host; didn't work.
  3. Lots of container restarts and force updates.
  4. Following something I read here when investigating the big "sqlite corruption" issue a while back, I changed the config folder mapping from /mnt/user/appdata/unifi-controller to /mnt/cache/appdata/unifi-controller. I think this may have done it, but I don't want to break it again to check.
  5. I also changed from Host to br0, then back to Bridge. I don't think this did anything, but I did it at the same time as the previous step.
  6. I don't have Wireguard set up, so it's probably not that.

 

I tried changing appdata to cache - no change.

 

I was using br0 with a static IP... changed to Bridge mode and could get to it on http but not https... scratch that, I can get on https too, as long as I'm on Bridge.

Seems to still be something on the network side, I think.

Edited by bdillahu
Link to post
On 10/29/2019 at 8:56 PM, saarg said:

Without any logs we can't help.

Post the container log and check in the appdata folder if there are any logs.

The logs shown by UnRaid:

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 10-adduser: executing...

-------------------------------------
_ ()
| | ___ _ __
| | / __| | | / \
| | \__ \ | | | () |
|_| |___/ |_| \__/


Brought to you by linuxserver.io
We gratefully accept donations at:
https://www.linuxserver.io/donate/
-------------------------------------
GID/UID
-------------------------------------

User uid: 99
User gid: 100
-------------------------------------

[cont-init.d] 10-adduser: exited 0.
[cont-init.d] 20-config: executing...
[cont-init.d] 20-config: exited 0.
[cont-init.d] 30-keygen: executing...
[cont-init.d] 30-keygen: exited 0.
[cont-init.d] 99-custom-scripts: executing...
[custom-init] no custom files found exiting...
[cont-init.d] 99-custom-scripts: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.

The server.log shows the following:

[2019-10-29 19:46:12,172] <launcher> ERROR db     - Got error while connecting to db...
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=127.0.0.1:27117, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:377)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:104)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:90)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:85)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:836)
	at com.mongodb.Mongo$2.execute(Mongo.java:823)
	at com.mongodb.DB.executeCommand(DB.java:729)
	at com.mongodb.DB.command(DB.java:491)
	at com.mongodb.DB.command(DB.java:507)
	at com.mongodb.DB.command(DB.java:449)
	at com.ubnt.service.OoOO.W.OÒ0000(Unknown Source)
	at com.ubnt.service.OoOO.W.afterPropertiesSet(Unknown Source)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1758)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1695)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:573)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.dbService(<generated>)
	at com.ubnt.service.AppContext.statService(Unknown Source)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$statService$9(<generated>)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
	at com.ubnt.service.AppContext.houseKeeper(Unknown Source)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$houseKeeper$17(<generated>)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.houseKeeper(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:759)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:88)
	at com.ubnt.service.B.Oo0000(Unknown Source)
	at com.ubnt.service.B.Õ00000(Unknown Source)
	at com.ubnt.ace.Launcher.main(Unknown Source)
[2019-10-29T19:46:54,193] <localhost-startStop-1> INFO  system - ======================================================================
[2019-10-29T19:46:54,200] <localhost-startStop-1> INFO  system - UniFi 5.11.50 (build atag_5.11.50_12745 - release/release) is started
[2019-10-29T19:46:54,200] <localhost-startStop-1> INFO  system - ======================================================================
[2019-10-29T19:46:54,201] <localhost-startStop-1> INFO  system - BASE dir:/usr/lib/unifi
[2019-10-29T19:46:54,273] <localhost-startStop-1> INFO  system - Current System IP: 172.30.32.1
[2019-10-29T19:46:54,273] <localhost-startStop-1> INFO  system - Hostname: NAS
[2019-10-29T19:56:54,276] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T19:56:54,277] <db-server> WARN  db     - Unknown error, restarting mongo without logging to verify error
[2019-10-29T20:07:09,495] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:17:47,545] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:27:24,934] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:36:54,339] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:48:31,788] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:59:08,710] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T21:08:54,787] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T21:19:59,955] <db-server> WARN  db     - Mongo start up failed with rc=134

Apparently those Mongo start-up failures have been appearing since 28-09-2019, which might be when I updated. They are still occurring to this day.
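As an aside, rc=134 follows the usual shell convention of 128 + signal number, i.e. the mongod process is dying from signal 6 (SIGABRT), which matches the fassert() abort in mongod.log:

```python
import signal

rc = 134
# exit codes above 128 conventionally mean "killed by signal (rc - 128)"
sig = signal.Signals(rc - 128)
print(sig.name)  # SIGABRT
```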

The mongod.log shows this:

2019-10-29T19:56:53.112+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:112068][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14399 through 14414
2019-10-29T19:56:53.156+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:156585][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14400 through 14414
2019-10-29T19:56:53.189+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:189946][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14401 through 14414
2019-10-29T19:56:53.223+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:223168][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14402 through 14414
2019-10-29T19:56:53.278+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:278849][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14403 through 14414
2019-10-29T19:56:53.312+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:312130][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14404 through 14414
2019-10-29T19:56:53.345+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:345406][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14405 through 14414
2019-10-29T19:56:53.378+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:378795][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14406 through 14414
2019-10-29T19:56:53.412+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:412067][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14407 through 14414
2019-10-29T19:56:53.456+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:456668][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14408 through 14414
2019-10-29T19:56:53.490+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:489995][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14409 through 14414
2019-10-29T19:56:53.530+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:530956][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14410 through 14414
2019-10-29T19:56:53.686+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:686403][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14411 through 14414
2019-10-29T19:56:53.853+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:853061][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14412 through 14414
2019-10-29T19:56:54.008+0100 I STORAGE  [initandlisten] WiredTiger message [1572375414:8626][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14413 through 14414
2019-10-29T19:56:54.167+0100 I STORAGE  [initandlisten] WiredTiger message [1572375414:167723][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14414 through 14414
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (0) [1572375414:247549][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: read checksum error for 4096B block at offset 4440064: block header checksum of 1413291893 doesn't match expected checksum of 1740413080
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (0) [1572375414:247658][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: collection-179--1479618200839713417.wt: encountered an illegal file format or internal value
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (-31804) [1572375414:247719][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: the process must exit and restart: WT_PANIC: WiredTiger library panic
2019-10-29T19:56:54.247+0100 I -        [initandlisten] Fatal Assertion 28558 at src/mongo/db/storage/wiredtiger/wiredtiger_util.cpp 365
2019-10-29T19:56:54.247+0100 I -        [initandlisten] 

***aborting after fassert() failure


2019-10-29T19:56:54.272+0100 F -        [initandlisten] Got signal: 6 (Aborted).

 0x564eb09cbad1 0x564eb09cace9 0x564eb09cb1cd 0x14700843b390 0x147008095428 0x14700809702a 0x564eafc5b5a7 0x564eb06cfff6 0x564eafc65c24 0x564eafc65e49 0x564eafc660ab 0x564eb12d865f 0x564eb12d49ca 0x564eb12d5b93 0x564eb12f5f81 0x564eb139457a 0x564eb13992f2 0x564eb130c9f5 0x564eb13c50ef 0x564eb13c6507 0x564eb13c7559 0x564eb13b4291 0x564eb13cc41a 0x564eb1330da7 0x564eb132921b 0x564eb06b441f 0x564eb06acb12 0x564eb059f750 0x564eafc46463 0x564eafc67496 0x147008080830 0x564eafcc7879
----- BEGIN BACKTRACE -----
{"backtrace":[{"b":"564EAF418000","o":"15B3AD1","s":"_ZN5mongo15printStackTraceERSo"},{"b":"564EAF418000","o":"15B2CE9"},{"b":"564EAF418000","o":"15B31CD"},{"b":"14700842A000","o":"11390"},{"b":"147008060000","o":"35428","s":"gsignal"},{"b":"147008060000","o":"3702A","s":"abort"},{"b":"564EAF418000","o":"8435A7","s":"_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj"},{"b":"564EAF418000","o":"12B7FF6"},{"b":"564EAF418000","o":"84DC24","s":"__wt_eventv"},{"b":"564EAF418000","o":"84DE49","s":"__wt_err"},{"b":"564EAF418000","o":"84E0AB","s":"__wt_panic"},{"b":"564EAF418000","o":"1EC065F","s":"__wt_block_extlist_read"},{"b":"564EAF418000","o":"1EBC9CA"},{"b":"564EAF418000","o":"1EBDB93","s":"__wt_block_checkpoint"},{"b":"564EAF418000","o":"1EDDF81","s":"__wt_bt_write"},{"b":"564EAF418000","o":"1F7C57A"},{"b":"564EAF418000","o":"1F812F2","s":"__wt_reconcile"},{"b":"564EAF418000","o":"1EF49F5","s":"__wt_cache_op"},{"b":"564EAF418000","o":"1FAD0EF"},{"b":"564EAF418000","o":"1FAE507"},{"b":"564EAF418000","o":"1FAF559","s":"__wt_txn_checkpoint"},{"b":"564EAF418000","o":"1F9C291"},{"b":"564EAF418000","o":"1FB441A","s":"__wt_txn_recover"},{"b":"564EAF418000","o":"1F18DA7","s":"__wt_connection_workers"},{"b":"564EAF418000","o":"1F1121B","s":"wiredtiger_open"},{"b":"564EAF418000","o":"129C41F","s":"_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb"},{"b":"564EAF418000","o":"1294B12"},{"b":"564EAF418000","o":"1187750","s":"_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv"},{"b":"564EAF418000","o":"82E463"},{"b":"564EAF418000","o":"84F496","s":"main"},{"b":"147008060000","o":"20830","s":"__libc_start_main"},{"b":"564EAF418000","o":"8AF879","s":"_start"}],"processInfo":{ "mongodbVersion" : "3.4.23", "gitVersion" : "324017ede1dbb1c9554dd2dceb15f8da3c59d0e8", "compiledModules" : [], "uname" : { "sysname" : "Linux", "release" : "4.19.56-Unraid", "version" : "#1 SMP Tue Jun 25 10:19:34 PDT 2019", 
"machine" : "x86_64" }, "somap" : [ { "b" : "564EAF418000", "elfType" : 3, "buildId" : "91B53A60D2F6A2BE28D415B74844C8722A21A4FB" }, { "b" : "7FFF7D9FB000", "elfType" : 3, "buildId" : "66C7D3E7CFA7FD6793FA4CD5E237FFD24E2F88F8" }, { "b" : "1470093B7000", "path" : "/lib/x86_64-linux-gnu/libssl.so.1.0.0", "elfType" : 3, "buildId" : "FF69EA60EBE05F2DD689D2B26FC85A73E5FBC3A0" }, { "b" : "147008F72000", "path" : "/lib/x86_64-linux-gnu/libcrypto.so.1.0.0", "elfType" : 3, "buildId" : "15FFEB43278726B025F020862BF51302822A40EC" }, { "b" : "147008D6A000", "path" : "/lib/x86_64-linux-gnu/librt.so.1", "elfType" : 3, "buildId" : "69143E8B39040C964D3958490535322675F15DD3" }, { "b" : "147008B66000", "path" : "/lib/x86_64-linux-gnu/libdl.so.2", "elfType" : 3, "buildId" : "37BFC3D8F7E3B022DAC7943B1A5FACD40CEBF0AD" }, { "b" : "14700885D000", "path" : "/lib/x86_64-linux-gnu/libm.so.6", "elfType" : 3, "buildId" : "BAD67A84E56E73D031AE507261DA066B35949D34" }, { "b" : "147008647000", "path" : "/lib/x86_64-linux-gnu/libgcc_s.so.1", "elfType" : 3, "buildId" : "68220AE2C65D65C1B6AAA12FA6765A6EC2F5F434" }, { "b" : "14700842A000", "path" : "/lib/x86_64-linux-gnu/libpthread.so.0", "elfType" : 3, "buildId" : "B17C21299099640A6D863E423D99265824E7BB16" }, { "b" : "147008060000", "path" : "/lib/x86_64-linux-gnu/libc.so.6", "elfType" : 3, "buildId" : "1CA54A6E0D76188105B12E49FE6B8019BF08803A" }, { "b" : "147009620000", "path" : "/lib64/ld-linux-x86-64.so.2", "elfType" : 3, "buildId" : "C0ADBAD6F9A33944F2B3567C078EC472A1DAE98E" } ] }}
 mongod(_ZN5mongo15printStackTraceERSo+0x41) [0x564eb09cbad1]
 mongod(+0x15B2CE9) [0x564eb09cace9]
 mongod(+0x15B31CD) [0x564eb09cb1cd]
 libpthread.so.0(+0x11390) [0x14700843b390]
 libc.so.6(gsignal+0x38) [0x147008095428]
 libc.so.6(abort+0x16A) [0x14700809702a]
 mongod(_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj+0x0) [0x564eafc5b5a7]
 mongod(+0x12B7FF6) [0x564eb06cfff6]
 mongod(__wt_eventv+0x3D7) [0x564eafc65c24]
 mongod(__wt_err+0x9D) [0x564eafc65e49]
 mongod(__wt_panic+0x2E) [0x564eafc660ab]
 mongod(__wt_block_extlist_read+0x8F) [0x564eb12d865f]
 mongod(+0x1EBC9CA) [0x564eb12d49ca]
 mongod(__wt_block_checkpoint+0x673) [0x564eb12d5b93]
 mongod(__wt_bt_write+0x4F1) [0x564eb12f5f81]
 mongod(+0x1F7C57A) [0x564eb139457a]
 mongod(__wt_reconcile+0x1272) [0x564eb13992f2]
 mongod(__wt_cache_op+0x875) [0x564eb130c9f5]
 mongod(+0x1FAD0EF) [0x564eb13c50ef]
 mongod(+0x1FAE507) [0x564eb13c6507]
 mongod(__wt_txn_checkpoint+0xD9) [0x564eb13c7559]
 mongod(+0x1F9C291) [0x564eb13b4291]
 mongod(__wt_txn_recover+0x5FA) [0x564eb13cc41a]
 mongod(__wt_connection_workers+0x37) [0x564eb1330da7]
 mongod(wiredtiger_open+0x197B) [0x564eb132921b]
 mongod(_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb+0x70F) [0x564eb06b441f]
 mongod(+0x1294B12) [0x564eb06acb12]
 mongod(_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv+0x6B0) [0x564eb059f750]
 mongod(+0x82E463) [0x564eafc46463]
 mongod(main+0x966) [0x564eafc67496]
 libc.so.6(__libc_start_main+0xF0) [0x147008080830]
 mongod(_start+0x29) [0x564eafcc7879]
-----  END BACKTRACE  -----

I am using the :latest version of the Unifi Controller on Unraid 6.7.2 and tried connecting with Chrome, Firefox, and even Edge, using both Host and br0 configurations. The Host configuration has always worked for me.
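Given the WiredTiger checksum errors above, the database files themselves appear corrupted, so beyond a repair it may come down to restoring a backup. The controller keeps autobackups next to the db (path assumes the default template):

```shell
# list the controller's automatic backups, newest first
ls -lt /mnt/user/appdata/unifi-controller/data/backup/autobackup/
# a recent .unf file can then be restored through the setup wizard
# after moving the corrupted data/db directory aside
```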

Link to post
4 hours ago, DavyV97 said:

	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.dbService(<generated>)
	at com.ubnt.service.AppContext.statService(Unknown Source)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$statService$9(<generated>)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
	at com.ubnt.service.AppContext.houseKeeper(Unknown Source)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$houseKeeper$17(<generated>)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
	at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
	at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.houseKeeper(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:759)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:88)
	at com.ubnt.service.B.Oo0000(Unknown Source)
	at com.ubnt.service.B.Õ00000(Unknown Source)
	at com.ubnt.ace.Launcher.main(Unknown Source)
[2019-10-29T19:46:54,193] <localhost-startStop-1> INFO  system - ======================================================================
[2019-10-29T19:46:54,200] <localhost-startStop-1> INFO  system - UniFi 5.11.50 (build atag_5.11.50_12745 - release/release) is started
[2019-10-29T19:46:54,200] <localhost-startStop-1> INFO  system - ======================================================================
[2019-10-29T19:46:54,201] <localhost-startStop-1> INFO  system - BASE dir:/usr/lib/unifi
[2019-10-29T19:46:54,273] <localhost-startStop-1> INFO  system - Current System IP: 172.30.32.1
[2019-10-29T19:46:54,273] <localhost-startStop-1> INFO  system - Hostname: NAS
[2019-10-29T19:56:54,276] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T19:56:54,277] <db-server> WARN  db     - Unknown error, restarting mongo without logging to verify error
[2019-10-29T20:07:09,495] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:17:47,545] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:27:24,934] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:36:54,339] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:48:31,788] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T20:59:08,710] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T21:08:54,787] <db-server> WARN  db     - Mongo start up failed with rc=134
[2019-10-29T21:19:59,955] <db-server> WARN  db     - Mongo start up failed with rc=134

Apparently these Mongo start-up failures have been appearing since 28-09-2019, which might be when I updated. They are still occurring to this day.

The mongod.log shows this:


2019-10-29T19:56:53.112+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:112068][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14399 through 14414
2019-10-29T19:56:53.156+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:156585][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14400 through 14414
2019-10-29T19:56:53.189+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:189946][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14401 through 14414
2019-10-29T19:56:53.223+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:223168][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14402 through 14414
2019-10-29T19:56:53.278+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:278849][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14403 through 14414
2019-10-29T19:56:53.312+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:312130][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14404 through 14414
2019-10-29T19:56:53.345+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:345406][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14405 through 14414
2019-10-29T19:56:53.378+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:378795][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14406 through 14414
2019-10-29T19:56:53.412+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:412067][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14407 through 14414
2019-10-29T19:56:53.456+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:456668][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14408 through 14414
2019-10-29T19:56:53.490+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:489995][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14409 through 14414
2019-10-29T19:56:53.530+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:530956][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14410 through 14414
2019-10-29T19:56:53.686+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:686403][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14411 through 14414
2019-10-29T19:56:53.853+0100 I STORAGE  [initandlisten] WiredTiger message [1572375413:853061][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14412 through 14414
2019-10-29T19:56:54.008+0100 I STORAGE  [initandlisten] WiredTiger message [1572375414:8626][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14413 through 14414
2019-10-29T19:56:54.167+0100 I STORAGE  [initandlisten] WiredTiger message [1572375414:167723][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14414 through 14414
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (0) [1572375414:247549][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: read checksum error for 4096B block at offset 4440064: block header checksum of 1413291893 doesn't match expected checksum of 1740413080
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (0) [1572375414:247658][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: collection-179--1479618200839713417.wt: encountered an illegal file format or internal value
2019-10-29T19:56:54.247+0100 E STORAGE  [initandlisten] WiredTiger error (-31804) [1572375414:247719][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: the process must exit and restart: WT_PANIC: WiredTiger library panic
2019-10-29T19:56:54.247+0100 I -        [initandlisten] Fatal Assertion 28558 at src/mongo/db/storage/wiredtiger/wiredtiger_util.cpp 365
2019-10-29T19:56:54.247+0100 I -        [initandlisten] 

***aborting after fassert() failure


2019-10-29T19:56:54.272+0100 F -        [initandlisten] Got signal: 6 (Aborted).

 0x564eb09cbad1 0x564eb09cace9 0x564eb09cb1cd 0x14700843b390 0x147008095428 0x14700809702a 0x564eafc5b5a7 0x564eb06cfff6 0x564eafc65c24 0x564eafc65e49 0x564eafc660ab 0x564eb12d865f 0x564eb12d49ca 0x564eb12d5b93 0x564eb12f5f81 0x564eb139457a 0x564eb13992f2 0x564eb130c9f5 0x564eb13c50ef 0x564eb13c6507 0x564eb13c7559 0x564eb13b4291 0x564eb13cc41a 0x564eb1330da7 0x564eb132921b 0x564eb06b441f 0x564eb06acb12 0x564eb059f750 0x564eafc46463 0x564eafc67496 0x147008080830 0x564eafcc7879
----- BEGIN BACKTRACE -----
{"backtrace":[{"b":"564EAF418000","o":"15B3AD1","s":"_ZN5mongo15printStackTraceERSo"},{"b":"564EAF418000","o":"15B2CE9"},{"b":"564EAF418000","o":"15B31CD"},{"b":"14700842A000","o":"11390"},{"b":"147008060000","o":"35428","s":"gsignal"},{"b":"147008060000","o":"3702A","s":"abort"},{"b":"564EAF418000","o":"8435A7","s":"_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj"},{"b":"564EAF418000","o":"12B7FF6"},{"b":"564EAF418000","o":"84DC24","s":"__wt_eventv"},{"b":"564EAF418000","o":"84DE49","s":"__wt_err"},{"b":"564EAF418000","o":"84E0AB","s":"__wt_panic"},{"b":"564EAF418000","o":"1EC065F","s":"__wt_block_extlist_read"},{"b":"564EAF418000","o":"1EBC9CA"},{"b":"564EAF418000","o":"1EBDB93","s":"__wt_block_checkpoint"},{"b":"564EAF418000","o":"1EDDF81","s":"__wt_bt_write"},{"b":"564EAF418000","o":"1F7C57A"},{"b":"564EAF418000","o":"1F812F2","s":"__wt_reconcile"},{"b":"564EAF418000","o":"1EF49F5","s":"__wt_cache_op"},{"b":"564EAF418000","o":"1FAD0EF"},{"b":"564EAF418000","o":"1FAE507"},{"b":"564EAF418000","o":"1FAF559","s":"__wt_txn_checkpoint"},{"b":"564EAF418000","o":"1F9C291"},{"b":"564EAF418000","o":"1FB441A","s":"__wt_txn_recover"},{"b":"564EAF418000","o":"1F18DA7","s":"__wt_connection_workers"},{"b":"564EAF418000","o":"1F1121B","s":"wiredtiger_open"},{"b":"564EAF418000","o":"129C41F","s":"_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb"},{"b":"564EAF418000","o":"1294B12"},{"b":"564EAF418000","o":"1187750","s":"_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv"},{"b":"564EAF418000","o":"82E463"},{"b":"564EAF418000","o":"84F496","s":"main"},{"b":"147008060000","o":"20830","s":"__libc_start_main"},{"b":"564EAF418000","o":"8AF879","s":"_start"}],"processInfo":{ "mongodbVersion" : "3.4.23", "gitVersion" : "324017ede1dbb1c9554dd2dceb15f8da3c59d0e8", "compiledModules" : [], "uname" : { "sysname" : "Linux", "release" : "4.19.56-Unraid", "version" : "#1 SMP Tue Jun 25 10:19:34 PDT 2019", 
"machine" : "x86_64" }, "somap" : [ { "b" : "564EAF418000", "elfType" : 3, "buildId" : "91B53A60D2F6A2BE28D415B74844C8722A21A4FB" }, { "b" : "7FFF7D9FB000", "elfType" : 3, "buildId" : "66C7D3E7CFA7FD6793FA4CD5E237FFD24E2F88F8" }, { "b" : "1470093B7000", "path" : "/lib/x86_64-linux-gnu/libssl.so.1.0.0", "elfType" : 3, "buildId" : "FF69EA60EBE05F2DD689D2B26FC85A73E5FBC3A0" }, { "b" : "147008F72000", "path" : "/lib/x86_64-linux-gnu/libcrypto.so.1.0.0", "elfType" : 3, "buildId" : "15FFEB43278726B025F020862BF51302822A40EC" }, { "b" : "147008D6A000", "path" : "/lib/x86_64-linux-gnu/librt.so.1", "elfType" : 3, "buildId" : "69143E8B39040C964D3958490535322675F15DD3" }, { "b" : "147008B66000", "path" : "/lib/x86_64-linux-gnu/libdl.so.2", "elfType" : 3, "buildId" : "37BFC3D8F7E3B022DAC7943B1A5FACD40CEBF0AD" }, { "b" : "14700885D000", "path" : "/lib/x86_64-linux-gnu/libm.so.6", "elfType" : 3, "buildId" : "BAD67A84E56E73D031AE507261DA066B35949D34" }, { "b" : "147008647000", "path" : "/lib/x86_64-linux-gnu/libgcc_s.so.1", "elfType" : 3, "buildId" : "68220AE2C65D65C1B6AAA12FA6765A6EC2F5F434" }, { "b" : "14700842A000", "path" : "/lib/x86_64-linux-gnu/libpthread.so.0", "elfType" : 3, "buildId" : "B17C21299099640A6D863E423D99265824E7BB16" }, { "b" : "147008060000", "path" : "/lib/x86_64-linux-gnu/libc.so.6", "elfType" : 3, "buildId" : "1CA54A6E0D76188105B12E49FE6B8019BF08803A" }, { "b" : "147009620000", "path" : "/lib64/ld-linux-x86-64.so.2", "elfType" : 3, "buildId" : "C0ADBAD6F9A33944F2B3567C078EC472A1DAE98E" } ] }}
 mongod(_ZN5mongo15printStackTraceERSo+0x41) [0x564eb09cbad1]
 mongod(+0x15B2CE9) [0x564eb09cace9]
 mongod(+0x15B31CD) [0x564eb09cb1cd]
 libpthread.so.0(+0x11390) [0x14700843b390]
 libc.so.6(gsignal+0x38) [0x147008095428]
 libc.so.6(abort+0x16A) [0x14700809702a]
 mongod(_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj+0x0) [0x564eafc5b5a7]
 mongod(+0x12B7FF6) [0x564eb06cfff6]
 mongod(__wt_eventv+0x3D7) [0x564eafc65c24]
 mongod(__wt_err+0x9D) [0x564eafc65e49]
 mongod(__wt_panic+0x2E) [0x564eafc660ab]
 mongod(__wt_block_extlist_read+0x8F) [0x564eb12d865f]
 mongod(+0x1EBC9CA) [0x564eb12d49ca]
 mongod(__wt_block_checkpoint+0x673) [0x564eb12d5b93]
 mongod(__wt_bt_write+0x4F1) [0x564eb12f5f81]
 mongod(+0x1F7C57A) [0x564eb139457a]
 mongod(__wt_reconcile+0x1272) [0x564eb13992f2]
 mongod(__wt_cache_op+0x875) [0x564eb130c9f5]
 mongod(+0x1FAD0EF) [0x564eb13c50ef]
 mongod(+0x1FAE507) [0x564eb13c6507]
 mongod(__wt_txn_checkpoint+0xD9) [0x564eb13c7559]
 mongod(+0x1F9C291) [0x564eb13b4291]
 mongod(__wt_txn_recover+0x5FA) [0x564eb13cc41a]
 mongod(__wt_connection_workers+0x37) [0x564eb1330da7]
 mongod(wiredtiger_open+0x197B) [0x564eb132921b]
 mongod(_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb+0x70F) [0x564eb06b441f]
 mongod(+0x1294B12) [0x564eb06acb12]
 mongod(_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv+0x6B0) [0x564eb059f750]
 mongod(+0x82E463) [0x564eafc46463]
 mongod(main+0x966) [0x564eafc67496]
 libc.so.6(__libc_start_main+0xF0) [0x147008080830]
 mongod(_start+0x29) [0x564eafcc7879]
-----  END BACKTRACE  -----

I am using the :latest version of the Unifi Controller on UnRaid 6.7.2 and tried connecting with Chrome, Firefox, and even Edge, using both Host and br0 configurations. Host configuration has always worked for me.

Could be a corrupted database? Restore a backup of your appdata and see what happens.
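
If you don't have a usable backup, you can sometimes salvage the database with MongoDB's built-in repair. Something like the session below, with the unifi container stopped first; the appdata path is an example from a typical Unraid setup, adjust it to yours:

```
$ # Stop the unifi container, then run a matching mongod against the db files.
$ # Host path below is an assumption - point it at your appdata's data/db folder.
$ docker run --rm \
    -v /mnt/user/appdata/unifi-controller/data/db:/data/db \
    mongo:3.4 mongod --repair
```

The mongo:3.4 tag matches the mongod 3.4.23 shown in the backtrace; a newer mongod may refuse to open the older WiredTiger files. No guarantees with a corrupted checkpoint like this one, so a backup restore is still the first choice.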

Link to post

For those with adoption problems, this seems to have done the trick.

 

It'd been mentioned here, but without a walkthrough. For some reason, the GUI approach didn't work for me.

 

TL;DR:

1. SSH into the AP.

2. Enter the UniFi CLI:

mca-cli

3. Issue the set-inform command with the IP address of your Unifi controller (in our case, the IP of your UnRaid box; don't forget to change to the correct port if you're not allocating 8080 to the controller):

set-inform http://192.168.3.2:8080/inform
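
Put together, the whole session looks something like this (the addresses are the examples from above, and the prompts are illustrative; the SSH login is whatever you set for the device, or ubnt/ubnt on a factory-fresh AP):

```
$ ssh ubnt@192.168.3.20        # the AP's own IP - an example
AP# mca-cli
AP-CLI> set-inform http://192.168.3.2:8080/inform
```

After the set-inform, the AP should show up as pending adoption in the controller.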

Link to post
On 10/31/2019 at 2:09 PM, j0nnymoe said:

Could be a corrupted database? restore a backup of your appdata and see what happens.

Seems that this solved the issue. I had already tried removing and reinstalling the docker; however, that does not delete the appdata.

Removing the docker, renaming the appdata folder, reinstalling the docker, and importing a backup solved the issue.

Thanks!

Link to post

I'm trying to run the image in k8s but can't get it to work with persistent storage. As long as I don't define a persistent volume, it works as intended. When I define a hostPath volume for the /config directory, the controller process refuses all connections. The pod creates three directories in the mounted host volume: data, logs, and run. The data directory contains a binary file named keystore; the others are empty.

 

Inside the pod there's a java process running as user abc (911:911) with the command: java -Xmx1024M -jar /usr/lib/unifi/lib/ace.jar

 

The unifi service is not running. No sign of any mongodb process or service.

 

I'm not sure how to troubleshoot this. Any pointers are appreciated. It's obviously related to the storage.
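
For reference, here's a minimal sketch of the kind of spec I mean; the names and paths are placeholders, not my exact config:

```yaml
# Hypothetical Deployment fragment for the linuxserver/unifi-controller image
# with /config on a hostPath volume. Illustrative only, not a verified config.
spec:
  containers:
    - name: unifi
      image: linuxserver/unifi-controller
      env:
        - name: PUID
          value: "911"   # uid of the abc user the process runs as
        - name: PGID
          value: "911"
      volumeMounts:
        - name: config
          mountPath: /config
  volumes:
    - name: config
      hostPath:
        path: /srv/unifi-config   # example host directory
        type: DirectoryOrCreate
```

One thing worth checking with hostPath volumes is ownership: if the host directory isn't writable by the uid the container runs as (abc, 911 here), mongod can't create its files, which would fit the missing mongodb process described above.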

Link to post
8 hours ago, zygmunt said:

I'm trying to run the image in k8s but can't get it to work with permanent storage. As long as I don't define a persistent volume it works as intended. When I define a hostpath volume for the /config directory the controller process refuses all connections. The pod creates three directories in the mounted hostvolume: data, logs and run. The data directory contains a binary file named keystore, the others are empty.

 

Inside the pod there's a java process running using the user abc 911:911 and command, java -Xmx1024M -jar /usr/lib/unifi/lib/ace.jar

 

The unifi service is not running. No sign of any mongodb process or service.

 

I'm not sure how to troubleshoot this. Any pointers are appreciated. It's obviously related to the storage.

This thread is for users who need support on Unraid.

Please use our forum or Discord for support.

Link to post

I just installed :latest today on 6.8.0-rc5 (WireGuard installed but not configured or running) and am having problems similar to the above: no Web UI, and my Unifi Android app can't see the controller.  I have tried both Bridge and br0 with a dedicated IP address.  Same issues with :LTS.

 

I tried netcat on the exposed ports.  There's something listening on 8080 and 8880, but it doesn't respond to HTTP.  Connection is refused on 8081, 8443, 8843, and 10001.

 

The log has a couple of errors in it.  The first appears to be logging-related, but the second is an I/O error that might be more relevant.

 

Quote

2019-11-11 01:52:33,270 db-server ERROR Recovering from StringBuilderEncoder.encode('[2019-11-10T20:52:33,269] <db-server> WARN db - Unknown error, restarting mongo without logging to verify error


') error: org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to stream logs/server.log org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to stream logs/server.log


at org.apache.logging.log4j.core.appender.OutputStreamManager.writeToDestination(OutputStreamManager.java:263)
at org.apache.logging.log4j.core.appender.FileManager.writeToDestination(FileManager.java:261)
at org.apache.logging.log4j.core.appender.rolling.RollingFileManager.writeToDestination(RollingFileManager.java:219)
at org.apache.logging.log4j.core.appender.OutputStreamManager.flushBuffer(OutputStreamManager.java:293)

[...]

at com.ubnt.service.C.voidsuper.Ôo0000(Unknown Source)
at com.ubnt.service.C.voidsuper.o00000(Unknown Source)
at com.ubnt.service.C.voidsuper$1.run(Unknown Source)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Input/output error


at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:326)
at org.apache.logging.log4j.core.appender.OutputStreamManager.writeToDestination(OutputStreamManager.java:261)
... 39 more

 

Edited by CJW
Link to post
2 hours ago, CJW said:

I just installed :latest today on 6.8.0-rc5 (WireGuard installed but not configured or running) and am having problems similar to the above: no Web UI, and my Unifi Android app can't see the controller.  I have tried both Bridge and br0 with a dedicated IP address.  Same issues with :LTS.

 

I tried netcat on the exposed ports.  There's something listening on 8080 and 8880, but it doesn't respond to HTTP.  Connection is refused on 8081, 8443, 8843, and 10001.

 

The log has a couple of errors in it.  The first appears to be logging-related, but the second is an I/O error that might be more relevant.

 

 

Looks like you have issues with a drive, or the database is corrupt. Try restoring a backup of the appdata.

Link to post

Hi everyone,

 

I decided to move the Unifi Controller from my Mac to my existing Unraid server and initially had issues connecting to my two APs.  I figured out it was a port 8080 issue, I think, and changed the port to 8089 in the template.  Setup went smoothly and everything worked perfectly for a week until last night, when one of my two APs started looping between adopted/disconnected.  The odd thing was that the AP was still connected and wifi was working; it just wasn't connecting to the Controller.  I talked with tech support and we couldn't ssh into either AP, and support felt it was a port issue as well.  So I migrated back to the Mac Controller and everything worked perfectly.  Then, later today, I decided to try a different port (8083) and had a successful migration back to the Unraid docker, and all is running well again for now.

 

So, just wondering if anyone has experienced this kind of issue, or has any idea what I'm doing wrong?  Why would the initial setup work, but then lose the connection to the APs days later?

 

Thanks, Matt

Edited by RodWorks
Link to post
