Everything posted by DavyV97

  1. It seems that this solved the issue. I had already tried removing and reinstalling the docker, but that does not delete the appdata. Removing the docker, renaming the folder, reinstalling the docker and importing a backup solved the issue. Thanks!
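     For reference, the rough command-line equivalent of what I did would be something like this (just a sketch; the container and folder names assume the linuxserver unifi-controller defaults, so adjust them to your setup):

     docker stop unifi-controller
     docker rm unifi-controller
     # keep the old appdata instead of deleting it, in case the backup restore fails
     mv /mnt/user/appdata/unifi-controller /mnt/user/appdata/unifi-controller.bak
     # reinstall the container from Community Applications, then restore the
     # .unf backup through the controller's setup wizard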
  2. The logs shown by UnRaid:

     [s6-init] making user provided files available at /var/run/s6/etc...exited 0.
     [s6-init] ensuring user provided files have correct perms...exited 0.
     [fix-attrs.d] applying ownership & permissions fixes...
     [fix-attrs.d] done.
     [cont-init.d] executing container initialization scripts...
     [cont-init.d] 10-adduser: executing...
     -------------------------------------
               _         ()
              | |  ___   _    __
              | | / __| | |  /  \
              | | \__ \ | | | () |
              |_| |___/ |_|  \__/

     Brought to you by linuxserver.io
     We gratefully accept donations at:
     https://www.linuxserver.io/donate/
     -------------------------------------
     GID/UID
     -------------------------------------
     User uid:    99
     User gid:    100
     -------------------------------------
     [cont-init.d] 10-adduser: exited 0.
     [cont-init.d] 20-config: executing...
     [cont-init.d] 20-config: exited 0.
     [cont-init.d] 30-keygen: executing...
     [cont-init.d] 30-keygen: exited 0.
     [cont-init.d] 99-custom-scripts: executing...
     [custom-init] no custom files found exiting...
     [cont-init.d] 99-custom-scripts: exited 0.
     [cont-init.d] done.
     [services.d] starting services
     [services.d] done.

     The server.log shows the following:

     [2019-10-29 19:46:12,172] <launcher> ERROR db - Got error while connecting to db...
     com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=127.0.0.1:27117, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]
     at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:377)
     at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:104)
     at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
     at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
     at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
     at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:90)
     at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:85)
     at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
     at com.mongodb.Mongo.execute(Mongo.java:836)
     at com.mongodb.Mongo$2.execute(Mongo.java:823)
     at com.mongodb.DB.executeCommand(DB.java:729)
     at com.mongodb.DB.command(DB.java:491)
     at com.mongodb.DB.command(DB.java:507)
     at com.mongodb.DB.command(DB.java:449)
     at com.ubnt.service.OoOO.W.OÒ0000(Unknown Source)
     at com.ubnt.service.OoOO.W.afterPropertiesSet(Unknown Source)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1758)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1695)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:573)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
     at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.dbService(<generated>)
     at com.ubnt.service.AppContext.statService(Unknown Source)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$statService$9(<generated>)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
     at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:498)
     at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
     at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
     at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.resolveBeanReference(ConfigurationClassEnhancer.java:392)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:364)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.statService(<generated>)
     at com.ubnt.service.AppContext.houseKeeper(Unknown Source)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.CGLIB$houseKeeper$17(<generated>)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9$$FastClassBySpringCGLIB$$d0757215.invoke(<generated>)
     at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
     at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361)
     at com.ubnt.service.AppContext$$EnhancerBySpringCGLIB$$76d57cd9.houseKeeper(<generated>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:498)
     at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
     at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1247)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
     at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
     at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:759)
     at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869)
     at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550)
     at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:88)
     at com.ubnt.service.B.Oo0000(Unknown Source)
     at com.ubnt.service.B.Õ00000(Unknown Source)
     at com.ubnt.ace.Launcher.main(Unknown Source)
     [2019-10-29T19:46:54,193] <localhost-startStop-1> INFO system - ======================================================================
     [2019-10-29T19:46:54,200] <localhost-startStop-1> INFO system - UniFi 5.11.50 (build atag_5.11.50_12745 - release/release) is started
     [2019-10-29T19:46:54,200] <localhost-startStop-1> INFO system - ======================================================================
     [2019-10-29T19:46:54,201] <localhost-startStop-1> INFO system - BASE dir:/usr/lib/unifi
     [2019-10-29T19:46:54,273] <localhost-startStop-1> INFO system - Current System IP: 172.30.32.1
     [2019-10-29T19:46:54,273] <localhost-startStop-1> INFO system - Hostname: NAS
     [2019-10-29T19:56:54,276] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T19:56:54,277] <db-server> WARN db - Unknown error, restarting mongo without logging to verify error
     [2019-10-29T20:07:09,495] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T20:17:47,545] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T20:27:24,934] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T20:36:54,339] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T20:48:31,788] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T20:59:08,710] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T21:08:54,787] <db-server> WARN db - Mongo start up failed with rc=134
     [2019-10-29T21:19:59,955] <db-server> WARN db - Mongo start up failed with rc=134

     Apparently those Mongo startup failures have been appearing since 28-09-2019, which might be when I updated. They are still occurring to this date. The mongod.log shows this:

     2019-10-29T19:56:53.112+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:112068][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14399 through 14414
     2019-10-29T19:56:53.156+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:156585][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14400 through 14414
     2019-10-29T19:56:53.189+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:189946][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14401 through 14414
     2019-10-29T19:56:53.223+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:223168][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14402 through 14414
     2019-10-29T19:56:53.278+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:278849][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14403 through 14414
     2019-10-29T19:56:53.312+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:312130][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14404 through 14414
     2019-10-29T19:56:53.345+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:345406][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14405 through 14414
     2019-10-29T19:56:53.378+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:378795][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14406 through 14414
     2019-10-29T19:56:53.412+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:412067][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14407 through 14414
     2019-10-29T19:56:53.456+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:456668][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14408 through 14414
     2019-10-29T19:56:53.490+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:489995][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14409 through 14414
     2019-10-29T19:56:53.530+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:530956][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14410 through 14414
     2019-10-29T19:56:53.686+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:686403][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14411 through 14414
     2019-10-29T19:56:53.853+0100 I STORAGE [initandlisten] WiredTiger message [1572375413:853061][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14412 through 14414
     2019-10-29T19:56:54.008+0100 I STORAGE [initandlisten] WiredTiger message [1572375414:8626][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14413 through 14414
     2019-10-29T19:56:54.167+0100 I STORAGE [initandlisten] WiredTiger message [1572375414:167723][287:0x14700983ad40], file:index-54-4948728851063389280.wt, txn-recover: Recovering log 14414 through 14414
     2019-10-29T19:56:54.247+0100 E STORAGE [initandlisten] WiredTiger error (0) [1572375414:247549][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: read checksum error for 4096B block at offset 4440064: block header checksum of 1413291893 doesn't match expected checksum of 1740413080
     2019-10-29T19:56:54.247+0100 E STORAGE [initandlisten] WiredTiger error (0) [1572375414:247658][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: collection-179--1479618200839713417.wt: encountered an illegal file format or internal value
     2019-10-29T19:56:54.247+0100 E STORAGE [initandlisten] WiredTiger error (-31804) [1572375414:247719][287:0x14700983ad40], file:collection-179--1479618200839713417.wt, WT_SESSION.checkpoint: the process must exit and restart: WT_PANIC: WiredTiger library panic
     2019-10-29T19:56:54.247+0100 I - [initandlisten] Fatal Assertion 28558 at src/mongo/db/storage/wiredtiger/wiredtiger_util.cpp 365
     2019-10-29T19:56:54.247+0100 I - [initandlisten] ***aborting after fassert() failure
     2019-10-29T19:56:54.272+0100 F - [initandlisten] Got signal: 6 (Aborted).
     0x564eb09cbad1 0x564eb09cace9 0x564eb09cb1cd 0x14700843b390 0x147008095428 0x14700809702a 0x564eafc5b5a7 0x564eb06cfff6 0x564eafc65c24 0x564eafc65e49 0x564eafc660ab 0x564eb12d865f 0x564eb12d49ca 0x564eb12d5b93 0x564eb12f5f81 0x564eb139457a 0x564eb13992f2 0x564eb130c9f5 0x564eb13c50ef 0x564eb13c6507 0x564eb13c7559 0x564eb13b4291 0x564eb13cc41a 0x564eb1330da7 0x564eb132921b 0x564eb06b441f 0x564eb06acb12 0x564eb059f750 0x564eafc46463 0x564eafc67496 0x147008080830 0x564eafcc7879
     ----- BEGIN BACKTRACE -----
{"backtrace":[{"b":"564EAF418000","o":"15B3AD1","s":"_ZN5mongo15printStackTraceERSo"},{"b":"564EAF418000","o":"15B2CE9"},{"b":"564EAF418000","o":"15B31CD"},{"b":"14700842A000","o":"11390"},{"b":"147008060000","o":"35428","s":"gsignal"},{"b":"147008060000","o":"3702A","s":"abort"},{"b":"564EAF418000","o":"8435A7","s":"_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj"},{"b":"564EAF418000","o":"12B7FF6"},{"b":"564EAF418000","o":"84DC24","s":"__wt_eventv"},{"b":"564EAF418000","o":"84DE49","s":"__wt_err"},{"b":"564EAF418000","o":"84E0AB","s":"__wt_panic"},{"b":"564EAF418000","o":"1EC065F","s":"__wt_block_extlist_read"},{"b":"564EAF418000","o":"1EBC9CA"},{"b":"564EAF418000","o":"1EBDB93","s":"__wt_block_checkpoint"},{"b":"564EAF418000","o":"1EDDF81","s":"__wt_bt_write"},{"b":"564EAF418000","o":"1F7C57A"},{"b":"564EAF418000","o":"1F812F2","s":"__wt_reconcile"},{"b":"564EAF418000","o":"1EF49F5","s":"__wt_cache_op"},{"b":"564EAF418000","o":"1FAD0EF"},{"b":"564EAF418000","o":"1FAE507"},{"b":"564EAF418000","o":"1FAF559","s":"__wt_txn_checkpoint"},{"b":"564EAF418000","o":"1F9C291"},{"b":"564EAF418000","o":"1FB441A","s":"__wt_txn_recover"},{"b":"564EAF418000","o":"1F18DA7","s":"__wt_connection_workers"},{"b":"564EAF418000","o":"1F1121B","s":"wiredtiger_open"},{"b":"564EAF418000","o":"129C41F","s":"_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb"},{"b":"564EAF418000","o":"1294B12"},{"b":"564EAF418000","o":"1187750","s":"_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv"},{"b":"564EAF418000","o":"82E463"},{"b":"564EAF418000","o":"84F496","s":"main"},{"b":"147008060000","o":"20830","s":"__libc_start_main"},{"b":"564EAF418000","o":"8AF879","s":"_start"}],"processInfo":{ "mongodbVersion" : "3.4.23", "gitVersion" : "324017ede1dbb1c9554dd2dceb15f8da3c59d0e8", "compiledModules" : [], "uname" : { "sysname" : "Linux", "release" : "4.19.56-Unraid", "version" : "#1 SMP Tue Jun 25 10:19:34 PDT 2019", "machine" : "x86_64" }, "somap" : [ { "b" : "564EAF418000", "elfType" : 3, "buildId" : "91B53A60D2F6A2BE28D415B74844C8722A21A4FB" }, { "b" : "7FFF7D9FB000", "elfType" : 3, "buildId" : "66C7D3E7CFA7FD6793FA4CD5E237FFD24E2F88F8" }, { "b" : "1470093B7000", "path" : "/lib/x86_64-linux-gnu/libssl.so.1.0.0", "elfType" : 3, "buildId" : "FF69EA60EBE05F2DD689D2B26FC85A73E5FBC3A0" }, { "b" : "147008F72000", "path" : "/lib/x86_64-linux-gnu/libcrypto.so.1.0.0", "elfType" : 3, "buildId" : "15FFEB43278726B025F020862BF51302822A40EC" }, { "b" : "147008D6A000", "path" : "/lib/x86_64-linux-gnu/librt.so.1", "elfType" : 3, "buildId" : "69143E8B39040C964D3958490535322675F15DD3" }, { "b" : "147008B66000", "path" : "/lib/x86_64-linux-gnu/libdl.so.2", "elfType" : 3, "buildId" : "37BFC3D8F7E3B022DAC7943B1A5FACD40CEBF0AD" }, { "b" : "14700885D000", "path" : "/lib/x86_64-linux-gnu/libm.so.6", "elfType" : 3, "buildId" : "BAD67A84E56E73D031AE507261DA066B35949D34" }, { "b" : "147008647000", "path" : "/lib/x86_64-linux-gnu/libgcc_s.so.1", "elfType" : 3, "buildId" : "68220AE2C65D65C1B6AAA12FA6765A6EC2F5F434" }, { "b" : "14700842A000", "path" : "/lib/x86_64-linux-gnu/libpthread.so.0", "elfType" : 3, "buildId" : "B17C21299099640A6D863E423D99265824E7BB16" }, { "b" : "147008060000", "path" : "/lib/x86_64-linux-gnu/libc.so.6", "elfType" : 3, "buildId" : "1CA54A6E0D76188105B12E49FE6B8019BF08803A" }, { "b" : "147009620000", "path" : "/lib64/ld-linux-x86-64.so.2", "elfType" : 3, "buildId" : "C0ADBAD6F9A33944F2B3567C078EC472A1DAE98E" } ] }} 
     mongod(_ZN5mongo15printStackTraceERSo+0x41) [0x564eb09cbad1]
     mongod(+0x15B2CE9) [0x564eb09cace9]
     mongod(+0x15B31CD) [0x564eb09cb1cd]
     libpthread.so.0(+0x11390) [0x14700843b390]
     libc.so.6(gsignal+0x38) [0x147008095428]
     libc.so.6(abort+0x16A) [0x14700809702a]
     mongod(_ZN5mongo32fassertFailedNoTraceWithLocationEiPKcj+0x0) [0x564eafc5b5a7]
     mongod(+0x12B7FF6) [0x564eb06cfff6]
     mongod(__wt_eventv+0x3D7) [0x564eafc65c24]
     mongod(__wt_err+0x9D) [0x564eafc65e49]
     mongod(__wt_panic+0x2E) [0x564eafc660ab]
     mongod(__wt_block_extlist_read+0x8F) [0x564eb12d865f]
     mongod(+0x1EBC9CA) [0x564eb12d49ca]
     mongod(__wt_block_checkpoint+0x673) [0x564eb12d5b93]
     mongod(__wt_bt_write+0x4F1) [0x564eb12f5f81]
     mongod(+0x1F7C57A) [0x564eb139457a]
     mongod(__wt_reconcile+0x1272) [0x564eb13992f2]
     mongod(__wt_cache_op+0x875) [0x564eb130c9f5]
     mongod(+0x1FAD0EF) [0x564eb13c50ef]
     mongod(+0x1FAE507) [0x564eb13c6507]
     mongod(__wt_txn_checkpoint+0xD9) [0x564eb13c7559]
     mongod(+0x1F9C291) [0x564eb13b4291]
     mongod(__wt_txn_recover+0x5FA) [0x564eb13cc41a]
     mongod(__wt_connection_workers+0x37) [0x564eb1330da7]
     mongod(wiredtiger_open+0x197B) [0x564eb132921b]
     mongod(_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceES8_mbbbb+0x70F) [0x564eb06b441f]
     mongod(+0x1294B12) [0x564eb06acb12]
     mongod(_ZN5mongo20ServiceContextMongoD29initializeGlobalStorageEngineEv+0x6B0) [0x564eb059f750]
     mongod(+0x82E463) [0x564eafc46463]
     mongod(main+0x966) [0x564eafc67496]
     libc.so.6(__libc_start_main+0xF0) [0x147008080830]
     mongod(_start+0x29) [0x564eafcc7879]
     ----- END BACKTRACE -----

     I am using the :latest version of the Unifi Controller on UnRaid 6.7.2 and have tried connecting with Chrome, Firefox, and even Edge, using both the Host and br0 network configurations. The Host configuration has always worked for me.
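     Given the checksum errors, I'm wondering whether a database repair is worth trying before wiping appdata. Something like this, assuming the database lives at the usual /usr/lib/unifi/data/db path inside the container (and that mongod isn't being restarted by the controller while it runs):

     docker exec -it unifi-controller bash
     # inside the container, with the controller's own mongod not running:
     mongod --dbpath /usr/lib/unifi/data/db --repair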
  3. Somehow I cannot access the WebUI after applying the latest update. I have tried switching the network settings between Host and br0, neither of which worked. Does anyone know if this is a known bug and whether or not it will be fixed? Currently I cannot access my configuration, which is really unfortunate.
  4. I have successfully installed the LMS docker and it works great. However, when playing music from the server to a client (using Squeezelite-X on Windows 10), the server randomly stops playing in the middle of a song for about 5 seconds before continuing. It happens with multiple songs on different occasions, and the server log shows nothing. It currently happens every few minutes and is really annoying. When listening to a live stream (TuneIn, for example) the problem does not occur. I tried other clients, which all had the same problem, so it seems to be on the server side. Has anyone had this problem? Or might know the solution?
  5. Damn, I'm sorry, I copied the wrong file. This is the right one:

     remote xxxxxxx.xxxx.xxx
     cipher AES-256-CBC
     auth sha512
     client
     dev tun
     proto udp
     port 1194
     resolv-retry infinite
     tls-client
     nobind
     persist-key
     persist-tun
     remote-cert-tls server
     tls-cipher TLS-DHE-RSA-WITH-AES-256-GCM-SHA384:TLS-DHE-RSA-WITH-AES-128-GCM-SHA256:TLS-DHE-RSA-WITH-AES-256-CBC-SHA:TLS-DHE-RSA-WITH-CAMELLIA-256-CBC-SHA:TLS-DHE-RSA-WITH-AES-128-CBC-SHA:TLS-DHE-RSA-WITH-CAMELLIA-128-CBC-SHA
     comp-lzo adaptive
     verb 3
     route-delay 2
     <ca>

     Still the same problem as before: I can connect to and ping the IPs, but not the names.
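     For what it's worth, this is roughly how I've been testing it on the Windows laptop (stock Windows commands; 192.168.1.10 is just an example for the NAS's LAN IP):

     ipconfig /all
     rem check the TAP adapter for a "DNS Servers" entry pushed by the VPN
     ping 192.168.1.10
     rem by IP: works
     ping NAS
     rem by name: fails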
  6. The config file:

     # openvpnserver plugin configuration file
     NETWORK=10.8.0.0
     NETMASK=255.255.255.0
     SERVER_PORT=1194
     CANONICAL=xxxxx.xxxxx.xxx
     PROTOCOL=udp
     CIPHER="cipher AES-256-CBC"
     CLIENT="Enable"
     HASH_ALGO="auth sha512"
     GATEWAY="redirect-gateway def1"
     SUBNET="topology subnet"
     LAN_SUBNET="Disable"
     COMP_LZO="comp-lzo adaptive"
     IPP="ipp.txt"
     DHCP_1="dhcp-option DNS"
     TELNET_CONSOLE="No"
     VERB="verb 3"
     IP_PORT_SHARE=""
     TLSENCRYPT="tls-crypt"

     .ovpn file (parts of):

     # Define the profile name of this particular configuration file
     # OVPN_ACCESS_SERVER_PROFILE=xxxxxxx.xxxx.xxx
     # OVPN_ACCESS_SERVER_CLI_PREF_ALLOW_WEB_IMPORT=True
     # OVPN_ACCESS_SERVER_CLI_PREF_BASIC_CLIENT=False
     # OVPN_ACCESS_SERVER_CLI_PREF_ENABLE_CONNECT=True
     # OVPN_ACCESS_SERVER_CLI_PREF_ENABLE_XD_PROXY=True
     # OVPN_ACCESS_SERVER_WSHOST=xxxxxx.xxxxxx.xxx:943
     # OVPN_ACCESS_SERVER_WEB_CA_BUNDLE_START

     And:

     # OVPN_ACCESS_SERVER_WEB_CA_BUNDLE_STOP
     # OVPN_ACCESS_SERVER_IS_OPENVPN_WEB_CA=1
     # OVPN_ACCESS_SERVER_ORGANIZATION=OpenVPN Technologies, Inc.
     setenv FORWARD_COMPATIBLE 1
     client
     proto udp
     nobind
     remote xxxx.xxxx.xxx
     port 1194
     dev tun
     dev-type tun
     ns-cert-type server
     setenv opt tls-version-min 1.0 or-highest
     reneg-sec 604800
     sndbuf 100000
     rcvbuf 100000
     auth-user-pass
     # NOTE: LZO commands are pushed by the Access Server at connect time.
     # NOTE: The below line doesn't disable LZO.
     comp-lzo no
     verb 3
     setenv PUSH_PEER_INFO
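     Looking at it again, DHCP_1="dhcp-option DNS" has no server address behind it, so maybe the client never learns a DNS server for LAN names? If I understand the OpenVPN docs correctly, the generated server config would need lines along these lines, where 192.168.1.1 and "local" are placeholders for the actual LAN DNS server and domain:

     push "dhcp-option DNS 192.168.1.1"
     push "dhcp-option DOMAIN local"
     # for \\NAS-style (NetBIOS) lookups a WINS entry may also be needed:
     push "dhcp-option WINS 192.168.1.1"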
  7. So these are the settings I get when connecting to the VPN on my laptop (see attached file). Where can I find the openvpnserver.conf file?
  8. I've got to say this is a really good plugin, I really like it. I was just wondering if there is a setting that lets me see device names on the PC connected via the VPN. I map my network drives using names, for instance '\\NAS\documents', and use RDP with names rather than IPs. Currently, however, the folder won't be recognized this way, only by using the IP. I have been trying different settings but haven't had any luck. Maybe this question has been asked before, but I couldn't really find it.
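     A crude workaround I'm considering in the meantime is pinning the name locally on the laptop; the address below is an example and would be the NAS's real LAN IP:

     # add to C:\Windows\System32\drivers\etc\hosts (edited as Administrator):
     192.168.1.10    NAS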
  9. So, I have installed an RX550. I chose this card since it would fit in the server and wasn't too expensive. Plus, RemoteFX worked with my RX480, so I reckoned it should work with the RX550 too. Unfortunately, when starting the VM after assigning the GPU, the server crashes. With both VNC and the dedicated GPU selected, I can see through VNC that it boots to the Windows logo but stops there. Looking at the console, I see this error the moment the VM starts:

     Uhhuh. NMI received for unknown reason 2d on CPU 0.
     Do you have a strange power saving mode enabled?
     Dazed and confused, but trying to continue

     Even when immediately stopping the VM, the server still crashes (just powers down, really) and reboots. I have tried different settings (PCIe ACS override), tweaking the BIOS, and setting the CPU to Emulated instead of passthrough, but nothing has helped so far. An option would be to install Windows directly on the server to use RemoteFX, but then I would lose the data protection UnRaid offers. Has anybody experienced this problem (and found the cause)? Would there be other ways to keep the data protection of UnRaid but still use RemoteFX?
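     One thing I still want to try is binding the card to vfio-pci at boot so the host never initialises it. As far as I understand, on Unraid that means looking up the card's ID and adding it to the append line in /boot/syslinux/syslinux.cfg (the ID below is only an example; use whatever lspci reports for the RX550):

     lspci -nn | grep -i vga
     # take the [vendor:device] ID from the RX550 line, then extend the
     # append line in /boot/syslinux/syslinux.cfg, e.g.:
     append vfio-pci.ids=1002:699f initrd=/bzroot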
  10. The server hasn't got dedicated graphics, just a little graphics chip on the motherboard for the onboard output (low resolution). This is probably why RDP works at 1920x1080 while NoMachine is stuck at 800x600, so there is probably no way to fix this. Maybe it is possible to run different VMs within Windows 10 using Hyper-V and use RemoteFX vGPU to share a dedicated GPU (which I'll have to install, obviously). That way they would share the GPU and be able to run at higher resolutions using NoMachine? I don't think UnRaid has a feature that lets different VMs share the same GPU?
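     From what I've read, attaching a RemoteFX vGPU to a guest is done with a stock PowerShell cmdlet on the Windows 10 Hyper-V host; roughly like this, with "Workstation1" being an example VM name:

     # elevated PowerShell on the Hyper-V host:
     Add-VMRemoteFx3dVideoAdapter -VMName "Workstation1"
     # confirm the adapter is attached:
     Get-VMRemoteFx3dVideoAdapter -VMName "Workstation1"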
  11. Thanks for the alternatives. I have already been looking at SplashTop, but since it's a paid service I have put that option on hold. I have been testing with NoMachine, but unfortunately it relies on the Microsoft Basic Display Adapter, which pins my resolution at 800x600. Any other alternatives?
  12. I currently have an IBM server with an Intel® Xeon® CPU E7-2870, 128 GB RAM, and some other hardware. I want to use it for running multiple Windows 10 VMs as remote workstations, though I still have to work some things out. I would like to know your opinions on remotely accessing the VMs. I've heard good things about RDP from jonp, but it doesn't seem to work smoothly for me; especially when watching a video it really starts to stutter (I'm on a 1 Gbps LAN connection).

     I currently have 2x 250 GB SSDs installed for running the VMs on. Is this data also stored on the array (and thus protected by the parity disk)? Or should I always back it up manually? Also, when installing the first VM I left the disk space at the standard 30G, thinking I could enlarge it later, but I can't seem to find the option anymore (see the sketch at the end of this post). I know most of this has probably been discussed before, and I've looked around the forum, but I haven't really found an answer.

     Specs:
     Model: Custom
     M/B: IBM - Node 1 Memory Card
     CPU: Intel® Xeon® CPU E7-2870 @ 2.40GHz
     HVM: Enabled
     IOMMU: Enabled
     Cache: 32 kB, 256 kB, 30720 kB
     Memory: 128 GB (max. installable capacity 128 GB)*
     Network: bond0: fault-tolerance (active-backup), mtu 1500
       eth0: 1000 Mb/s, full duplex, mtu 1500
       eth1: not connected
     Kernel: Linux 4.9.30-unRAID x86_64
     OpenSSL: 1.0.2k

     Hopefully you guys can help me with this.
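     On the 30G vdisk: from what I've found, the image can apparently be grown from the Unraid command line while the VM is shut down. Something like this, with the path assuming the default domains share layout (adjust the VM and disk names):

     qemu-img resize /mnt/user/domains/Windows10/vdisk1.img +30G
     # then extend the partition inside Windows via Disk Management (diskmgmt.msc)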