Team Infrastructure
We keep the networking lights blinking
In Space Hardware
We currently have two locations for networking hardware: the server rack in the craft area and the networking rack in the main space. The two areas are connected by a pair of Ethernet cables aggregated into a LAG as backhaul.
Switch configs are backed up (TODO link?) and we are working on improving the deployment of our infrastructure so that we can hot provision the various machines around the hackspace
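The backup step above could be as simple as filing each retrieved config away with a timestamp so old versions are kept. A minimal sketch (switch name, paths, and the idea that the config text has already been fetched over SSH/TFTP are all illustrative, not how our backups actually work):

```python
# Sketch of archiving a switch config with a timestamp (names/paths illustrative).
# Assumes the running config has already been retrieved as text.
from datetime import datetime, timezone
from pathlib import Path

def archive_config(switch_name: str, config_text: str, dest_dir: str) -> Path:
    """Write config_text to dest_dir/<switch>/<UTC timestamp>.cfg and return the path."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out = Path(dest_dir) / switch_name / f"{stamp}.cfg"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(config_text)
    return out
```

Keeping one file per timestamp rather than overwriting means a bad config push can be diffed against the previous known-good version.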
Craft Area Rack
This is where all the space's servers live and operate, as well as the majority of our switches; it is also our main network bottleneck
Networking
For networking we currently have three Cisco switches serving the roles of:
- Internet-facing link - ideally this can be removed
- Main PoE - this is limited to 10/100 and needs replacing
- Craft area - provides client access in craft area
TODO: Add switch types and other data; maybe tabulate
There are cables terminated into a patch panel in this rack which cover various areas of the space
The incoming internet feed is also terminated here; it is a VDSL link provided to the space by Andrews & Arnold
Servers
We currently have three Dell R210ii rack servers and one HP mini server. Despite running three servers we currently have no redundancy; this would ideally be fixed and we have plans to do so, possibly pending a server upgrade
VM01 - R210ii
VM01 is the VM host for the space and runs the majority of the space's services
It runs the VMs:
- netservices - Runs networking
- netadmin - Runs network admin tools
- iotservices - runs the smart space kit such as openbot
Store - HP MicroServer N54L
Runs network storage for hackspace members to share files within the hackspace
Power
We currently have an APC smart PDU but it is in need of reconfiguring and is not currently on the network
It should be able to provide power monitoring
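Once the PDU is back on the network, power monitoring would likely come in over SNMP. A sketch of turning a raw load reading into watts (the assumption that APC's PowerNet MIB reports load in tenths of an ampere, and the nominal mains voltage, should both be checked against the actual unit before relying on this):

```python
# Sketch: convert an APC PDU's raw SNMP load reading into approximate watts.
# Assumption: the PDU reports phase load in tenths of an ampere (PowerNet MIB
# convention); verify against the actual unit's MIB before trusting the numbers.

MAINS_VOLTAGE = 230.0  # UK nominal mains voltage

def load_to_watts(tenths_of_amps: int, voltage: float = MAINS_VOLTAGE) -> float:
    """Convert a raw load value in tenths of amps to approximate watts."""
    return (tenths_of_amps / 10.0) * voltage
```

For example, a raw reading of 27 is 2.7 A, roughly 621 W at 230 V. Feeding these readings into whatever monitoring stack we settle on would give per-rack power trends.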
Main Space Rack
This is located behind and to the left of the projector screen
Networking
This used to contain 2 Cisco switches (one PoE, one not) but was migrated to a single PoE 1GbE switch. There is a LAG'd pair of ethernet links back to the rack in the craft area
This switch is a Cisco Catalyst C2960S-48FPS-L with 740W of PoE+; idle power draw is likely around 78W before PoE loads.
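A rough way to estimate the switch's total wall draw from these figures: idle draw plus the PoE load, adjusted for conversion losses. The 90% PoE conversion efficiency below is an assumption, not a measured value for this switch:

```python
# Rough wall-draw estimate for the Catalyst (figures from above: ~78 W idle,
# 740 W PoE+ budget). The 90% conversion efficiency is an assumed value.
IDLE_W = 78.0
POE_BUDGET_W = 740.0
POE_EFFICIENCY = 0.90  # assumed PSU/PoE conversion efficiency

def estimated_wall_draw(poe_load_w: float) -> float:
    """Idle draw plus delivered PoE load adjusted for conversion losses."""
    if not 0 <= poe_load_w <= POE_BUDGET_W:
        raise ValueError("PoE load outside the 0-740 W budget")
    return IDLE_W + poe_load_w / POE_EFFICIENCY
```

So a handful of APs drawing ~90 W of PoE between them would put the switch at roughly 178 W at the wall under these assumptions.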
There is a patch panel terminating networking cables from wall areas around the space, as well as direct cable feeds to the front of the space. The networking sockets in the electronics bench trunking have been removed; new cables need pulling and the sockets re-installing.
Power
There is a PDU of some kind here, but it is largely redundant
In Space Client Hardware
We operate various kit to ensure members can connect to our network and use the space
Wireless Networking
We operate the space on TP-Link Omada wireless APs.
These are placed with the intention of providing full coverage of the space
They are currently deployed in the following locations:
- Main space, above the D&D table
- Craft area, attached to the server rack
- Back corridor, between the darkroom and the toilets
- Outdoors, above door next to the sign
Darkspots
The following areas are potential darkspots to fix as APs are relocated
- Far end of toilets (do we care?)
- Kitchen? (a tad annoying)
Client Desktops
Currently maintained client machines are:
- 3 workstations under the clock - upgrades pending
- 3d printer PC
- Laser cutter PC
Client Laptops
A shitshow that needs sorting
Off Site Hardware
Hackspace web services run on a box provided by HeartInternet
It has 8GB of RAM, 4 vCPUs, and 200GB of disk.
Web resources
We run various web resources to support the space's operations
TODO document more, clarify in space vs out of space
Main Website
URLs: leedshackspace.org.uk leeds.hackspace.org.uk
This is our main website, which also hosts a blog. It runs WordPress
Wiki
URLs: wiki.leedshackspace.org.uk
The wiki you are on right now. Version information is available on the version page
Improvement Plans
TODO add more
- Redundancy in space
- Hardware upgrades
- Maintenance guides
- Acquire more maintainers
- Ideally have enough space for members to run/play with hardware/learning/etc. and run workshops