What you're discovering here is the long-debated argument over the performance of Windows vs Unix-based operating systems. A couple of Google searches for 'Windows vs Linux/OS X in <application or task>' will turn up countless discussions on this going back to the early 2000s.
I won't get too deep into it, but the thing to understand is that Microsoft, compared to the Linux community and Apple, did not make Windows the most resource-efficient OS on the planet. Gaming aside (where Windows holds the crown), it has been shown to suffer a bit under rendering and simulation workloads, due to its weaker memory management (especially once you start hitting swap/paging), slower disk I/O, and wasted CPU cycles. I'm not saying it doesn't perform well, it does, but for strictly heavy CPU/RAM/disk tasks it's not quite the same.
Some numbers that usually get thrown around are a 10% - 30% increase in render speed on Unix systems, but keep in mind this really only starts to show on long renders. IMO, that's anything above 30 minutes (especially renders running into the hours). Below about 10 minutes the difference should be negligible, as you've seen in your own tests. Besides system administration and pipeline automation, this is (depending on the studio) a minor or major reason why larger studio farms typically run Linux.
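Just to put that range in perspective, here's a quick back-of-the-envelope sketch (Python, purely illustrative; the 10% - 30% figures are the commonly quoted range, not benchmarks I've run) showing what a speed increase like that actually buys you at different render lengths:

```python
# Illustrative only: wall-clock minutes saved if render *speed* goes up
# by 10% or 30% (so render time becomes t / 1.10 or t / 1.30).

def minutes_saved(render_minutes: float, speed_increase: float) -> float:
    """Minutes saved when speed increases by `speed_increase` (0.30 = +30%)."""
    return render_minutes - render_minutes / (1.0 + speed_increase)

# From a short test frame up to an overnight render.
for render_minutes in (5, 30, 120, 480):
    low = minutes_saved(render_minutes, 0.10)
    high = minutes_saved(render_minutes, 0.30)
    print(f"{render_minutes:>4} min render: saves roughly {low:.1f}-{high:.1f} min")
```

On a 5-minute frame that's under a minute saved, which you'll never notice; on an 8-hour render it's potentially an hour or two per frame, which is why it only matters at farm scale.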
As previously mentioned, Arnold is currently a CPU render engine. They are working on a sibling GPU engine, which they demoed late last year (Siggraph, was it?). We have yet to see whether it becomes a production-capable engine or goes the way of VRay RT and ends up being more of a look-dev tool. I might suggest (at least for your Windows machine) looking into production GPU render engines such as Redshift or Octane.