[HN Gopher] Rendering Moana with Swift
       ___________________________________________________________________
        
       Rendering Moana with Swift
        
       Author : hutattedonmyarm
       Score  : 108 points
       Date   : 2021-01-14 17:55 UTC (5 hours ago)
        
 (HTM) web link (gonsoloblog.wordpress.com)
 (TXT) w3m dump (gonsoloblog.wordpress.com)
        
        | liuliu wrote:
        | Saw this earlier in the Swift forum too. It demonstrates that
        | performant Swift code is possible, but only with a lot of
        | profiling. I initially chose Swift in the hope of better
        | performance characteristics than Python (for some pretty
        | heavy data preprocessing prior to feeding it to neural nets),
        | but in the end, even simple code still runs 2~3x slower than
        | C / C++, if not more.
       | 
       | Hopefully at some point when Swift matures, people can start to
       | optimize the generated code more.
        
          | Longhanks wrote:
          | So your initial idea was to replace Python with a faster
          | language, you then did that by rewriting the code in Swift,
          | but you were ultimately disappointed that the code is
          | slower than C/C++? Why not write C/C++ in the first place
          | if you need its performance?
          | 
          | It is pretty well known that Swift, an inherently
          | safer/easier language than C/C++ that does automatic
          | reference counting, cannot compete with fine-tuned C/C++.
          | 
          | Did you not achieve your initial goal of getting something
          | that is faster than Python?
        
         | [deleted]
        
         | gilgoomesh wrote:
         | I'm not sure what you were trying to do but Swift can match C
         | quite easily. But the standard library isn't optimised for that
         | use case, so you need to avoid most of it.
         | 
          | Use `UnsafeBufferPointer` instead of the standard library
          | `String` or `Array`, use unsafe arithmetic operators (`&*`
          | instead of checked multiplication `*`), and use neither
          | `class` nor existentials (because they involve reference
          | counting and heap allocation).
         | 
         | That leaves you with the same language constructs as C and LLVM
         | compiles it to the same instructions.
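The techniques described above — an unsafe buffer instead of `Array`, and wrapping arithmetic instead of checked operators — can be sketched roughly as follows (the `sumSquares` function and its numbers are illustrative, not from the thread):

```swift
// Sum of squares over a raw buffer, written the "C-like Swift" way.
func sumSquares(_ buffer: UnsafeBufferPointer<Int>) -> Int {
    var total = 0
    for value in buffer {
        // `&+` and `&*` wrap on overflow instead of trapping,
        // so LLVM can emit plain integer arithmetic, as in C.
        total = total &+ value &* value
    }
    return total
}

// Allocate a raw buffer instead of a Swift Array (no copy-on-write,
// no bounds-check bookkeeping beyond the subscript itself).
let count = 4
let buffer = UnsafeMutableBufferPointer<Int>.allocate(capacity: count)
for i in 0..<count { buffer[i] = i + 1 }   // 1, 2, 3, 4
let result = sumSquares(UnsafeBufferPointer(buffer))
buffer.deallocate()
print(result) // 1 + 4 + 9 + 16 = 30
```

With only value types, unsafe buffers, and wrapping arithmetic in the hot loop, there is no retain/release or allocation for the optimizer to work around.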
        
            | liuliu wrote:
            | Yeah, of course you can. But at that point, it stops
            | being idiomatic Swift. My original post is not a critique
            | of the language's slowness. It does (as the other post
            | mentioned) beat Python, which is what I set out to do.
            | 
            | I do think Swift can do better though, especially with
            | ownership, so that most refcounted classes can be
            | migrated to structs while retaining most of their ease of
            | use. The standard library could probably also be
            | optimized further, so that the idiomatic array is
            | sufficient for cases like the author's.
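A minimal sketch of the struct-over-class point above (the `Vec3` type is hypothetical, not from the renderer): a value type like this is copied without any retain/release traffic, whereas the same data declared as a `class` would be heap-allocated and reference counted on every assignment.

```swift
// A value type: no heap allocation, no reference counting.
// Declaring this as `class Vec3` instead would add a retain/release
// pair on every copy, which is exactly the overhead being discussed.
struct Vec3 {
    var x, y, z: Double

    func dot(_ other: Vec3) -> Double {
        x * other.x + y * other.y + z * other.z
    }
}

let a = Vec3(x: 1, y: 2, z: 3)
let b = Vec3(x: 4, y: 5, z: 6)
let d = a.dot(b)
print(d) // 4 + 10 + 18 = 32.0
```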
        
       | pjmlp wrote:
       | Great to see new rendering stacks being used.
        
         | bartvk wrote:
         | I also like to see Swift outside iOS. It was an interesting
         | contest though, which came down to personal preference: "in the
         | end only Rust and Swift were serious contenders. I finally
         | chose Swift because of readability"
        
       | teamspirit wrote:
       | Am I missing something when the author writes about a free GCP
       | instance with 8 vcpus & 64GB ram? Which one is that?
       | 
       | My second thought, why not scale that up to 64 vcpus (or even
       | more) and spend less time rendering? I'm sure there's a balance
       | between cost per second of render time and # vcpus. I haven't
       | found it myself, mostly due to not experimenting.
       | 
        | When I render my product shots on GCP, I use a 96-vcpu
        | instance and render relatively intensive scenes (due to the
        | higher quality render and light settings) at print resolution
        | in a minute or two. The cost becomes negligible, the feedback
        | is quick, and the strain on my MBP is minimal.
        
          | niea_11 wrote:
          | I think he's using GCP with a new account, so he received
          | $300 in free credit.
          | 
          | But I couldn't find machines with 8 vCPUs and 64 GB, only
          | 8 vCPUs with 32 GB.
         | 
         |  _90-day, $300 Free Trial: New Google Cloud and Google Maps
         | Platform users can take advantage of a 90-day trial period that
         | includes $300 in free Cloud Billing credits to explore and
         | evaluate Google Cloud and Google Maps Platform products and
         | services. You can use these credits toward one or a combination
         | of products._
         | 
         | https://cloud.google.com/free/docs/gcp-free-tier#free-tier-u...
        
       | brundolf wrote:
       | This is an interesting aside:
       | 
        | > GPU rendering: This should be a big one; PBRT-v4 obviously
        | does this, as do some of the renderers mentioned above. It
        | should be quite possible to follow them and use Optix to
        | render on a graphics card, but I would much prefer a solution
        | not involving closed source. Which would mean that you have
        | to implement your own Optix. :\ But looking at how CPUs and
        | GPUs are evolving, it might be possible in the distant future
        | to use the same (Swift) code on both of them; you can have
        | instances with 448 CPUs in the cloud and the latest GPUs have
        | a few thousand micro-CPUs, so they look more and more alike.
       | 
       | I'd be really excited for this future. One of the main reasons I
       | haven't delved into GPU programming for fun is that I don't
       | really like the available language(s). It seems petty, but the
       | language I'm using makes a huge difference in the fun-factor for
       | me.
        
         | dahart wrote:
         | Try glsl, like on shadertoy.com, for a high fun-factor intro to
         | GPU programming.
         | 
         | I enjoy coding in Python and JavaScript more than I enjoy
         | coding in C++ or CUDA, if I'm looking only at the language
         | itself, but I have to admit that it's also very fun to make
         | something run 100 or 1000 times faster than it did before. That
         | kind of fun helps me overlook language differences.
         | 
          | To me the quote sounds pretty funny, because a cloud of 500
          | CPUs running Swift, right now, is _way_ more expensive and
          | _way_ less efficient than a single GPU. The current
          | generation of GPUs has over 10k single-thread cores...
        
       | Scoundreller wrote:
       | Aka Vaiana in many European countries to, ostensibly avoid
       | "trademark" issues, but realistically to avoid unexpected Google
       | confusion with Moana the adult film star.
       | 
       | https://www.hollywoodreporter.com/news/disney-changes-moana-...
        
        | johnnythunder wrote:
        | Does anyone know of a project to convert this to an
        | explorable 3D map in a game engine? It seems it would be
        | possible by simplifying the model and rendering it in real
        | time.
        
         | brundolf wrote:
         | It looks like Unreal has some amount of direct support:
         | https://docs.unrealengine.com/en-US/WorkingWithContent/USDin...
        
       | timClicks wrote:
        | > I finally chose Swift because of readability (I just don't
        | like "fn main" or "impl trait").
       | 
       | And who says that syntax doesn't matter?
        
          | richardwhiuk wrote:
          | I think the alternatives in Swift are `func` and
          | `extension` (not sure on the last one - my Swift is a bit
          | rusty).
        
           | jjtheblunt wrote:
           | Swift being "rusty" made me laugh to myself quietly.
        
         | dimitrios1 wrote:
         | > And who says that syntax doesn't matter?
         | 
          | Literally no one.
        
       ___________________________________________________________________
       (page generated 2021-01-14 23:00 UTC)