I'm creating a game with lots of screens and a lot of animations using Flash CS6. That's why I decided to use SOME vector graphics, to decrease the app size and fit below 50 MB. It runs at full fps on Android and desktop, but I've got optimization problems on iOS (testing on an iPad 2). Let me explain a simple scenario: I've got a menu screen and 2 other screens (let's name them A and B). When I'm in the menu I get max fps (24), and when I go to A I get 24 fps too. But when I go back to the menu and then go to B, I get a huge fps drop (3 fps). However, if I kill the app, go to the menu and straight to B, I get max fps.
Menu - 24 fps -> A - 24 fps -> Menu - 24 fps -> B - 3 fps
Menu - 24 fps -> B - 24 fps -> Menu - 24 fps -> A - 24 fps -> Menu - 24 fps -> B - 3 fps
however:
Menu - 24 fps -> B - 24 fps -> Menu - 24 fps -> B - 24 fps
So it looks like screen A is leaking memory. However, I've commented out all the code, so I'm not playing any animations etc. Also, cacheAsBitmap later leads to an fps drop (and jagged edges) in screen B. Only if I select "Export as bitmap" on the animated objects in screen A does it not slow down screen B.
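For reference, this is roughly how I'm applying the bitmap caching to the animated clips (a simplified sketch; animClip and ScreenAAnimation are placeholder names). In GPU mode I also set cacheAsBitmapMatrix, because as far as I understand the cache is otherwise thrown away whenever the clip is scaled or rotated, and the fixed rasterization resolution is presumably where the jagged edges come from:

    import flash.display.MovieClip;
    import flash.geom.Matrix;

    // animClip stands for one of the animated MovieClips in screen A;
    // ScreenAAnimation is a placeholder name for a library symbol exported for ActionScript.
    var animClip:MovieClip = new ScreenAAnimation();

    animClip.cacheAsBitmap = true;
    // In GPU render mode, cacheAsBitmap on its own is invalidated whenever the clip is
    // scaled/rotated, so cacheAsBitmapMatrix is set as well. The clip is rasterized once
    // at this matrix's resolution, which seems to be what produces the jagged edges.
    animClip.cacheAsBitmapMatrix = new Matrix();

    addChild(animClip);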
I'm removing all listeners, killing timelines (GreenSock), clearing timers (setTimeout), stopping sound channels (channel.stop()), setting objects to null, and I've even tried calling System.gc().
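Here's a stripped-down version of the cleanup I run before switching screens (class and member names are simplified placeholders; the real screens obviously do more):

    package
    {
        import com.greensock.TweenMax;
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.media.SoundChannel;
        import flash.system.System;
        import flash.utils.clearTimeout;

        // Simplified screen base class - field/method names are placeholders.
        public class ScreenA extends Sprite
        {
            private var channel:SoundChannel;   // set when the screen plays a sound
            private var timeoutId:uint;         // returned by setTimeout() somewhere in the screen

            // Called right before the screen manager removes the screen and nulls its reference.
            public function destroy():void
            {
                removeEventListener(Event.ENTER_FRAME, onEnterFrame);
                TweenMax.killChildTweensOf(this);   // GreenSock tweens/timelines on children
                clearTimeout(timeoutId);            // pending setTimeout() calls
                if (channel) channel.stop();        // running sound channels
                channel = null;
                removeChildren();                   // drop display-list references
                System.gc();                        // last resort
            }

            private function onEnterFrame(e:Event):void { /* per-frame logic, omitted */ }
        }
    }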
Also important: I'm using GPU render mode, not direct. Direct kills performance on emitters etc. (direct is killing the iPad 2 Retina, for example).
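For completeness, the render mode is just the standard setting in the AIR application descriptor (surrounding XML trimmed):

    <initialWindow>
        <renderMode>gpu</renderMode>
        <!-- tried "direct" too, but it tanks the particle emitters -->
    </initialWindow>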
I'm really stuck on this. The only solution I can see is to convert everything to bitmaps, but then the app will be huge.