I've been working on a proof-of-concept app that leverages two-way communication between Objective-C (iOS 7) and JavaScript using the JavaScriptCore framework. I finally got it working as expected, but have run into a situation where the UIWebView loses its reference to the iOS object that I've exposed via the JSContext.
The app is a bit complex; here are the basics:
- I'm running a web server on the iOS device (CocoaHTTPServer)
- The UIWebView initially loads a remote URL, and is later redirected back to localhost as part of the app flow (think OAuth); a rough sketch of this setup follows the list
- The HTML page that the app hosts (at localhost) contains the JavaScript that should be talking to my iOS code
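For context, the server and initial-load side is roughly this shape (the port, URLs, method name, and httpServer property below are simplified placeholders, not my actual code):

#import "HTTPServer.h"

-(void)startServerAndLoadRemotePage {
    // Serve the app's bundled HTML at http://localhost:12345/ (port is a placeholder)
    self.httpServer = [[HTTPServer alloc] init];
    [self.httpServer setPort:12345];
    [self.httpServer setDocumentRoot:[[NSBundle mainBundle] resourcePath]];

    NSError *error = nil;
    if (![self.httpServer start:&error]) {
        NSLog(@"Error starting HTTP server: %@", error);
    }

    // Kick off the flow at the remote site; it eventually redirects back to localhost (think OAuth)
    NSURL *remote = [NSURL URLWithString:@"https://example.com/start"];
    [self.webView loadRequest:[NSURLRequest requestWithURL:remote]];
}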
Here's the iOS side, my ViewController's .h:
#import <UIKit/UIKit.h>
#import <JavaScriptCore/JavaScriptCore.h>
// These methods will be exposed to JS
@protocol DemoJSExports <JSExport>
-(void)jsLog:(NSString*)msg;
@end
@interface Demo : UIViewController <DemoJSExports, UIWebViewDelegate>
@property (nonatomic, readwrite, strong) JSContext *js;
@property (strong, nonatomic) IBOutlet UIWebView *webView;
@end
And the pertinent parts of the ViewController's .m:
-(void)viewDidLoad {
    [super viewDidLoad];
    // Retrieve and initialize our JS context
    NSLog(@"Initializing JavaScript context");
    self.js = [self.webView valueForKeyPath:@"documentView.webView.mainFrame.javaScriptContext"];
    // Provide an object for JS to access our exported methods by
    self.js[@"ios"] = self;
    // Additional UIWebView setup done here...
}

// Allow JavaScript to log to the Xcode console
-(void)jsLog:(NSString*)msg {
    NSLog(@"JavaScript: %@", msg);
}
Here is the (simplified for the sake of this question) HTML/JS side:
<html>
<head>
  <title>Demo</title>
  <script type="text/javascript">
    function setContent(c, noLog){
      with(document){
        open();
        write('<p>' + c + '</p>');
        close();
      }
      // Write content to Xcode console
      noLog || ios.jsLog(c);
    }
  </script>
</head>
<body onload="javascript:setContent('ios is: ' + typeof ios)">
</body>
</html>
Now, in almost all cases this works beautifully: I see ios is: object both in the UIWebView and in Xcode's console. Very cool. But in one particular scenario, 100% of the time, this fails after a certain number of redirects in the UIWebView, and once the above page finally loads it says:
ios is: undefined
...and the rest of the JS logic quits, because the subsequent call to ios.jsLog in the setContent function results in an undefined-object exception.
So finally my question: what can cause a JSContext to be lost? I dug through the "documentation" in JavaScriptCore's .h files and found that the only way this is supposed to happen is if there are no more strong references to the JSContext, but in my case I hold one of my own, so that doesn't seem right.
My only other hypothesis is that it has to do with the way in which I'm acquiring the JSContext reference:
self.js = [self.webView valueForKeyPath:@"documentView.webView.mainFrame.javaScriptContext"];
I'm aware that this may not be officially supported by Apple, although I did find at least one SO user who said they had an Apple-approved app that used this very method.
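For clarity, the retrieval is plain KVC against WebKit internals. A slightly more defensive version of the same fetch (the helper name here is just illustrative, not something in my app) would look roughly like this:

-(JSContext *)fetchWebViewContext {
    // Same undocumented key path as above; may break in future iOS versions
    id candidate = [self.webView valueForKeyPath:@"documentView.webView.mainFrame.javaScriptContext"];
    if (![candidate isKindOfClass:[JSContext class]]) {
        NSLog(@"Unexpected value at javaScriptContext key path: %@", candidate);
        return nil;
    }
    return (JSContext *)candidate;
}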
EDIT
I should mention that I implemented UIWebViewDelegate to check the JSContext after each redirect in the UIWebView, like so:
-(void)webViewDidFinishLoad:(UIWebView *)view {
    // Write to the Xcode console via our JSContext - is it still valid?
    [self.js evaluateScript:@"ios.jsLog('Can see JS from obj c');"];
}
This works in all cases: even when my web page finally loads and reports ios is: undefined, the above method simultaneously writes Can see JS from obj c to the Xcode console. This would seem to indicate that the JSContext is still valid, and that for some reason it's simply no longer visible from JS.
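To dig a little deeper, here is a minimal sketch of the kind of check that could distinguish the two cases (my stored JSContext going stale vs. the new page simply not seeing the injected object); it re-fetches the context through the same undocumented key path and compares pointers:

-(void)webViewDidFinishLoad:(UIWebView *)view {
    // Re-fetch the main frame's context via the same key path used in viewDidLoad
    JSContext *current = [self.webView valueForKeyPath:@"documentView.webView.mainFrame.javaScriptContext"];
    if (current != self.js) {
        // If this fires, WebKit has created a fresh context for the new page,
        // and the stored one no longer backs what the page's JS sees
        NSLog(@"JSContext changed across loads: old=%p new=%p", self.js, current);
    }
    [self.js evaluateScript:@"ios.jsLog('Can see JS from obj c');"];
}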
Apologies for the very long-winded question, there is so little documentation on this out there that I figured the more info I could provide, the better.