I have a scene with many objects that all have different textures. For some reason, two of the objects have a red hue to them, even though their textures contain no red. You can still see the pattern in the texture; it just appears in different shades of red. (On the simulator the two objects have black and white textures, but on the device they show shades of red.) Does anyone have any idea why this is happening? The other objects are working fine.
For material properties such as metalness and roughness, SceneKit has support for 1-channel (grayscale) images. In order to save memory these images are kept as grayscale textures; they are not converted to RGB textures that have the same data in each channel.
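For example, this is the intended use of such a grayscale image for a single-channel property of a physically based material (a minimal sketch; the image name is just a placeholder):

// Sketch (image name is a placeholder): a grayscale image used as intended,
// for a single-channel property such as roughness.
let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.roughness.contents = UIImage(named: "grayscaleRoughness")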
When you use such an image for a "coloured" material property (such as diffuse), SceneKit will ask for the red, green and blue components of the sample, but green and blue will always be 0 and the image will appear red.
One unfortunate solution is to rework your texture in an image editing app so that it's saved as RGB instead of grayscale.
You can also try shader modifiers to convert from grayscale to RGB:
_surface.diffuse.rgb = _surface.diffuse.rrr;
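That line goes into the surface entry point of the material's shader modifiers. A minimal sketch of one way to attach it, assuming an existing SCNMaterial called material:

// Sketch: attach the one-line surface shader modifier to the material,
// so the sampled red channel is copied into green and blue as well.
material.shaderModifiers = [
    .surface: "_surface.diffuse.rgb = _surface.diffuse.rrr;"
]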
Edit
Starting with iOS 11 you can use the textureComponents property when using single-channel textures:
material.diffuse.textureComponents = .red
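In context that might look like the following sketch (iOS 11+ / macOS 10.13+; the material and image names are placeholders):

// Sketch (names are placeholders): keep the grayscale texture and tell
// SceneKit to read only its red channel for the diffuse property.
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "grayscaleTexture")
material.diffuse.textureComponents = .red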
I've had the same problem loading (grey) .png images as a texture for an SCNMaterial. On the simulator they rendered fine, but on the device all images containing only black, white or grey rendered with a red base color.
// Original setup: assigning the grayscale image directly shows the red tint on device
let texture = SCNMaterial()
texture.diffuse.contents = UIImage(named: "rgb image with only gray")
I solved this problem by creating a new UIImage from the given UIImage:
let texture = SCNMaterial()
var newImage: UIImage?
if let textureImage = UIImage(named: "rgb image with only gray") {
    // Redrawing the image into a bitmap context yields an RGB-backed UIImage
    UIGraphicsBeginImageContext(textureImage.size)
    let width = textureImage.size.width
    let height = textureImage.size.height
    textureImage.draw(in: CGRect(x: 0, y: 0, width: width, height: height))
    newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
}
texture.diffuse.contents = newImage
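If you target iOS 10 or later, the same redraw can also be written with UIGraphicsImageRenderer, which by default produces an image backed by an RGB bitmap context. This is a sketch, not part of the original answer, and the image name is a placeholder:

// Sketch: redraw the grayscale texture through UIGraphicsImageRenderer
// so the copy assigned to the material is RGB-backed.
let material = SCNMaterial()
if let textureImage = UIImage(named: "rgb image with only gray") {
    let renderer = UIGraphicsImageRenderer(size: textureImage.size)
    material.diffuse.contents = renderer.image { _ in
        textureImage.draw(in: CGRect(origin: .zero, size: textureImage.size))
    }
}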
HTH
I had the same problem with an indexed-color PNG texture. I solved it by converting the indexed-color PNG into a truecolor PNG.
It might be an NSImage bug: when I used CGImage to load the PNG file, it didn't cause any problem.
let url = URL(fileURLWithPath: pngFileName, relativeTo: directoryPath)

// Loading with NSImage(contentsOf:) produced the red tint:
// let reddishImage = NSImage(contentsOf: url)

// Loading the PNG through CGImage renders correctly
let cgDataProvider = CGDataProvider(url: url as CFURL)!
let cgImage = CGImage(pngDataProviderSource: cgDataProvider, decode: nil, shouldInterpolate: false, intent: .defaultIntent)!
let imageSize = CGSize(width: cgImage.width, height: cgImage.height)
let correctImage = NSImage(cgImage: cgImage, size: imageSize)
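The CGImage-backed NSImage can then be assigned to the material as usual (sketch; the material here is a placeholder):

// Sketch: use the correctly loaded image as the diffuse texture.
let material = SCNMaterial()
material.diffuse.contents = correctImage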