1. Overview
  2. Library linking
  3. Implementing GL support
    3.1. Creating context
    3.2. Moving to fPKG
    3.3. Fixing video memory issue
    3.4. Resolving runtime shader compilation issue
  4. Final words


The PS4 has an OpenGL ES implementation called Piglet; it's used for the shell's UI rendering and by WebKit. Zer0xFF and masterzorag were working on making it usable for homebrew applications but got stuck at native shader compilation, so I decided to give it a try.

However, I ran into other problems as well, which were solved successfully. They mostly appeared because I decided to implement the OpenGL ES application as my own PKG-based application. OpenGL ES in the browser didn't work for me either, and I don't know whether it's because I used a more recent firmware than those guys did or because I made some dumb mistake. Anyway, let's begin.

Library linking

To be able to use libraries (including the OpenGL ES libraries) which are available in the firmware but not accessible from the SDK, we need to load them manually and then resolve the needed functions. Some developers just load the libraries themselves and resolve all required functions at runtime via a dlsym-like facility. I decided to generate a stub library instead and link it into my application.

There are a few ways to do this. For example, for Remote Package Installer I used macros to create fake C function definitions that get the __declspec(dllexport) prefix during compilation; the actual code is not included in the object file because the --stub-only flag is used during linking. The linker then generates proper NID tables in a static library, which are resolved during runtime linking. It was also possible to bind a custom function name to a specific NID value if the actual name of a function was not known. Later I modified this method a bit by removing the redundant function declarations from the source files and keeping only header files with macro calls that generate both the function declarations and the definitions.

But I didn't like this method anyway, mostly because it requires you to change function declarations in header files (and maybe even in source files) to turn them into macro calls, and if you're working with some big library that's just annoying (even if you use code generation, like I do with the OpenGL ES libraries).

So I decided to implement a new way to do it. The toolchain has a feature called EMD files: plain text files that describe which functions a library contains. You can just create a list of functions that will be marked in the same way __declspec(dllexport) marks them, and then link this file with the orbis-ld linker along with your object files (see below). Here's an example of an .emd file:

Library: lib<name> {
	export: {
		<func name 1>
		<func name 2>
		<obj name 1>
		<obj name 2>
	}
}
As you can see, you can also specify the library name, which is very useful when a single module contains multiple libraries; for example, libSceSystemService.sprx contains the libSceSystemService and libSceLncUtil libraries.

To link using .emd files you still need object files, so you have to compile code for these functions (even if they are dummies). So I wrote a trivial Python script that reads a simple text file with a list of names of the functions/objects that need to be exported and creates an .emd file for each needed library along with .c/.S source files for it: the assembly files contain the dummy objects and functions, and the .c files contain the TLS variables (it's possible to move everything into the .c files, but you may get warnings/errors when compiling some special stuff like builtin functions). A Makefile then combines them all to produce stub libraries which can be used later in your application. You can find it here: stub lib maker v2

Implementing GL support

Creating context

I started by looking into what the guys I mentioned before had already done (at this point we're working with a payload that runs from the browser):

PS4 OpenGL ES2 -- Hardware Accelerated Graphics Rendering WIP
gles2 branch of ps4sdk's fork

First, we need to take the header files from Khronos. I won't describe how to prepare them here, but if you want to know, just look into the repository. Note that some modifications to the source code are needed to make it compatible with the PS4 platform:

struct _SceWindow {
	uint32_t id;
	uint32_t width;
	uint32_t height;
};
typedef struct _SceWindow SceWindow;

typedef int EGLNativeDisplayType;
typedef void *EGLNativePixmapType;
typedef SceWindow *EGLNativeWindowType;

We also need to define some platform macros:

#elif defined(__ORBIS__)


Now we need to do the platform initialization. Here it's done using a special function in the Piglet library: bool scePigletSetConfigurationVSH(const ScePglConfig* config). It uses the provided configuration to set up internal memory buffers, display parameters, command buffers, etc. But the actual format of the config struct is unknown, so I had to determine it myself by reverse engineering system applications. One of scePigletSetConfigurationVSH's callers is located in system/common/lib/WebBrowserUIProcess.sprx.

scePigletSetConfigurationVSH call

As you can see, just before this call there's a call to a function that does something with an .ini file for Piglet; it takes a callback function as an argument, so it's obviously an INI parser. Let's see what's inside the callback function:

INI parser callback

Bingo! Some parameters get parsed from these files, so we can find where their values are stored. Other parameters are not handled here, so I spent some time determining what they do by looking through the code. I didn't identify all the fields, but that's okay: most of them are not important for us and we can omit them.

struct _ScePglConfig {
	TYPE_FIELD(uint32_t size, 0x00);
	TYPE_FIELD(uint32_t flags, 0x04);
	TYPE_FIELD(uint8_t processOrder, 0x08);
	TYPE_FIELD(uint32_t unk_0x0C, 0x0C);
	TYPE_FIELD(uint32_t unk_0x10, 0x10);
	TYPE_FIELD(uint32_t unk_0x14, 0x14);
	TYPE_FIELD(uint64_t systemSharedMemorySize, 0x18);

	TYPE_FIELD(uint32_t unk_0x20, 0x20);
	TYPE_FIELD(uint32_t unk_0x24, 0x24);
	TYPE_FIELD(uint64_t videoSharedMemorySize, 0x28);
	TYPE_FIELD(uint64_t maxMappedFlexibleMemory, 0x30);
	TYPE_FIELD(uint64_t minFlexibleMemoryChunkSize, 0x38);

	/* TODO: sets boundaries of debug window? see sceCompositorSetDebugPositionCommand((uint8_t)unk_0x50, (uint16_t)unk_0x48, (uint16_t)unk_0x4C, (uint16_t)unk_0x40, (uint16_t)unk_0x44) */
	TYPE_FIELD(uint32_t dbgPosCmd_0x40, 0x40);
	TYPE_FIELD(uint32_t dbgPosCmd_0x44, 0x44);
	TYPE_FIELD(uint32_t dbgPosCmd_0x48, 0x48);
	TYPE_FIELD(uint32_t dbgPosCmd_0x4C, 0x4C);
	TYPE_FIELD(uint8_t dbgPosCmd_0x50, 0x50);

	TYPE_FIELD(uint32_t drawCommandBufferSize, 0x54);
	TYPE_FIELD(uint32_t lcueResourceBufferSize, 0x58);

	TYPE_FIELD(uint32_t unk_0x5C, 0x5C);

	TYPE_FIELD(uint64_t unk_0x60, 0x60);
	TYPE_FIELD(uint64_t unk_0x68, 0x68);
	TYPE_FIELD(uint64_t unk_0x70, 0x70);
	TYPE_FIELD(uint64_t unk_0x78, 0x78);
};
typedef struct _ScePglConfig ScePglConfig;

Now it's time to fill it in and try to set up Piglet. I grabbed the parameters from one of the system applications:


	ScePglConfig pgl_config;
	SceWindow render_window = { 0, width, height };
	EGLDisplay display = EGL_NO_DISPLAY;
	EGLConfig config = NULL;
	EGLint num_configs;
	EGLSurface surface = EGL_NO_SURFACE;
	EGLContext context = EGL_NO_CONTEXT;
	EGLint major, minor;
	EGLint ret;

	/* Typical attribute lists for an ES 2.0 window surface (the concrete
	   values here are illustrative). */
	EGLint attribs[] = {
		EGL_RED_SIZE, 8,
		EGL_GREEN_SIZE, 8,
		EGL_BLUE_SIZE, 8,
		EGL_ALPHA_SIZE, 8,
		EGL_DEPTH_SIZE, 0,
		EGL_STENCIL_SIZE, 0,
		EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
		EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
		EGL_NONE,
	};

	EGLint ctx_attribs[] = {
		EGL_CONTEXT_CLIENT_VERSION, 2,
		EGL_NONE,
	};

	EGLint window_attribs[] = {
		EGL_RENDER_BUFFER, EGL_BACK_BUFFER,
		EGL_NONE,
	};

	memset(&pgl_config, 0, sizeof(pgl_config));
	pgl_config.size = sizeof(pgl_config);
#if 1
	pgl_config.processOrder = 1;
	pgl_config.systemSharedMemorySize = 0x200000;
	pgl_config.videoSharedMemorySize = 0x2400000;
	pgl_config.maxMappedFlexibleMemory = 0xAA00000;
	pgl_config.drawCommandBufferSize = 0xC0000;
	pgl_config.lcueResourceBufferSize = 0x10000;
	pgl_config.dbgPosCmd_0x40 = 1920;
	pgl_config.dbgPosCmd_0x44 = 1080;
	pgl_config.dbgPosCmd_0x48 = 0;
	pgl_config.dbgPosCmd_0x4C = 0;
	pgl_config.unk_0x5C = 2;
#elif 1
	pgl_config.flags = SCE_PGL_FLAGS_USE_COMPOSITE_EXT;
	pgl_config.processOrder = 1;
	pgl_config.systemSharedMemorySize = 0x2000000;
	pgl_config.videoSharedMemorySize = 0xA000000;
	pgl_config.unk_0x5C = 2;
#endif

	if (!scePigletSetConfigurationVSH(&pgl_config)) {
		EPRINTF("scePigletSetConfigurationVSH failed.\n");
		goto err;
	}

	display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
	if (display == EGL_NO_DISPLAY) {
		EPRINTF("eglGetDisplay failed.\n");
		goto err;
	}

	if (!eglInitialize(display, &major, &minor)) {
		ret = eglGetError();
		EPRINTF("eglInitialize failed: 0x%08X\n", ret);
		goto err;
	}
	printf("EGL version major:%d, minor:%d\n", major, minor);

	if (!eglBindAPI(EGL_OPENGL_ES_API)) {
		ret = eglGetError();
		EPRINTF("eglBindAPI failed: 0x%08X\n", ret);
		goto err;
	}

	if (!eglSwapInterval(display, 0)) {
		ret = eglGetError();
		EPRINTF("eglSwapInterval failed: 0x%08X\n", ret);
		goto err;
	}

	if (!eglChooseConfig(display, attribs, &config, 1, &num_configs)) {
		ret = eglGetError();
		EPRINTF("eglChooseConfig failed: 0x%08X\n", ret);
		goto err;
	}
	if (num_configs != 1) {
		EPRINTF("No available configuration found.\n");
		goto err;
	}

	surface = eglCreateWindowSurface(display, config, &render_window, window_attribs);
	if (surface == EGL_NO_SURFACE) {
		ret = eglGetError();
		EPRINTF("eglCreateWindowSurface failed: 0x%08X\n", ret);
		goto err;
	}

	context = eglCreateContext(display, config, EGL_NO_CONTEXT, ctx_attribs);
	if (context == EGL_NO_CONTEXT) {
		ret = eglGetError();
		EPRINTF("eglCreateContext failed: 0x%08X\n", ret);
		goto err;
	}

	if (!eglMakeCurrent(display, surface, surface, context)) {
		ret = eglGetError();
		EPRINTF("eglMakeCurrent failed: 0x%08X\n", ret);
		goto err;
	}

	printf("GL_VERSION: %s\n", glGetString(GL_VERSION));
	printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
Unfortunately, this code doesn't work: eglGetDisplay() just returns NULL, and we can see the following in the system log:

[Client Pid=114]nread errno 3
nread Client Header Read: No such process
[PIG]C-pglInitializeLibrary:229 - Failed to initialize platform layer!
[PIG]C-pglDisplayCreate:102 - Failed to initialize library
[PIG]E-eglGetDisplay:684 - Out of memory!

With a different config it throws a slightly different error:

[Client Pid=116]nwrite errno 32
nwrite Client HeaderWrite: Broken pipe
[PIG]C-pglInitializeLibrary:229 - Failed to initialize platform layer!
[PIG]C-pglDisplayCreate:102 - Failed to initialize library
[PIG]E-eglGetDisplay:684 - Out of memory!

I suppose this is because OpenGL ES can only be used by system applications (I'm not sure when this restriction was added). We can try to change the SELF auth info in the process structure, thus telling the system that our application has system privileges too. For example, we could set the PAID field to ShellCore's PAID (0x3800000000000010) or that of the web browser's UI process (0x3800000000000035). Now it fails at eglCreateContext() instead, and we get a different error:

[GnmCompositor]W:\Build\J02015311\sys\internal\usermode\src\compositor\common\memory_util.cpp(182): in allocate() Allocate failed ret=-2147352541
sceCompositorInit failure SystemShared 0x00800000 VideoShared 0x0e400000 VideoPrivate 0x00000000 ProcessOrder 0[PIG]C-pglInitializeLibrary:229 - Failed to initialize platform layer!
[PIG]C-pglDisplayCreate:102 - Failed to initialize library
[PIG]E-eglGetDisplay:684 - Out of memory!

For some reason, the compositor process can't allocate video memory for us. I spent some time trying to understand the reason for this behaviour, but I'll discuss it a bit later. Let's try to do the same from a custom application instead of the browser.

Moving to fPKG

I want to create an executable with system privileges (to be able to use OpenGL ES without extra hacks). To do that I need to set the proper SELF auth info in eboot.fself (the first 8 bytes here aren't actually used):

00 00 00 00 00 00 00 00 00 00 00 00 00 1C 00 40
00 FF 00 00 00 00 00 80 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 80 00 40 00 40
00 00 00 00 00 00 00 80 00 00 00 00 00 00 00 08
00 40 FF FF 00 00 00 F0 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00

To do that I can use a script which sets the auth info parameter, loads the compiled .elf file and generates the .fself file; then we put the resulting file into a .gp4 project and use the fPKG generator to build the final .pkg file. Also, because I'm using Linux to build custom applications (with Wine), I need a Makefile to compile, link and build the final package file, but that's not the subject of this write-up, so I won't talk about it here; if you want to see it, check the links at the bottom.

Also, because we're making a system application, we don't need to include the .prx modules from the SDK (libc.prx and libSceFios2.prx, in particular); we'll use the functions from the libSceLibcInternal.sprx module which is located in the firmware itself. To link our application properly we need a static library for it, but there is no such library in the SDK, so we have to create it ourselves. I just ran grep to extract all exported symbols from the SDK's libc.prx and then created a text file with the symbol list for the stub maker. Also, if we use the original CRT, we need to declare a dummy function that prevents a crash on exit due to an unresolved symbol:

void catchReturnFromMain(int exit_code) {
	/* do nothing */
}

System applications do their linking by calling sceSysmoduleLoadModuleInternal() or sceSysmoduleLoadModuleByNameInternal(); all possible IDs can be taken from the list in the decrypted libSceSysmodule.sprx. I just need to link libSceSystemService.sprx here.

Then we just need to load the libScePigletv2VSH.sprx module and... we're getting eglGetDisplay failed. again. :'(

sceCompositorInit failure SystemShared 0x00200000 VideoShared 0x02400000 VideoPrivate 0x00000000# ProcessOrder 1[PIG]C-pglInitializeLibrary:229 - Failed to initialize platform layer!
[PIG]C-pglDisplayCreate:102 - Failed to initialize library
[PIG]E-eglGetDisplay:684 - Out of memory!

Fixing video memory issue

As I said above, for some reason it can't allocate video memory, so I started to dig around. This error string can be found in system/common/lib/libSceCompositeExt.sprx and system/priv/lib/libSceComposite.sprx, and if you look around you should see a function that sends a request to the GnmCompositor.elf process using sockets. It seems this IPC request throws an error for some reason, and to determine why, we need to look into the compositor process itself: system/sys/GnmCompositor.elf.

After spending some time with a live debugger, I figured out that the failing function is allocate() from memory_util.cpp: it allocates direct memory and sets up the memory region's protection bits, all of it parametrized by two parameters, ptype and memory type, which are set by the parent function SetupClientListener(). There we find a call to a libkernel function that returns the memory budget type. So we have a direction now: we need to find where the memory budget type is set. Because almost all applications are started by the SceShellCore process, we need to look there.

Basically, there is a function (actually there are a few of them, but never mind) that launches a new application: LncManager::launchApp(). You can find it by locating string references. Inside it you should see a call to LncApplication::initializeBudget() and then to LncBudget::initialize(); the latter does exactly what we need. It's easy to notice that this function sets our types based on the application's category, which may take one of these values: gd, gde, gda, gde, gdc, gdf, gdd, gdg, gdk, gdm. If you've ever tried to build a package file, you know this value is the CATEGORY field from param.sfo for game data projects. It's known that gd is actual game data, but what do the other categories mean? Let's try grep against some decompiled C# code:

private bool IsCategory(string Category)
{
	bool result = true;
	try
	{
		if (Category.Equals("gdf") || Category.Equals("gdg") || Category.Equals("gdd") || Category.Equals("gda") || Category.Equals("gdh") || Category.Equals("gdj") || Category.Equals("gdm") || Category.Equals("gdp"))
		{
			Console.WriteLine("[CRASHREPORT_UI] : Is System Category");
		}
		else if (Category.Equals("gd") || Category.Equals("gp") || Category.Equals("gde") || Category.Equals("gpe") || Category.Equals("gdn") || Category.Equals("gpn") || Category.Equals("gdc") || Category.Equals("gpc") || Category.Equals("gdk") || Category.Equals("gpk") || Category.Equals("gdo") || Category.Equals("gpo"))
		{
			Console.WriteLine("[CRASHREPORT_UI] : Is Application Category");
			result = false;
		}
		else
		{
			Console.WriteLine("[CRASHREPORT_UI] : Is Another Category");
		}
	}
	catch (Exception ex)
	{
		CrashReportLogger.iLog(base.GetType().get_Name(), "IsCategory() Exception Error....");
		CrashReportLogger.eLog(base.GetType().get_Name(), ex);
	}
	return result;
}

This code tells us that some of these categories are meant for system applications, while others are for different application categories. Let's try to specify a different application category in the .sfx (or .sfo) file, for example gde:

[Error]	Format of the param file is not valid. (sce_sys/param.sfo, unexpected Category)

Haha, did you really believe it would work? Fortunately, we already have a solution for this: some time ago I posted patches for the publishing tools that allow creating PS2 packages for PS4, and people have made GUI frontends for them, so just use those.

After building the package file, let's try to install and run it, and... bingo!

EGL version major:1, minor:4
GL_VERSION: OpenGL ES 2.0 Piglet

We have OpenGL ES 2.0 with EGL 1.4; we just needed to set CATEGORY to gde.

Resolving runtime shader compilation issue

But if we try to compile shaders, the system tells us that we can't because there is no runtime shader compiler available:

[PIG]I-_OrbisCheckSystemRuntimeCompile:63 - Runtime shader compilation disabled in Target Manager
[PIG]E-pglPlatformShaderCompile:221 - Shader compiler not supported!
[PIG]E-glCompileShader:1301 - Failed to compile shader!

I figured out that this is because on retail consoles the shader compiler code in libScePigletv2VSH.sprx, and the module that does the actual compilation (libSceShaccVSH.sprx), were removed entirely. Of course, you could try to use some third-party compiler, etc.

But... what if we use libScePigletv2VSH.sprx and libSceShaccVSH.sprx from a devkit .pup? Let's grab them from there (the 4.74 .pup contains them): we need to decrypt them into .elfs, place them into the sce_module/ directory of our application, and then load them manually instead of loading them from the sandboxed directory. We also need to make a few small patches to the Piglet module to make it use our shader compiler module:

/* XXX: patches below are given for Piglet module from 4.74 Devkit PUP */
static void pgl_patches_cb(void* arg, uint8_t* base, uint64_t size) {
	/* Patch runtime compiler check */
	const uint8_t p_set_eax_to_1[] = {
		0x31, 0xC0, 0xFF, 0xC0, 0x90, /* xor eax, eax; inc eax; nop */
	};
	memcpy(base + 0x5451F, p_set_eax_to_1, sizeof(p_set_eax_to_1));

	/* Tell that runtime compiler exists */
	*(uint8_t*)(base + 0xB2DEC) = 0;
	*(uint8_t*)(base + 0xB2DED) = 0;
	*(uint8_t*)(base + 0xB2DEE) = 1;
	*(uint8_t*)(base + 0xB2E21) = 1;

	/* Inform Piglet that we have shader compiler module loaded */
	*(int32_t*)(base + 0xB2E24) = s_shcomp_module;
}

Here s_shcomp_module is the ID of the compiler module that we loaded manually from the sce_module/ directory. Let's test it:

compiling vertex shader...
compiling fragment shader...


Also, if you use a HEN that has the patch for sceSblAuthMgrIsLoadable() -> sceSblACMgrGetPathId(), you can load both modules from the /data/self/system/common/lib directory. In that case the user can download fSELFs of these two modules and place them in that folder (a tool could be used to automate that).

Final words

So, to use OpenGL ES we need:

  1. Build the application as an fPKG whose eboot has system SELF auth info.
  2. Set CATEGORY to gde in param.sfo (using the patched publishing tools).
  3. Load libScePigletv2VSH.sprx manually; for runtime shader compilation, use the devkit builds of libScePigletv2VSH.sprx and libSceShaccVSH.sprx with the patches shown above.

You can find example code that renders three colored rectangles using GLSL here: