Maps SDK for Unity: Concepts
The following sections describe the concepts that are fundamental to understanding and using the Maps SDK for Unity.
The MapsService class and component
The MapsService class serves as the entry point for interacting with the Maps SDK for Unity. It encapsulates the ApiKey, and it exposes the GameObjectManager and the LoadMap function, as well as events from the GameObject Creation pipeline.
To use the Maps SDK for Unity in your Unity project, you add the Maps Service script component to an empty GameObject in your scene. The Maps Service automatically adds generated map feature GameObjects as children of this anchor GameObject. With the Maps Service (Script) attached to your base GameObject, you can access its public attributes in the Unity Inspector, as shown here.
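As a minimal sketch of this setup, the following script could sit on the same GameObject as the Maps Service component and trigger an initial load. The coordinates and load bounds are illustrative values, and exact signatures can vary between SDK versions:

```csharp
using Google.Maps;
using Google.Maps.Coord;
using UnityEngine;

// Minimal bootstrap sketch: attach to the same GameObject as the
// Maps Service component. Location and bounds are illustrative.
[RequireComponent(typeof(MapsService))]
public class BasicMapLoader : MonoBehaviour {
  void Start() {
    MapsService mapsService = GetComponent<MapsService>();

    // Anchor the floating origin on a real-world location.
    mapsService.InitFloatingOrigin(new LatLng(40.7128, -74.0060));

    // Load a 1 km x 1 km area around the origin with default styling.
    mapsService.LoadMap(
        new Bounds(Vector3.zero, new Vector3(1000f, 0f, 1000f)),
        new GameObjectOptions());
  }
}
```

Generated map feature GameObjects then appear in the hierarchy as children of this anchor object.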
Geographic features as Unity GameObjects
The Maps SDK for Unity renders geographic features (such as buildings, roads, and water) from the Google Maps database as Unity GameObjects in games. At runtime, they're created as children of the GameObject that the MapsService component is attached to, and they have names of the form {MapFeatureType} ({PlaceID}).
GameObject creation
During gameplay, the SDK pulls down geo data from the Google Maps database as semantic vector tiles (via the Semantic Tile API). It decodes this data on the fly, transforming it into Unity GameObjects. This approach allows you to access map feature data (both the metadata and the geometry data) at the earliest opportunity, so you can customize the GameObjects before they reach the end of the pipeline.
The first thing that the Maps SDK for Unity does when it receives vector data is construct a MapFeature object out of it.
At an intermediate stage in the pipeline, MapFeature objects are specialized. That is, they become specific types (for example, a Google.Maps.Feature.ModeledStructure). These specialized MapFeature objects contain the MapFeatureShape geometry details (ModeledVolume in the case of a ModeledStructure). These details include both MapFeature-specific data (such as vertices and triangles), and shared interfaces for accessing common fields (such as bounding boxes).
Geometry data is converted into a Unity Mesh and added to the spawned GameObject via a MeshFilter, then displayed with a MeshRenderer.
Accessing the pipeline
MapFeatures are exposed to you through events that are triggered during various stages of the pipeline. These include WillCreate events, which are fired just before the GameObject is created, allowing you to specify the styling options or even cancel creation; and DidCreate events, fired just after the GameObject is created, allowing you to make additions or changes to the finished mesh.
As an example, you could examine ExtrudedStructures after their WillCreateExtrudedStructureEvent fires, and hide all buildings shorter than 20 meters (or you could skip creating them altogether).
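That example could be sketched as follows. How the height is read from the feature's shape is an assumption here, so check the accessors your SDK version actually exposes:

```csharp
using Google.Maps;
using UnityEngine;

// Sketch: suppress buildings shorter than 20 meters.
// The height accessor below is assumed, not confirmed by the SDK docs.
[RequireComponent(typeof(MapsService))]
public class ShortBuildingFilter : MonoBehaviour {
  void Start() {
    MapsService mapsService = GetComponent<MapsService>();
    mapsService.Events.ExtrudedStructureEvents.WillCreate.AddListener(args => {
      float height = args.MapFeature.Shape.BoundingBox.size.y; // assumed accessor
      if (height < 20f) {
        args.Cancel = true; // skip creating this GameObject altogether
      }
    });
  }
}
```

Because cancellation happens before the GameObject is generated, skipped buildings cost almost nothing at runtime.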
Types of events
The Google.Maps.Event namespace contains an event class for each type of geographic feature.
- Segments Events
- Regions Events
- Line Water Events
- Area Water Events
- Extruded Structures Events
- Modeled Structures Events
Each of these event types has a WillCreate and a DidCreate public member event object that you can subscribe to, as demonstrated in the following code example.
dynamicMapsService.MapsService.Events.ExtrudedStructureEvents.DidCreate.AddListener(args => {
  // Apply nine-sliced wall and roof materials to this building.
  buildingTexturer.AssignNineSlicedMaterials(args.GameObject);

  // Add a border around the building base using the Building Border Builder class,
  // coloring it using the given border Material.
  Extruder.AddBuildingBorder(args.GameObject, args.MapFeature.Shape, BuildingAndRoadBorder);
});
WillCreate events
WillCreate events are fired immediately after a MapFeature is created, but before the final GameObject is generated from it. WillCreate events allow you to suppress or customize the GameObjects created from a MapFeature. WillCreate event arguments take the following form:
using System.ComponentModel;
using Google.Maps.Decoded;
using UnityEngine;

namespace Google.Maps {
  public class WillCreateGameObjectEventArgs<T, U> : CancelEventArgs
      where T : IMapObject
      where U : IGameObjectStyle {
    public readonly T MapObject;
    public U Style;
    public GameObject Prefab;

    public WillCreateGameObjectEventArgs(T mapObject, U defaultStyle, GameObject prefab) {
      MapObject = mapObject;
      Style = defaultStyle;
      Prefab = prefab;
    }
  }
}
- Setting Cancel (inherited from CancelEventArgs) to true suppresses the creation of the GameObject.
- MapObject is readonly.
- Setting Style allows you to customize the appearance of the created GameObject.
- Setting Prefab replaces the GameObject that would have been generated with the prefab.
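For instance, a WillCreate listener could swap in a prefab instead of the generated geometry. This is a sketch, with MyBuildingPrefab standing in for a prefab field you would assign in the Inspector:

```csharp
// Sketch: replace generated buildings with a custom prefab.
// MyBuildingPrefab is a placeholder, for example a public field
// on the enclosing MonoBehaviour assigned in the Inspector.
mapsService.Events.ExtrudedStructureEvents.WillCreate.AddListener(args => {
  args.Prefab = MyBuildingPrefab;
});
```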
DidCreate events
DidCreate events are fired after a GameObject is generated, once it has been added to the scene. They notify you when the creation of the GameObject was successful, allowing you to perform further processing. DidCreate event arguments take the following form:
using System.ComponentModel;
using Google.Maps.Decoded;
using UnityEngine;

namespace Google.Maps {
  public class DidCreateGameObjectEventArgs<T, U> : EventArgs
      where T : IMapObject
      where U : IGameObjectStyle {
    public readonly T MapObject;
    public GameObject CreatedObject;

    public DidCreateGameObjectEventArgs(T mapObject, GameObject createdObject) {
      MapObject = mapObject;
      CreatedObject = createdObject;
    }
  }
}
- MapObject is readonly, so mutating it will not cause any change to the scene.
- Altering CreatedObject will change the GameObject added to the scene.
Buildings
There are two types of buildings: extruded buildings, and modeled structures.
Extruded buildings
Extruded buildings are generated from an outline (that is, a 2D footprint) and a height. The SDK represents most buildings in this way, and it generates them in the following three ways:
- Using real-world height data (where this information is available). This is the default behavior.
- By providing a fixed height for all buildings, disregarding their real-world height.
- By providing a fallback height for all buildings that don't have a real-world height (by default, this value is set to 10 meters).
Combining these three methods allows the Maps SDK for Unity to create cityscapes with realistic variance reflecting the real world, or with a constant building height, or a mixture of the two.
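The fixed and fallback heights above are configured through the extruded-structure styling object. The property names in this sketch are assumptions made for illustration; consult your SDK version's reference for the real ones:

```csharp
using Google.Maps;

// Hypothetical property names, shown only to illustrate where the
// fixed-height and fallback-height settings live.
var style = new ExtrudedStructureStyle.Builder {
  FixedHeight = 25f,     // give every building the same height
  FallbackHeight = 10f,  // used when no real-world height exists
}.Build();
```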
Modeled structures
Modeled structures are generated using the standard 3D modeling approach of tessellated triangles. This approach is typically used for landmark buildings, such as the Statue of Liberty.
Applying materials
In Unity, the rendering process uses shaders, materials, and textures to add realism to GameObjects. Shaders define how textures, colors, and lighting are applied to displayed geometry, with the specific textures, colors, and other settings stored as a material. You use materials to define how a surface is rendered, by including references to the textures it uses, to tiling information, and to color.

Shaders are small scripts that contain the logic for calculating the color of each pixel, based on the lighting input and on the material configuration. The Maps SDK for Unity comes with a standard shader for modeled structures and another for basemap features, but it also supports advanced material application. Coordinates for UV mapping are calculated for map feature GameObjects in such a way that any basic material can be applied, and it will look reasonable without modification.

For more advanced material effects, the Maps SDK for Unity provides additional data per-vertex via extra UV channels, as well as a number of convenience functions for Cg shaders via the GoogleMapsShaderLib library. This allows things like nine-sliced building textures: cutting up a texture into the roof, ground, wall-corners, and tiled walls for a building.
For more information, see Creating and Using Materials in the Unity User Manual.
UV channels
The UV channels for each MapFeature
type contain data of the following form:
ExtrudedStructure
Walls
Each wall on an ExtrudedStructure is constructed as a quad of the following form:

UV coordinates for walls are calculated per quad. Vertices are not shared between quads, to allow for hard normals between walls (that is, letting the corners of walls appear as hard angles, rather than soft rounded edges).
- Channel 0: (x, y, width, height)
- x and y are the coordinates relative to the bottom-left corner of this quad (square section) of the wall, whereas width and height are the width and height of this quad of the wall. This applies to every quad making up the wall.
Roof
Roof textures have the option of being either axis-aligned, or aligned to the direction of the ExtrudedStructure. You set this via the ExtrudedStructureStyle object.
- Channel 0: (x, y, width, height)
- x and y are the coordinates of each vertex, relative to the bottom-left corner of the roof (specifically, the corner of the minimum-area axis-aligned bounding box for the roof). width and height define the size of the roof's bounding box.
Region
- Channel 0: (x, y, width, height)
- x and y are the coordinates of each vertex relative to the bottom-left corner of the axis-aligned bounding box for the region. width and height define the size of the bounding box.
Segment
- Channel 0: (x, y, width, length)
- x and y are the coordinates of each vertex, calculated as if the segment were completely straight, to allow texturing to bend around corners. width and length define the dimensions of the segment.
ModeledStructure
- Channel 0:
- Each coordinate is set to (0, 0, 0, 0) because there is currently no texture-coordinate implementation.
GoogleMapsShaderLib
The Maps SDK for Unity includes a shader library called GoogleMapsShaderLib, to help you build shaders that work well with MapFeature GameObjects. The library is implemented in the file GoogleMapsShaderLib.cginc. You can use the library by including the following #include directive within the CGPROGRAM section in your shader script.
CGPROGRAM
// ...
#include "/Assets/GoogleMaps/Materials/GoogleMapsShaderLib.cginc"
// ...
ENDCG
The shader library is bundled inside the GoogleMaps.unitypackage. After you import the package, you can find GoogleMapsShaderLib.cginc inside the project folder /Assets/GoogleMaps/Materials/.
Nine-slicing
GoogleMapsShaderLib includes a convenience function that you can use in fragment shaders to provide nine-slicing of textures. Nine-slicing is a technique for covering surfaces with a texture, where the texture is divided into nine portions using a series of bounds. Areas between the bounds are tiled, and areas outside the bounds remain fixed, as illustrated here:
For example, when applying a nine-sliced texture to a building's wall, the top of the texture is applied to the top of the wall (just under the roof), the bottom of the texture is applied to the bottom of the wall (connected to the ground), the sides of the texture are applied to the edges of the wall, and the area in the middle of the texture is tiled evenly across the wall.
On roads (for another example), nine-slicing allows you to have a sidewalk of fixed width, but with a variable number of lanes, depending on the width of the road.
You can use nine-slicing by including GoogleMapsShaderLib.cginc in your shader, then calling the nineSlice function. Sample shaders and materials are included in the GoogleMaps.unitypackage to demonstrate how you can apply the nineSlice function to create a realistic skyscraper of variable size, without stretching or tearing.
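The call shape might look like the following fragment-shader sketch. The parameter list and property names here are assumptions for illustration; see the bundled BuildingWall.shader for the exact signature in your SDK version:

```hlsl
CGPROGRAM
#include "/Assets/GoogleMaps/Materials/GoogleMapsShaderLib.cginc"

sampler2D _MainTex;
float4 _SliceBounds; // hypothetical: bounds separating fixed and tiled regions

fixed4 frag(v2f i) : SV_Target {
  // Remap the wall UVs into the nine regions of the texture, so edges
  // stay fixed while the middle tiles (argument list is illustrative).
  float2 slicedUv = nineSlice(i.uv, _SliceBounds);
  return tex2D(_MainTex, slicedUv);
}
ENDCG
```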
Example Materials location
/Assets/GoogleMaps/Examples/04_Advanced/MoreStyling/Materials/NineSlicing
Example Shader location
/Assets/GoogleMaps/Examples/04_Advanced/MoreStyling/Materials/NineSlicing/BuildingWall.shader
You can use nine-slicing on any MapFeature, except for ModeledStructures, which don't currently have any texturing coordinates.
The coordinate system
The Maps SDK for Unity coordinate system uses the Web Mercator Projection to convert between spherical WGS 84 latitude-longitude and Cartesian Unity Worldspace (Vector3).
Vector3 values are relative to a floating origin, which is typically set to the user's starting location. As a result, you should not persist Vector3 values outside of a session (that is, on your server, or on the user's device). We recommend that you specify physical world locations using latitude-longitude pairs.
A floating origin is used to avoid floating-point stability issues. Unity's Vector3 class uses single-precision floating-point numbers, and the density of representable floating-point numbers decreases as their magnitude increases (meaning larger floating-point numbers become less accurate). You can update the floating origin whenever users move far enough away from the origin that this becomes an issue. You can set this threshold to a relatively small value (for example, 100 or 200 meters), or larger (greater than 1 km), depending on how often you want to update things.
Unity Worldspace is scaled to 1:1 (meters), based on the initial origin's latitude. In the Mercator Projection, scale varies slightly by latitude, so the Unity Worldspace scale diverges marginally from 1:1 as users move north and south; however, users are not expected to move far (or fast) enough for this to be noticeable.
The Maps SDK for Unity contains conversion functions for converting between Google.Maps.LatLng and Unity Worldspace (Vector3) that take into account floating origin and scale.
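A round-trip conversion might be sketched as follows. The Coords accessor name is an assumption, so verify it against your SDK version's reference:

```csharp
using Google.Maps;
using Google.Maps.Coord;
using UnityEngine;

// Sketch: convert between latitude-longitude and Unity Worldspace.
// The mapsService.Coords accessor is assumed, not confirmed.
[RequireComponent(typeof(MapsService))]
public class CoordExample : MonoBehaviour {
  void Start() {
    MapsService mapsService = GetComponent<MapsService>();
    LatLng sydney = new LatLng(-33.8688, 151.2093);
    Vector3 worldPos = mapsService.Coords.FromLatLngToVector3(sydney);
    LatLng roundTrip = mapsService.Coords.FromVector3ToLatLng(worldPos);
    Debug.Log(worldPos + " / " + roundTrip);
  }
}
```

Because the conversion depends on the current floating origin, the same LatLng can map to different Vector3 values in different sessions, which is why latitude-longitude pairs are the recommended persistent representation.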
Load errors
Errors that occur while loading map data from the network can be handled with the MapLoadErrorEvent event. The Maps SDK for Unity handles most types of errors itself if you don't add an event handler. However, there is one error that requires your app to take some action. It is specified by MapLoadErrorArgs.DetailedErrorCode and is described below.
UnsupportedClientVersion
This version of the Maps SDK for Unity is no longer supported, possibly in combination with the current API key. Typically, your app should prompt the user to update to a newer version of your app.
This error usually means that the Maps SDK for Unity version is too old. In rare cases, we might use this if we discover a critical problem with a version of the Maps SDK for Unity or with an API key. We will make every effort to communicate this and to ensure that it doesn't happen until there is a working version of the app available to update to.
It's best practice to ensure that your app has a suitable upgrade path in the event this error occurs, allowing your users to migrate to a newer version of your app with a supported SDK version. For more information, see the Client Kill Switch documentation.
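A sketch of such a handler follows. The event path and the enum's exact type name are assumptions, so check them against your SDK version:

```csharp
using Google.Maps;
using UnityEngine;

// Sketch: react to UnsupportedClientVersion load errors.
// Events.MapEvents.LoadError and the enum name are assumed accessors.
[RequireComponent(typeof(MapsService))]
public class LoadErrorHandler : MonoBehaviour {
  void Start() {
    MapsService mapsService = GetComponent<MapsService>();
    mapsService.Events.MapEvents.LoadError.AddListener(args => {
      if (args.DetailedErrorCode ==
          MapLoadErrorArgs.DetailedErrorEnum.UnsupportedClientVersion) {
        // Prompt the user to update to a newer version of the app.
        ShowUpdatePrompt(); // placeholder for your own UI flow
      }
    });
  }

  void ShowUpdatePrompt() { /* app-specific upgrade UI */ }
}
```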
Known issues
Base map features are rendered back-to-front without z-testing because all features of this type are rendered in the same plane. You must set z-testing to ZTest Always on any replacement materials that you apply to base map features to ensure that they are rendered correctly.
Google has included a sample shader that addresses these issues in the GoogleMaps.unitypackage. It's called BaseMapTextured.shader, and it's located in the /Assets/GoogleMaps/Materials/ folder. To use it on a material, select Google > Maps > Shaders > BaseMap Textured from the shader drop-down in the material inspector.
When styling a Feature.Region or a Feature.AreaWater object, you can apply a fill using either a material, a custom color, or an automatically generated color, chosen via the FillModeType enum inside the styling object. Auto colors are generated based on the value of the Region's usage type.
Source: https://developers.google.com/maps/documentation/gaming/concepts_musk