Force .NET webservice to use local object class, not proxy class

I have a web service that I'm calling from a Windows Forms application (both .NET, both in the same solution), and I'd like the web service to return a custom object defined elsewhere in the project - it's a common object that both projects reference, since it lives in a third project in my solution. When I call the web service, it returns a "Person" object, but that object is in the web service's namespace, created from a proxy class that the web reference generated. As a result I can't manipulate it and return it to my program, which expects a "Person" based on the shared copy of the class rather than a proxy copy from the web service namespace, and I get an error when I try to CType it to the correct class type.

How do I force the webservice to use a local copy of the class, not a proxy copy? Does my question make any sense in this context? If not, I'll clarify it.

Of note - I've resorted to passing all of the parameters ByRef, and using those returned values to populate a copy of the object I create upon return. That can't be the best way to do this!

Genitourinary answered 19/10, 2008 at 19:55 Comment(2)
I've asked this question before. – Carbonize
Were you using asmx or WCF? – Atkins

By design, the proxy object (or class) generated on the client side is distinct from, though structurally identical to, the object used by the corresponding web method on the server side.

For example, when the server side serializes and returns an object A, it is received and deserialized on the client end into a generated corresponding object B with an identical structure (same members, same sub-classes, etc.), but in a different namespace specific to the client-side web-service reference, and possibly with a different name. There are good reasons for that, but there is no point discussing them here.
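To make the problem concrete, here is roughly what the compiler sees (the namespaces and types below are hypothetical):

    // Hypothetical namespaces: "Common" holds the shared class,
    // "MyApp.PersonServiceReference" holds the proxy generated by the web reference.
    Common.Person sharedPerson = new Common.Person();
    MyApp.PersonServiceReference.Person proxyPerson = new MyApp.PersonServiceReference.Person();

    // Same shape, unrelated types: no conversion exists, so a cast (or VB CType) fails.
    // sharedPerson = (Common.Person)proxyPerson;   // compile error: cannot convert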

As pointed out earlier in the thread, there may be a lot of useful code developed around object A, and it is a pity to re-implement it (or simply duplicate source code). Besides, we want to avoid code duplication as much as possible to ease future maintenance.

Fiddling with the auto-generated deserialization code of the client stub (SOAP client) is even riskier in the long run, because you have to re-apply the same manipulations every time the client stub is regenerated against a new version of the server side. Besides, I understand it was possible back in 2008, but what guarantee do we have that it will keep working the same way in later Visual Studio versions? I couldn't do it in VS2019.

There is a simple (and safe) solution for lazy (and cautious) people like me; here is how:

After the client stub receives the object B (corresponding to object A on the server side), it is possible to "transform" it back into an object A: you just need to do a deep copy from object B into an object A. If objects A and B are complex enough, you might consider a generic deep copy using System.Reflection to match each sub-field by name, or you can use a generic serializer/deserializer to convert the received object B to text and then convert that text back into an object A. For instance, I did the job in a few lines of code using the Newtonsoft.Json NuGet package.
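If you go the reflection route, a minimal hedged sketch could look like this (not taken from my code; it only matches public properties by name and leaves nested custom types and collections for you to handle):

    using System;
    using System.Reflection;

    // Naive property-by-name mapper between the proxy type and the shared type.
    public static class NaiveMapper
    {
        public static TTarget Map<TTarget>(object source) where TTarget : new()
        {
            if (source == null) return default(TTarget);

            TTarget target = new TTarget();
            foreach (PropertyInfo sourceProp in source.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance))
            {
                PropertyInfo targetProp = typeof(TTarget).GetProperty(sourceProp.Name, BindingFlags.Public | BindingFlags.Instance);
                if (targetProp != null && targetProp.CanWrite && targetProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
                {
                    targetProp.SetValue(target, sourceProp.GetValue(source, null), null);
                }
            }
            return target;
        }
    }

    // Usage: Person local = NaiveMapper.Map<Person>(proxyPerson);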

For the JSON round-trip, see the example below. On the server side, in the .asmx service page:

    [WebMethod]
    public ObjectTypeA WebMethodPeek()
    {
        ObjectTypeA instanceA = GetSerializableObjectA();
        return instanceA;
    }

And on the web-service client side:

    using Newtonsoft.Json;

    public ObjectTypeA Peek()
    {
        SoapClient cli = new SoapClient("my-end-point-name");
        // The generated proxy returns the client-side type B...
        ObjectTypeB instanceB = cli.WebMethodPeek();
        // ...which is round-tripped through JSON back into the shared type A.
        string text = JsonConvert.SerializeObject(instanceB);
        ObjectTypeA instanceA = JsonConvert.DeserializeObject<ObjectTypeA>(text);
        return instanceA;
    }

Thanks to Enrique Reyes for the simple and elegant deep-copy answer he provided in the post "How to deep copy between objects of different types in C#.NET".

After writing this answer, I found the same strategy (though with less emphasis) also explained, in a different way, in the post "Serializing/deserializing System.Object using a different type".

Addendum

After some hands-on work I found a few surmountable annoyances with this method of using JSON serialization/deserialization to deep-copy between identical (or nearly identical) data structures.

  1. Expect a larger footprint for large objects with many sub-classes and properties. When you add a web-service reference on the client side, it not only replicates the structure hierarchy (in a different namespace) for the web methods' arguments, it replicates it again when the type is used as a return type. Counting the original (shared via the common project library), you end up with three copies of the same large serializable structure.

  2. I then hit an obstacle with polymorphic sub-classes, which confuse the deserializer. In my case those members are loosely typed as "object" so they can hold different derived classes regardless of their concrete type. In that situation you can still write a set of callbacks to do the patch-up work in a class derived from JsonConverter (see the sketch just below); nothing absolutely needs to be done on the server side.
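
Here is a minimal sketch of such a converter; it assumes a hypothetical "Kind" discriminator in the payload and hypothetical derived types (Employee, Customer, Person), so adapt it to whatever your data actually carries:

    using System;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Linq;

    // Hypothetical converter for members declared as "object": it inspects the JSON
    // and picks the concrete shared type to deserialize into.
    public class LooseMemberConverter : JsonConverter
    {
        public override bool CanConvert(Type objectType)
        {
            return objectType == typeof(object);
        }

        // Writing is left to the default serializer, which sees the runtime type.
        public override bool CanWrite { get { return false; } }

        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            throw new NotImplementedException();
        }

        public override object ReadJson(JsonReader reader, Type objectType,
                                        object existingValue, JsonSerializer serializer)
        {
            if (reader.TokenType == JsonToken.Null) return null;

            JObject token = JObject.Load(reader);
            switch ((string)token["Kind"])   // "Kind" is a hypothetical discriminator
            {
                case "Employee": return token.ToObject<Employee>(serializer);
                case "Customer": return token.ToObject<Customer>(serializer);
                default:         return token.ToObject<Person>(serializer);
            }
        }
    }

It can then be passed to the deserialization call, e.g. JsonConvert.DeserializeObject<ObjectTypeA>(text, new LooseMemberConverter()).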

But overall everything worked out despite the large size and complexity of the exchanged objects. I still use xsd.exe to generate fully serializable classes for the larger, more complex objects.

Another, even simpler strategy: binary serialization

Afterwards I found it much simpler to exchange binary objects (i.e. byte[]), using only the .NET Framework. This allows the same class, shared through a common library, to be used on both the server side and the client side. On either side we serialize before sending and deserialize what we receive.

I made up some very simplified examples omitting whatever is not necessary for understanding.

  • Required namespaces:

     using System.IO;
     using System.Runtime.Serialization.Formatters.Binary;
    
  • Example client receiving from server:

     SharedClass myClass = null;
     byte[] byteArray = my_WebService.GetInfo();
     BinaryFormatter formatter = new BinaryFormatter();
     using (MemoryStream stream = new MemoryStream(byteArray)) {
         object obj = formatter.Deserialize(stream);
         myClass = obj as SharedClass ;
     }
    
  • Example client sending to server:

     byte[] byteArray = null;
     SharedClass myClass = new SharedClass(xyz);
     BinaryFormatter formatter = new BinaryFormatter();
     using (MemoryStream stream = new MemoryStream()) {
         formatter.Serialize(stream, myClass);
         byteArray = stream.ToArray();
     }
     my_WebService.SendInfo(byteArray);
    

On the server side it's pretty much the same thing:

  • server sending to client:

     [WebMethod]
     public byte[] GetInfo()
     {    
         SharedClass myClass = new SharedClass(xyz);
         BinaryFormatter formatter = new BinaryFormatter();
         using (MemoryStream stream = new MemoryStream()) {
             formatter.Serialize(stream, myClass);
             return (stream.ToArray());
         }
     }
    
  • server receiving from client:

     [WebMethod]
     public void SendInfo(byte[] byteArray)
     {    
         SharedClass myClass = null;
         BinaryFormatter formatter = new BinaryFormatter();
         using (MemoryStream stream = new MemoryStream(byteArray)) {
             object obj = formatter.Deserialize(stream);
             myClass = obj as SharedClass;
         }
     }
    

This works well without much effort. But:

  • Ensure that both sides always use the same version of the shared structure (or, more precisely, exactly the same one), otherwise deserialization will break
  • Even when deserialization does not break, still perform minimal data-integrity validation when receiving data, even from trusted users: look for out-of-range values, corrupted dates, etc.
  • Security-wise, many people are uncomfortable (often to the point of freaking out) about exchanging binary data through web services. It really depends on what you do with the received data. If you just store the binary blob as-is, it is worrying indeed, but it matters far less when the payload is structured data that gets decomposed into smaller data elements and then re-assembled into a structured object: illegitimate data is very likely to fail even basic type validation. In the worst case you end up with some corrupted but "valid" data, which you could get anyway from trusted users with good intentions.
  • Having said that, web services should have a minimum of access control: firewall, IP filtering, username/password, etc.
Augean answered 16/2 at 16:38 Comment(1)
Wow, you're making me get in the wayback machine for this project. I suspect it's solved in newer versions of .NET (I think this was in v2), but I really like this approach of doing the reflection copy - it achieves what I wanted (a local copy that's not cumbersome in naming and supports some additional features) without editing anything by hand. Congrats on the late accept! – Genitourinary

If you are using svcutil.exe to generate a WCF client proxy, you can use /reference on the command line to specify the assembly containing the common class. Svcutil should then reuse that class definition instead of generating a new one in the service proxy namespace.

Note that this works only if your common class is serializable and passed by value (i.e. it is exposed as a data contract, not as a service contract).
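A minimal sketch of such an invocation (the metadata URL, shared assembly name, and output file below are placeholders):

    rem Reuse types from the shared assembly instead of generating proxy copies
    svcutil.exe http://localhost:8000/PersonService?wsdl /reference:Common.dll /out:PersonServiceProxy.cs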

Altorelievo answered 19/10, 2008 at 20:26 Comment(1)
Is this different than adding a reference through the project itself? I've got my common class listed in the "References" section. I suppose I'll try this - awesome if it works, and points to you! – Genitourinary

If you are using WCF, it is fairly easy to share the same data contracts and service interface between the service and its consumers. You can either compile in the generated proxy class and modify it to use the correct namespaces, or use the ChannelFactory class to create a dynamic proxy for you.

The first solution is very brittle and forces you to modify the proxy class every time the service interface changes. The second technique works fairly well; we used it on a previous project I worked on. With either of these methods you need to make sure all your callers stay up to date with the latest version of the interface.

From the way you are describing the problem, it sounds like you want the service and the client to share the same instance. Since WCF serializes and deserializes your types as you send them to and from the service, you would have to do something a bit more clever for that. Is this what you meant?
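For the ChannelFactory route, a minimal sketch might look like the following; the IPersonService contract, Person type, and endpoint address stand in for whatever lives in your shared contract assembly:

    using System.ServiceModel;

    // Assumes the contract and data types come from the shared assembly, e.g.
    // [ServiceContract] public interface IPersonService { [OperationContract] Person GetPerson(int id); }
    public static class PersonServiceClient
    {
        public static Person GetPerson(int id)
        {
            ChannelFactory<IPersonService> factory = new ChannelFactory<IPersonService>(
                new BasicHttpBinding(),
                new EndpointAddress("http://localhost:8000/PersonService"));
            try
            {
                IPersonService channel = factory.CreateChannel();
                // The return value is the shared Person type, not a generated proxy copy.
                return channel.GetPerson(id);
            }
            finally
            {
                // Closing the factory also closes the channels it created.
                factory.Close();
            }
        }
    }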

Sauncho answered 19/10, 2008 at 20:32 Comment(0)

I am not sure, but when you compile a .NET web service it produces a DLL, and you can try referencing that DLL locally. When I build service-oriented applications, though, I create separate layers within my solution, e.g. a Data Access Layer, Logic Layer, Service Layer, UI Layer, and Controller Layer. In the Controller Layer, for example, I put a user-authentication method that talks to the Data Access Layer and Logic Layer. I can then call that method from the Service Layer and also from the UI Layer: called from the UI Layer it runs locally, and to expose it through the Service Layer I create a web method that wraps it and returns, say, a bool or a username.
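A hedged sketch of that layering (all class and method names below are hypothetical):

    // Logic/controller layer, shipped as an ordinary class library shared by all layers.
    public class AuthController
    {
        public bool AuthenticateUser(string userName, string password)
        {
            // Delegates to a hypothetical data access layer class.
            return new UserDataAccess().CheckCredentials(userName, password);
        }
    }

    // Service layer: a thin web method wrapping the same controller call for remote callers.
    [WebMethod]
    public bool AuthenticateUser(string userName, string password)
    {
        return new AuthController().AuthenticateUser(userName, password);
    }

The UI layer calls new AuthController().AuthenticateUser(...) directly (locally), while external consumers go through the web method.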

Woozy answered 19/10, 2008 at 20:3 Comment(0)
